Dec 05 20:42:08 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 05 20:42:08 crc restorecon[4746]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:08 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:42:09 crc restorecon[4746]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 
20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:42:09 crc 
restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 
20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:42:09 crc restorecon[4746]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 
20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc 
restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:42:09 crc restorecon[4746]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:42:09 crc restorecon[4746]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Dec 05 20:42:09 crc kubenswrapper[4747]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 20:42:09 crc kubenswrapper[4747]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Dec 05 20:42:09 crc kubenswrapper[4747]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 20:42:09 crc kubenswrapper[4747]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 20:42:09 crc kubenswrapper[4747]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Dec 05 20:42:09 crc kubenswrapper[4747]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.653276 4747 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656389 4747 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656412 4747 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656419 4747 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656424 4747 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656431 4747 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656438 4747 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656445 4747 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656453 4747 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656459 4747 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656465 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656470 4747 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656476 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656481 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656491 4747 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656497 4747 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656502 4747 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656507 4747 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656512 4747 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656518 4747 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656523 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656528 4747 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656533 4747 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656539 4747 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656545 4747 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656552 4747 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656558 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656563 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656569 4747 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656574 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656597 4747 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656603 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656608 4747 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656614 4747 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656619 4747 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656624 4747 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656629 4747 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656634 4747 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656640 4747 feature_gate.go:330] unrecognized feature gate: Example
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656646 4747 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656651 4747 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656656 4747 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656662 4747 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656667 4747 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656672 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656680 4747 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656687 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656693 4747 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656699 4747 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656704 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656710 4747 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656715 4747 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656721 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656726 4747 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656731 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656738 4747 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656744 4747 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656750 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656755 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656760 4747 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656765 4747 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656770 4747 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656777 4747 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656782 4747 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656787 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656792 4747 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656799 4747 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656804 4747 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656809 4747 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656814 4747 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656819 4747 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.656824 4747 feature_gate.go:330] 
unrecognized feature gate: PersistentIPsForVirtualization Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657198 4747 flags.go:64] FLAG: --address="0.0.0.0" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657212 4747 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657223 4747 flags.go:64] FLAG: --anonymous-auth="true" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657234 4747 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657242 4747 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657249 4747 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657257 4747 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657264 4747 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657271 4747 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657277 4747 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657284 4747 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657290 4747 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657296 4747 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657303 4747 flags.go:64] FLAG: --cgroup-root="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657309 4747 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657315 4747 flags.go:64] FLAG: --client-ca-file="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657321 4747 flags.go:64] FLAG: --cloud-config="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657327 4747 flags.go:64] FLAG: --cloud-provider="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657332 4747 flags.go:64] FLAG: --cluster-dns="[]" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657339 4747 flags.go:64] FLAG: --cluster-domain="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657345 4747 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657351 4747 flags.go:64] FLAG: --config-dir="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657357 4747 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657364 4747 flags.go:64] FLAG: --container-log-max-files="5" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657371 4747 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657378 4747 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657383 4747 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657389 4747 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657395 4747 flags.go:64] FLAG: 
--contention-profiling="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657402 4747 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657408 4747 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657415 4747 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657421 4747 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657428 4747 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657435 4747 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657441 4747 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657447 4747 flags.go:64] FLAG: --enable-load-reader="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657452 4747 flags.go:64] FLAG: --enable-server="true" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657458 4747 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657466 4747 flags.go:64] FLAG: --event-burst="100" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657473 4747 flags.go:64] FLAG: --event-qps="50" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657479 4747 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657484 4747 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657490 4747 flags.go:64] FLAG: --eviction-hard="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657497 4747 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657504 4747 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657509 4747 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657516 4747 flags.go:64] FLAG: --eviction-soft="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657522 4747 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657527 4747 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657534 4747 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657540 4747 flags.go:64] FLAG: --experimental-mounter-path="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657546 4747 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657552 4747 flags.go:64] FLAG: --fail-swap-on="true" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657558 4747 flags.go:64] FLAG: --feature-gates="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657565 4747 flags.go:64] FLAG: --file-check-frequency="20s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657571 4747 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657577 4747 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657599 4747 flags.go:64] FLAG: 
--healthz-bind-address="127.0.0.1" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657605 4747 flags.go:64] FLAG: --healthz-port="10248" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657612 4747 flags.go:64] FLAG: --help="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657618 4747 flags.go:64] FLAG: --hostname-override="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657624 4747 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657631 4747 flags.go:64] FLAG: --http-check-frequency="20s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657637 4747 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657643 4747 flags.go:64] FLAG: --image-credential-provider-config="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657649 4747 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657655 4747 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657661 4747 flags.go:64] FLAG: --image-service-endpoint="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657667 4747 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657673 4747 flags.go:64] FLAG: --kube-api-burst="100" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657679 4747 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657685 4747 flags.go:64] FLAG: --kube-api-qps="50" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657691 4747 flags.go:64] FLAG: --kube-reserved="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657697 4747 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657702 4747 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657709 4747 flags.go:64] FLAG: --kubelet-cgroups="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657714 4747 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657720 4747 flags.go:64] FLAG: --lock-file="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657726 4747 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657732 4747 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657738 4747 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657747 4747 flags.go:64] FLAG: --log-json-split-stream="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657753 4747 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657759 4747 flags.go:64] FLAG: --log-text-split-stream="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657764 4747 flags.go:64] FLAG: --logging-format="text" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657771 4747 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657777 4747 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657783 4747 flags.go:64] FLAG: 
--manifest-url="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657789 4747 flags.go:64] FLAG: --manifest-url-header="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657797 4747 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657803 4747 flags.go:64] FLAG: --max-open-files="1000000" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657811 4747 flags.go:64] FLAG: --max-pods="110" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657817 4747 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657823 4747 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657833 4747 flags.go:64] FLAG: --memory-manager-policy="None" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657839 4747 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657845 4747 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657851 4747 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657857 4747 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657870 4747 flags.go:64] FLAG: --node-status-max-images="50" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657876 4747 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657882 4747 flags.go:64] FLAG: --oom-score-adj="-999" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657889 4747 flags.go:64] FLAG: --pod-cidr="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657895 4747 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657903 4747 flags.go:64] FLAG: --pod-manifest-path="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657909 4747 flags.go:64] FLAG: --pod-max-pids="-1" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657915 4747 flags.go:64] FLAG: --pods-per-core="0" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657922 4747 flags.go:64] FLAG: --port="10250" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657928 4747 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657934 4747 flags.go:64] FLAG: --provider-id="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657939 4747 flags.go:64] FLAG: --qos-reserved="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657945 4747 flags.go:64] FLAG: --read-only-port="10255" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657951 4747 flags.go:64] FLAG: --register-node="true" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657957 4747 flags.go:64] FLAG: --register-schedulable="true" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657963 4747 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657992 4747 flags.go:64] FLAG: --registry-burst="10" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.657998 4747 flags.go:64] FLAG: 
--registry-qps="5" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658004 4747 flags.go:64] FLAG: --reserved-cpus="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658009 4747 flags.go:64] FLAG: --reserved-memory="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658017 4747 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658023 4747 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658030 4747 flags.go:64] FLAG: --rotate-certificates="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658036 4747 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658042 4747 flags.go:64] FLAG: --runonce="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658048 4747 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658054 4747 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658060 4747 flags.go:64] FLAG: --seccomp-default="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658068 4747 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658074 4747 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658080 4747 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658086 4747 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658092 4747 flags.go:64] FLAG: --storage-driver-password="root" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658098 4747 flags.go:64] FLAG: --storage-driver-secure="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658104 4747 flags.go:64] FLAG: --storage-driver-table="stats" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658110 4747 flags.go:64] FLAG: --storage-driver-user="root" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658116 4747 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658122 4747 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658128 4747 flags.go:64] FLAG: --system-cgroups="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658134 4747 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658143 4747 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658148 4747 flags.go:64] FLAG: --tls-cert-file="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658154 4747 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658162 4747 flags.go:64] FLAG: --tls-min-version="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658168 4747 flags.go:64] FLAG: --tls-private-key-file="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658174 4747 flags.go:64] FLAG: --topology-manager-policy="none" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658179 4747 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658185 4747 flags.go:64] FLAG: 
--topology-manager-scope="container" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658191 4747 flags.go:64] FLAG: --v="2" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658199 4747 flags.go:64] FLAG: --version="false" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658207 4747 flags.go:64] FLAG: --vmodule="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658214 4747 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658221 4747 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658381 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658388 4747 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658394 4747 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658399 4747 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658405 4747 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658413 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658419 4747 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658427 4747 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658433 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658438 4747 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658443 4747 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658448 4747 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658454 4747 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658459 4747 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658464 4747 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658469 4747 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658474 4747 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658480 4747 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658485 4747 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658490 4747 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658495 4747 feature_gate.go:330] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658500 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658506 4747 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658511 4747 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658516 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658521 4747 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658526 4747 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658532 4747 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658537 4747 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658542 4747 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658547 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658552 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658558 4747 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658563 4747 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658568 4747 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658573 4747 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658578 4747 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658601 4747 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658606 4747 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658614 4747 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658619 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658625 4747 feature_gate.go:330] unrecognized feature gate: Example Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658631 4747 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658637 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658642 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658648 4747 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658653 4747 feature_gate.go:330] unrecognized feature 
gate: ChunkSizeMiB Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658659 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658666 4747 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658673 4747 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658680 4747 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658687 4747 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658694 4747 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658700 4747 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658707 4747 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658713 4747 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658718 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658723 4747 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658729 4747 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658734 4747 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658739 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658744 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658749 4747 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658754 4747 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658759 4747 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658764 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658769 4747 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658775 4747 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658780 4747 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658785 4747 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.658792 4747 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.658809 4747 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.672393 4747 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.672482 4747 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672652 4747 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672675 4747 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672686 4747 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672697 4747 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672707 4747 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672718 4747 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672727 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672736 4747 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672745 4747 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672754 4747 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672763 4747 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672772 4747 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672781 4747 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672792 4747 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672801 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672809 4747 feature_gate.go:330] unrecognized feature gate: Example Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672817 4747 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672826 4747 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672834 4747 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672844 4747 feature_gate.go:330] unrecognized 
feature gate: SignatureStores Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672853 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672861 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672870 4747 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672878 4747 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672887 4747 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672895 4747 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672903 4747 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672912 4747 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672921 4747 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672930 4747 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672938 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672946 4747 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672956 4747 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672966 4747 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672977 4747 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672988 4747 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.672999 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673010 4747 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673020 4747 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673030 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673044 4747 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673061 4747 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673075 4747 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673085 4747 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673097 4747 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673110 4747 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673119 4747 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673128 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673137 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673146 4747 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673155 4747 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673167 4747 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673179 4747 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673189 4747 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673199 4747 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673208 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673217 4747 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673227 4747 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673237 4747 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673248 4747 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673257 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673266 4747 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673275 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673283 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673292 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673301 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673311 4747 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673319 4747 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673329 4747 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673338 4747 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673348 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.673363 4747 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673659 4747 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673677 4747 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673686 4747 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673696 4747 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673705 4747 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673713 4747 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673722 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673730 4747 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673740 4747 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 
20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673748 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673757 4747 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673765 4747 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673773 4747 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673782 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673792 4747 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673804 4747 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673817 4747 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673828 4747 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673839 4747 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673850 4747 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673860 4747 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673869 4747 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673877 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673886 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673897 4747 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673906 4747 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673916 4747 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673925 4747 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673933 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673942 4747 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673950 4747 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673958 4747 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.673966 4747 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674015 4747 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 
05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674023 4747 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674032 4747 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674040 4747 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674048 4747 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674057 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674065 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674074 4747 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674083 4747 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674092 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674100 4747 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674108 4747 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674117 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674126 4747 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674134 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674142 4747 feature_gate.go:330] unrecognized feature gate: Example Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674151 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674159 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674168 4747 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674176 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674185 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674193 4747 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674203 4747 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674215 4747 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674224 4747 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674233 4747 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674242 4747 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674251 4747 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674262 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674271 4747 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674281 4747 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674290 4747 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674300 4747 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674310 4747 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674320 4747 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674329 4747 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674338 4747 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.674348 4747 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.674362 4747 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.674907 4747 server.go:940] "Client rotation is on, will bootstrap in background" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.679797 4747 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.679983 4747 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
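Aside from the repeated per-gate warnings, the identical feature_gate.go:386 lines above show the kubelet's resolved gate map. The unrecognized names (GatewayAPI, NewOLM, PinnedImages, and the rest) appear to be OpenShift cluster-level feature gates that the upstream kubelet's registry does not know, so they are warned about and dropped; only gates the kubelet recognizes survive into the map. The explicit overrides it warns about (feature_gate.go:351/353) would correspond to a featureGates stanza in the same KubeletConfiguration file; a hedged sketch, with values taken verbatim from the logged map:

    # Explicit gate overrides mirroring the resolved map at feature_gate.go:386.
    # The false-valued entries in that map are omitted here for brevity.
    featureGates:
      CloudDualStackNodeIPs: true                   # GA gate; override to be removed
      DisableKubeletCloudCredentialProviders: true  # GA gate; override to be removed
      KMSv1: true                                   # deprecated gate
      ValidatingAdmissionPolicy: true               # GA gate; override to be removed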
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.681032 4747 server.go:997] "Starting client certificate rotation"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.681088 4747 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.681388 4747 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-14 14:56:37.271698412 +0000 UTC
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.681556 4747 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 210h14m27.590148971s for next certificate rotation
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.688709 4747 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.692443 4747 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.702472 4747 log.go:25] "Validated CRI v1 runtime API"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.727607 4747 log.go:25] "Validated CRI v1 image API"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.729898 4747 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.733009 4747 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-05-20-37-38-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.733057 4747 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.765025 4747 manager.go:217] Machine: {Timestamp:2025-12-05 20:42:09.762779042 +0000 UTC m=+0.230086600 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:78337cb5-96e9-4698-b089-53cf0e34a059 BootID:56302b96-a896-482d-b012-96ecc8d111e7 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:f5:53:00 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:f5:53:00 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:9f:cb:1e Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:e2:61:e3 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:d7:a8:d1 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ea:86:5e Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:a8:b4:8a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:c2:6a:3d:90:e2:bb Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:2e:b3:25:d6:53:a1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.765524 4747 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.765934 4747 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.766537 4747 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.766790 4747 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.766840 4747 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.767077 4747 topology_manager.go:138] "Creating topology manager with none policy"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.767089 4747 container_manager_linux.go:303] "Creating device plugin manager"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.767356 4747 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.767392 4747 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.767774 4747 state_mem.go:36] "Initialized new in-memory state store"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.767880 4747 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.768619 4747 kubelet.go:418] "Attempting to sync node with API server"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.768639 4747 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.768670 4747 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.768684 4747 kubelet.go:324] "Adding apiserver pod source"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.768697 4747 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.770538 4747 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.771167 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.771223 4747 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 05 20:42:09 crc kubenswrapper[4747]: E1205 20:42:09.771306 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError"
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.771247 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Dec 05 20:42:09 crc kubenswrapper[4747]: E1205 20:42:09.771400 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.772904 4747 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.773917 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.774130 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.774258 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.774401 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.774528 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.774678 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.774796 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.774943 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.775071 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.775195 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.775311 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.775423 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.775873 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.776505 4747 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.776873 4747 server.go:1280] "Started kubelet"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.777332 4747 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.777321 4747 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.778326 4747 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.778983 4747 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.779060 4747 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 05 20:42:09 crc systemd[1]: Started Kubernetes Kubelet.
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.779281 4747 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 22:35:39.508405796 +0000 UTC
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.779361 4747 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 769h53m29.729051189s for next certificate rotation
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.779794 4747 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.780129 4747 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 05 20:42:09 crc kubenswrapper[4747]: E1205 20:42:09.779659 4747 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e6c6f6b84688b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 20:42:09.776822411 +0000 UTC m=+0.244129969,LastTimestamp:2025-12-05 20:42:09.776822411 +0000 UTC m=+0.244129969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.780177 4747 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 05 20:42:09 crc kubenswrapper[4747]: E1205 20:42:09.780053 4747 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.780423 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Dec 05 20:42:09 crc kubenswrapper[4747]: E1205 20:42:09.780476 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError"
Dec 05 20:42:09 crc kubenswrapper[4747]: E1205 20:42:09.780932 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="200ms"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.781234 4747 factory.go:55] Registering systemd factory
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.781255 4747 factory.go:221] Registration of the systemd container factory successfully
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.781490 4747 factory.go:153] Registering CRI-O factory
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.781517 4747 factory.go:221] Registration of the crio container factory successfully
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.781630 4747 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.781656 4747 factory.go:103] Registering Raw factory
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.781673 4747 manager.go:1196] Started watching for new ooms in manager
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.782445 4747 manager.go:319] Starting recovery of all containers
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.784128 4747 server.go:460] "Adding debug handlers to kubelet server"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.795119 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.795558 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.795801 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.795987 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.796167 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.796345 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.796520 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.796774 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.796961 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.797126 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.797309 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.797486 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.797698 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.797885 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.798090 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.798334 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.798518 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.798748 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.798926 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.799096 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.799279 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.799451 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.799654 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.799831 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.799998 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.800185 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.800365 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.800538 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.801757 4747 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.802065 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.802260 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.802446 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.802661 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.802866 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.803299 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.803476 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.803705 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.803906 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.804070 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.804231 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.804394 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.804655 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.804844 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.805009 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.805173 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.805339 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.805500 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.805747 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.805935 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.806141 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.806327 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.806497 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.806723 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.806919 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.807104 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.807298 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.807484 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.807717 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.807911 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.808083 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.808252 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.808416 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.808631 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.808823 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.809014 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.809187 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.809358 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.809523 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.809782 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.809963 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.810221 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.810423 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.810641 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.810863 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.811054 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.811257 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.811739 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.811932 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.812157 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.812358 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.812543 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.812761 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.812947 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.813134 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.813501 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.809787 4747 manager.go:324] Recovery completed
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.813852 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.814068 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.814245 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.814425 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.814627 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.814809 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.814982 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.815142 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.815337 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.815515 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.815756 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.815933 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.816091 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.816261 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.816465 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.816681 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.816871 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.817052 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.817218 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.817401 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.817676 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.817885 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.818081 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.818249 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.818440 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.818656 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.818832 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.819016 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.819175 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.819327 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.819486 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.819705 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.819879 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.820028 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.820176 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.820658 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.820843 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.820991 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.821137 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.821285 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.821451 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.822064 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.822274 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.822437 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.822635 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.822805 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.822979 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.823145 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.823475 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.823659 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.823818 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.823987 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.824136 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.824286 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.824430 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.824618 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc"
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.824800 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.824956 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.825132 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.825279 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.825427 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.825572 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.825830 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.825983 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.826183 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.826331 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.826540 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.826752 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.826899 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.827038 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.827136 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.827212 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.827296 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.828424 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.828525 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.828629 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.828718 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.828807 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.828886 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.828963 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.829036 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.829109 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.829183 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.830640 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.830697 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.830726 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.830744 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.830767 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.830787 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.830803 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.830871 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.831927 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.831993 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832011 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832031 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832044 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832061 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832074 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832089 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832108 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832123 4747 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832140 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832156 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832169 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832186 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832199 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832224 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832237 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832249 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832265 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832278 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832296 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832310 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832321 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832337 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832351 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832367 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832378 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832391 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832407 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832418 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832433 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832444 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832458 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832470 4747 reconstruct.go:97] "Volume reconstruction finished" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.832482 4747 reconciler.go:26] "Reconciler: start to sync state" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.833136 4747 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.834080 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.834113 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.834124 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.835437 4747 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.835452 4747 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.835472 4747 state_mem.go:36] "Initialized new in-memory state store" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.838362 4747 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.838415 4747 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.838446 4747 kubelet.go:2335] "Starting kubelet main sync loop" Dec 05 20:42:09 crc kubenswrapper[4747]: E1205 20:42:09.838498 4747 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 05 20:42:09 crc kubenswrapper[4747]: W1205 20:42:09.839115 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Dec 05 20:42:09 crc kubenswrapper[4747]: E1205 20:42:09.839193 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.844963 4747 policy_none.go:49] "None policy: Start" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.845665 4747 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.845768 4747 state_mem.go:35] "Initializing new in-memory state store" Dec 05 20:42:09 crc kubenswrapper[4747]: E1205 20:42:09.880653 4747 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.918566 4747 manager.go:334] "Starting Device Plugin manager" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.918731 4747 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.918749 4747 server.go:79] "Starting device plugin registration server" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.919142 4747 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.919159 4747 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.919316 4747 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.919519 4747 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.919544 4747 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 05 20:42:09 crc kubenswrapper[4747]: E1205 20:42:09.929087 4747 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.939124 4747 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 20:42:09 crc kubenswrapper[4747]: 
I1205 20:42:09.939212 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.940395 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.940444 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.940465 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.940704 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.940873 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.940939 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.941754 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.941807 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.941822 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.942045 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.942065 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.942073 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.942075 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.942216 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.942270 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.943338 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.943365 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.943374 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.944401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.944451 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.944469 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.944722 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.944874 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.944933 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.945974 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.946062 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.946075 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.946192 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.946330 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.946379 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.946505 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.946529 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.946542 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.946925 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.946949 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.946961 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.947100 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.947130 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.947674 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.947705 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.947716 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.948145 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.948181 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:09 crc kubenswrapper[4747]: I1205 20:42:09.948198 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:09 crc kubenswrapper[4747]: E1205 20:42:09.982338 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="400ms" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.019709 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.021158 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.021281 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:10 crc 
kubenswrapper[4747]: I1205 20:42:10.021303 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.021332 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:42:10 crc kubenswrapper[4747]: E1205 20:42:10.021902 4747 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.036683 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.036747 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.036822 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.036854 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.036918 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.036950 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.036982 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.037038 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") 
pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.037104 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.037155 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.037197 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.037230 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.037261 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.037293 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.037325 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138262 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138311 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138334 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138351 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138372 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138385 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138399 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138414 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138431 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138447 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138462 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138461 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138478 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138513 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138527 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138497 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138481 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138475 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138602 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138607 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138563 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138451 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") 
pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138509 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138652 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138697 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138655 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138779 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138849 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138850 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.138878 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.223026 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.225699 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.225777 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:10 crc 
kubenswrapper[4747]: I1205 20:42:10.225796 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.225836 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:42:10 crc kubenswrapper[4747]: E1205 20:42:10.226624 4747 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.280256 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.287339 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.304753 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: W1205 20:42:10.320303 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3a3f32f037ab9fe64f98e89c1a359c806a72b8711f8321da91a3a8d93cd4fd70 WatchSource:0}: Error finding container 3a3f32f037ab9fe64f98e89c1a359c806a72b8711f8321da91a3a8d93cd4fd70: Status 404 returned error can't find the container with id 3a3f32f037ab9fe64f98e89c1a359c806a72b8711f8321da91a3a8d93cd4fd70 Dec 05 20:42:10 crc kubenswrapper[4747]: W1205 20:42:10.321104 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-cff2d3f10b6aeeec334f65aaf37ae0956638ace0ec321e3cd86c98218d17fa8c WatchSource:0}: Error finding container cff2d3f10b6aeeec334f65aaf37ae0956638ace0ec321e3cd86c98218d17fa8c: Status 404 returned error can't find the container with id cff2d3f10b6aeeec334f65aaf37ae0956638ace0ec321e3cd86c98218d17fa8c Dec 05 20:42:10 crc kubenswrapper[4747]: W1205 20:42:10.324851 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-f6ce1b131b4177af0246b32f18c91a49d9fcd36cf2072904d5a0e56487649322 WatchSource:0}: Error finding container f6ce1b131b4177af0246b32f18c91a49d9fcd36cf2072904d5a0e56487649322: Status 404 returned error can't find the container with id f6ce1b131b4177af0246b32f18c91a49d9fcd36cf2072904d5a0e56487649322 Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.326010 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.328968 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:42:10 crc kubenswrapper[4747]: W1205 20:42:10.346263 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-fa23863172598c94de96fcb802b10afab7f30a57b5a1237dceb93cc4dbf42176 WatchSource:0}: Error finding container fa23863172598c94de96fcb802b10afab7f30a57b5a1237dceb93cc4dbf42176: Status 404 returned error can't find the container with id fa23863172598c94de96fcb802b10afab7f30a57b5a1237dceb93cc4dbf42176 Dec 05 20:42:10 crc kubenswrapper[4747]: W1205 20:42:10.356327 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-7a1891b83b33452c508f1674c0fb88a73f39745cce3572539df95ac5de77a7fc WatchSource:0}: Error finding container 7a1891b83b33452c508f1674c0fb88a73f39745cce3572539df95ac5de77a7fc: Status 404 returned error can't find the container with id 7a1891b83b33452c508f1674c0fb88a73f39745cce3572539df95ac5de77a7fc Dec 05 20:42:10 crc kubenswrapper[4747]: E1205 20:42:10.383806 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="800ms" Dec 05 20:42:10 crc kubenswrapper[4747]: W1205 20:42:10.591516 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Dec 05 20:42:10 crc kubenswrapper[4747]: E1205 20:42:10.591623 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.627011 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.628229 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.628270 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.628280 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.628302 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:42:10 crc kubenswrapper[4747]: E1205 20:42:10.628699 4747 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.777288 4747 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.22:6443: connect: connection refused Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.844482 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56" exitCode=0 Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.844550 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56"} Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.844644 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f6ce1b131b4177af0246b32f18c91a49d9fcd36cf2072904d5a0e56487649322"} Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.844725 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.845345 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.845385 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.845404 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.846868 4747 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99" exitCode=0 Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.846909 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99"} Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.846986 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cff2d3f10b6aeeec334f65aaf37ae0956638ace0ec321e3cd86c98218d17fa8c"} Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.847169 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.847732 4747 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="8dd68f5ac515189a8807e13bbe6d5ed8c22df602d0f3ee6dc29ede8c8fbc94d0" exitCode=0 Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.847763 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"8dd68f5ac515189a8807e13bbe6d5ed8c22df602d0f3ee6dc29ede8c8fbc94d0"} Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.847813 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3a3f32f037ab9fe64f98e89c1a359c806a72b8711f8321da91a3a8d93cd4fd70"} Dec 05 20:42:10 crc 
kubenswrapper[4747]: I1205 20:42:10.847876 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.848081 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.848103 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.848112 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.848570 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.848627 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.848642 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.849272 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.849342 4747 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec" exitCode=0 Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.849410 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec"} Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.849436 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7a1891b83b33452c508f1674c0fb88a73f39745cce3572539df95ac5de77a7fc"} Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.849810 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.851687 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.851776 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.851788 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.851956 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.851979 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.851988 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.852330 4747 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54"} Dec 05 20:42:10 crc kubenswrapper[4747]: I1205 20:42:10.852386 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fa23863172598c94de96fcb802b10afab7f30a57b5a1237dceb93cc4dbf42176"} Dec 05 20:42:11 crc kubenswrapper[4747]: E1205 20:42:11.184972 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="1.6s" Dec 05 20:42:11 crc kubenswrapper[4747]: W1205 20:42:11.201900 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Dec 05 20:42:11 crc kubenswrapper[4747]: E1205 20:42:11.202005 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:42:11 crc kubenswrapper[4747]: W1205 20:42:11.306724 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Dec 05 20:42:11 crc kubenswrapper[4747]: E1205 20:42:11.306803 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.429008 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.430626 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.430674 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.430687 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.430715 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.859271 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"23ff3fafc5deca1ed8a383606d014405f2f2b6338097338f0c3b64a9244472d8"} Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 
20:42:11.859413 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.863344 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.863389 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.863401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.866774 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.866766 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f1c54373ee85013b735406d77d1ff5cfb1bbe2e2850a11fee14473795893a426"} Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.866846 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"00f58411b662e4ecb7dc98f386e073bd4a4e64d8aed10843e0db0f98c11f5a07"} Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.866864 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0522976ef766cb6000f9319c27956583b93806739df3751a7ed5c3a41622a342"} Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.867716 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.867757 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.867768 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.871150 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12"} Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.871198 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1"} Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.871214 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b"} Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.871743 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.873420 4747 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.873454 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.873465 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.874749 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36"} Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.874788 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a"} Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.874801 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9"} Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.874812 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f"} Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.874826 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220"} Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.874931 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.875848 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.876035 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.876163 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.876744 4747 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557" exitCode=0 Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.876788 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557"} Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.876910 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.877494 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.877530 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:11 crc kubenswrapper[4747]: I1205 20:42:11.877544 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.441414 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.598462 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.654924 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.882721 4747 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3" exitCode=0 Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.882781 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3"} Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.882947 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.882966 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.882981 4747 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.882966 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.883088 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.883060 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.884963 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.885067 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.885163 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.885330 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.885382 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.885395 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.885533 4747 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.885752 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.885770 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.885855 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.885813 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.885951 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.886534 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.886567 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:12 crc kubenswrapper[4747]: I1205 20:42:12.886609 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.227843 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.620139 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.891134 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f"} Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.891182 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77"} Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.891193 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df"} Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.891202 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37"} Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.891220 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.891221 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.891221 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 
20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.892429 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.892729 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.892429 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.892738 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.892752 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.892761 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.893206 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.893229 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:13 crc kubenswrapper[4747]: I1205 20:42:13.893238 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:14 crc kubenswrapper[4747]: I1205 20:42:14.900215 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393"} Dec 05 20:42:14 crc kubenswrapper[4747]: I1205 20:42:14.900285 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:14 crc kubenswrapper[4747]: I1205 20:42:14.900369 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:14 crc kubenswrapper[4747]: I1205 20:42:14.902780 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:14 crc kubenswrapper[4747]: I1205 20:42:14.902833 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:14 crc kubenswrapper[4747]: I1205 20:42:14.902834 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:14 crc kubenswrapper[4747]: I1205 20:42:14.902886 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:14 crc kubenswrapper[4747]: I1205 20:42:14.902912 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:14 crc kubenswrapper[4747]: I1205 20:42:14.902851 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:15 crc kubenswrapper[4747]: I1205 20:42:15.902975 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:15 crc kubenswrapper[4747]: I1205 20:42:15.904515 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 
20:42:15 crc kubenswrapper[4747]: I1205 20:42:15.904609 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:15 crc kubenswrapper[4747]: I1205 20:42:15.904636 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:17 crc kubenswrapper[4747]: I1205 20:42:17.630786 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:17 crc kubenswrapper[4747]: I1205 20:42:17.631143 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:17 crc kubenswrapper[4747]: I1205 20:42:17.633456 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:17 crc kubenswrapper[4747]: I1205 20:42:17.633508 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:17 crc kubenswrapper[4747]: I1205 20:42:17.633526 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:17 crc kubenswrapper[4747]: I1205 20:42:17.659923 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 05 20:42:17 crc kubenswrapper[4747]: I1205 20:42:17.660235 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:17 crc kubenswrapper[4747]: I1205 20:42:17.662039 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:17 crc kubenswrapper[4747]: I1205 20:42:17.662092 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:17 crc kubenswrapper[4747]: I1205 20:42:17.662114 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:18 crc kubenswrapper[4747]: I1205 20:42:18.024354 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 05 20:42:18 crc kubenswrapper[4747]: I1205 20:42:18.024718 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:18 crc kubenswrapper[4747]: I1205 20:42:18.026740 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:18 crc kubenswrapper[4747]: I1205 20:42:18.026811 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:18 crc kubenswrapper[4747]: I1205 20:42:18.026827 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:19 crc kubenswrapper[4747]: E1205 20:42:19.929358 4747 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 20:42:19 crc kubenswrapper[4747]: I1205 20:42:19.979544 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:19 crc kubenswrapper[4747]: I1205 20:42:19.979859 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:19 crc kubenswrapper[4747]: I1205 20:42:19.981474 4747 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:19 crc kubenswrapper[4747]: I1205 20:42:19.981523 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:19 crc kubenswrapper[4747]: I1205 20:42:19.981541 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:21 crc kubenswrapper[4747]: W1205 20:42:21.365495 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 20:42:21 crc kubenswrapper[4747]: I1205 20:42:21.365662 4747 trace.go:236] Trace[45106387]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:42:11.362) (total time: 10003ms): Dec 05 20:42:21 crc kubenswrapper[4747]: Trace[45106387]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (20:42:21.365) Dec 05 20:42:21 crc kubenswrapper[4747]: Trace[45106387]: [10.003112531s] [10.003112531s] END Dec 05 20:42:21 crc kubenswrapper[4747]: E1205 20:42:21.365696 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 20:42:21 crc kubenswrapper[4747]: I1205 20:42:21.420773 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:21 crc kubenswrapper[4747]: I1205 20:42:21.421030 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:21 crc kubenswrapper[4747]: I1205 20:42:21.422512 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:21 crc kubenswrapper[4747]: I1205 20:42:21.422567 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:21 crc kubenswrapper[4747]: I1205 20:42:21.422643 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:21 crc kubenswrapper[4747]: I1205 20:42:21.428790 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:21 crc kubenswrapper[4747]: E1205 20:42:21.431826 4747 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 05 20:42:21 crc kubenswrapper[4747]: I1205 20:42:21.777297 4747 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 05 20:42:21 crc kubenswrapper[4747]: I1205 20:42:21.919900 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:21 crc kubenswrapper[4747]: I1205 20:42:21.923268 4747 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:21 crc kubenswrapper[4747]: I1205 20:42:21.923336 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:21 crc kubenswrapper[4747]: I1205 20:42:21.923385 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:21 crc kubenswrapper[4747]: I1205 20:42:21.926019 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:22 crc kubenswrapper[4747]: I1205 20:42:22.408160 4747 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 05 20:42:22 crc kubenswrapper[4747]: I1205 20:42:22.408248 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 05 20:42:22 crc kubenswrapper[4747]: I1205 20:42:22.416085 4747 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 05 20:42:22 crc kubenswrapper[4747]: I1205 20:42:22.416148 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 05 20:42:22 crc kubenswrapper[4747]: I1205 20:42:22.599293 4747 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 05 20:42:22 crc kubenswrapper[4747]: I1205 20:42:22.599366 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 05 20:42:22 crc kubenswrapper[4747]: I1205 20:42:22.922853 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:22 crc kubenswrapper[4747]: I1205 20:42:22.924205 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:22 crc kubenswrapper[4747]: I1205 20:42:22.924251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:22 crc kubenswrapper[4747]: I1205 20:42:22.924262 4747 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:22 crc kubenswrapper[4747]: I1205 20:42:22.980678 4747 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 20:42:22 crc kubenswrapper[4747]: I1205 20:42:22.980799 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.032278 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.034301 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.034360 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.034380 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.034422 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.624455 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.624649 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.624932 4747 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.624970 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.625605 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.625649 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.625664 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.628815 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:23 
crc kubenswrapper[4747]: I1205 20:42:23.928542 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.929656 4747 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.929754 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.930178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.930237 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:23 crc kubenswrapper[4747]: I1205 20:42:23.930260 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.407521 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.409991 4747 trace.go:236] Trace[1660342121]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:42:14.297) (total time: 13112ms): Dec 05 20:42:27 crc kubenswrapper[4747]: Trace[1660342121]: ---"Objects listed" error: 13112ms (20:42:27.409) Dec 05 20:42:27 crc kubenswrapper[4747]: Trace[1660342121]: [13.112904095s] [13.112904095s] END Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.410023 4747 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.411136 4747 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.414163 4747 trace.go:236] Trace[1005059984]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:42:13.493) (total time: 13920ms): Dec 05 20:42:27 crc kubenswrapper[4747]: Trace[1005059984]: ---"Objects listed" error: 13920ms (20:42:27.414) Dec 05 20:42:27 crc kubenswrapper[4747]: Trace[1005059984]: [13.92016247s] [13.92016247s] END Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.414191 4747 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.416150 4747 trace.go:236] Trace[1559833922]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:42:14.350) (total time: 13065ms): Dec 05 20:42:27 crc kubenswrapper[4747]: Trace[1559833922]: ---"Objects listed" error: 13065ms (20:42:27.415) Dec 05 20:42:27 crc kubenswrapper[4747]: Trace[1559833922]: [13.065366316s] [13.065366316s] END Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.416174 4747 
reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.780695 4747 apiserver.go:52] "Watching apiserver" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.783167 4747 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.783386 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-fql7t","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.783762 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.783850 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.783888 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.783904 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.783941 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.783962 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.784558 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.789360 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.789500 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.789723 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fql7t" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.790247 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.790285 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.790287 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.790384 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.790432 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.790382 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.792087 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.794437 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.794509 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.794900 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.798433 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.803098 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.811265 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.826529 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.839939 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.848554 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.857559 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.872260 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.881543 4747 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.886001 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 
20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.904220 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914418 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914484 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914517 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914540 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914617 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914642 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914667 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914689 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914715 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914735 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914776 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914800 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914822 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914849 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914870 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914895 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914918 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914968 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.914990 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915012 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915030 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915050 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915070 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915089 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915112 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915155 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915177 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915201 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915223 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915264 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915286 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915306 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915327 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915348 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915375 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915397 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915417 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915460 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915482 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915504 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915525 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915545 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915563 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 
20:42:27.915598 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915622 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915615 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915643 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915663 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915687 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915712 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915736 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915760 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915789 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915811 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915806 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915832 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915925 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915958 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915960 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.915987 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916022 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916046 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916037 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916070 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916095 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916116 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916136 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916159 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916181 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916202 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916225 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916252 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916280 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916300 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916324 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916349 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916376 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916402 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916427 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916498 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916525 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916550 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916573 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916613 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916636 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916664 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916688 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916714 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916738 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916761 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916787 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916813 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916836 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916861 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916883 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916907 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916933 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916960 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916985 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917010 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917032 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917054 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917080 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917107 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917131 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917156 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917550 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917599 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917630 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917660 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917685 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917716 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917739 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917773 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917798 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917824 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917846 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917866 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917887 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918556 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918608 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918630 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918657 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918685 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918712 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918742 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918868 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918902 
4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918934 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918963 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918995 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919060 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919088 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919117 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919149 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919178 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919219 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:42:27 crc 
kubenswrapper[4747]: I1205 20:42:27.919248 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919273 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919307 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919336 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919370 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919402 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919428 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919457 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919486 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919512 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 
20:42:27.919541 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919572 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919617 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919647 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919670 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919694 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919717 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919742 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919766 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919789 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 20:42:27 crc 
kubenswrapper[4747]: I1205 20:42:27.919816 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919839 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919861 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919883 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920410 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920435 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920457 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920491 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920514 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920534 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:42:27 crc 
kubenswrapper[4747]: I1205 20:42:27.920555 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920591 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920613 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920635 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920655 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920676 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920697 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920717 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.922523 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916199 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916230 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916362 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916429 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916515 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916567 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.923180 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916670 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916797 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916868 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916921 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.916990 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). 
InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917083 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917296 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917390 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.917496 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918202 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918230 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918471 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918475 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918482 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918506 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918553 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918750 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918764 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.918822 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919063 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919287 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919305 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919346 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919491 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919942 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.919995 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920047 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920317 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). 
InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920768 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.921144 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.921171 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.920831 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.921194 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.921724 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.921810 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.921845 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.922017 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.922149 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.922174 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.922336 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.922510 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.922733 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.922748 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.923000 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.923061 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.923113 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.923540 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.923632 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.923912 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.923870 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.923981 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.924005 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.924076 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.923383 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.924193 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.924276 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.924604 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.924592 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.924690 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.924706 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.924876 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.925045 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.925088 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.926438 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.926424 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.926518 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.926828 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.926937 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.927071 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.927161 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.927235 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.927311 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.927395 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.927655 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.927755 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.927856 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.927947 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.928023 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.928156 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.928235 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.928317 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.928622 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.928742 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.928885 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.928962 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.929164 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.929297 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.929381 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.929478 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.930754 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.930809 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.930846 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.930885 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.930924 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.930965 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0-hosts-file\") pod \"node-resolver-fql7t\" (UID: \"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\") " pod="openshift-dns/node-resolver-fql7t" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931202 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5rrf\" (UniqueName: \"kubernetes.io/projected/3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0-kube-api-access-v5rrf\") pod \"node-resolver-fql7t\" (UID: \"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\") " pod="openshift-dns/node-resolver-fql7t" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931243 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931383 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931429 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931466 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931497 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931529 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:42:27 crc 
kubenswrapper[4747]: I1205 20:42:27.931562 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931614 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931647 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931730 4747 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931753 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931769 4747 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931787 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931805 4747 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931853 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.926685 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.927186 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.927431 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.927464 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.927571 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.927883 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.928180 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.928427 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.928743 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.928885 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.929068 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.929308 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.929896 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.929949 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.930335 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.930956 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931131 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931319 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931314 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931451 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931463 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931638 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931942 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.931963 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.932010 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.933604 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:28.43356123 +0000 UTC m=+18.900868718 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.933615 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.933639 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.933111 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.933459 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.933844 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.934018 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.934118 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.934294 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.934923 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.935187 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.935547 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.935969 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.936088 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.938870 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.939121 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.939260 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.939470 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.939925 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.940056 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.932383 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.940312 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.940322 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.940449 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.933997 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.940867 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.941003 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.941101 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.941380 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.941920 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.941942 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.942178 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.942227 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.942480 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.943528 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.943647 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.943839 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.943874 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.944017 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.941491 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.944009 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.944128 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.942872 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.944155 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.944351 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.944575 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.944859 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.945008 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.945114 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.945225 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.945307 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.945433 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.945807 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.945840 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.946041 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.946057 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.946074 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.946208 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.946251 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.946333 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.947590 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.947661 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.945686 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.945505 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.947753 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:42:28.440461249 +0000 UTC m=+18.907768737 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.947814 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:28.447795138 +0000 UTC m=+18.915102626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.943559 4747 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.948138 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.948153 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.948437 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.948468 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.948689 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.949254 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.948704 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.948895 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.948906 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.948937 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.949080 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.949094 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.949760 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.949833 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.950375 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.950433 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.950552 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.950596 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.950955 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.951052 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.951537 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.952064 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.952285 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.952938 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.953067 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.953904 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.954862 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.955886 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.965809 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.966663 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.970251 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.970911 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.970948 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.970967 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.971038 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:28.471013211 +0000 UTC m=+18.938320909 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.972096 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
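The status patch above fails for a circular reason visible in the log itself: the API server must consult the pod.network-node-identity.openshift.io webhook at 127.0.0.1:9743, but the pod serving that webhook is the very one still stuck in ContainerCreating, so nothing is listening yet. A trivial dial sketch reproduces the same refusal (the address comes straight from the log line):

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Nothing listens on this port until the webhook pod is
        // running again, so this prints a "connection refused" error.
        conn, err := net.DialTimeout("tcp", "127.0.0.1:9743", 2*time.Second)
        if err != nil {
            fmt.Println("dial failed:", err)
            return
        }
        conn.Close()
        fmt.Println("webhook endpoint is accepting connections")
    }

The projected.go errors in the same burst are the other half of the startup race: the kubelet has not yet re-registered the kube-root-ca.crt and openshift-service-ca.crt ConfigMaps in its local object cache, so it cannot assemble the projected service-account volume; these normally clear on the next retry once the sources are registered.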
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.973354 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.973660 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.975398 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.976126 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.976303 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36" exitCode=255 Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.976352 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36"} Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.976553 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.980899 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.980923 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.980938 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:27 crc kubenswrapper[4747]: E1205 20:42:27.981053 4747 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:28.481035488 +0000 UTC m=+18.948342976 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.984144 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.984642 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.984955 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.988052 4747 scope.go:117] "RemoveContainer" containerID="303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.988431 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.988951 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.990576 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:27 crc kubenswrapper[4747]: I1205 20:42:27.994869 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
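The PLEG event above ("container finished ... exitCode=255", then ContainerDied and RemoveContainer) is the kubelet reconciling a kube-apiserver container that died while the kubelet itself was down. When grepping a dump like this, it helps to split the klog header (severity letter, MMDD date, wall time, PID, file:line) from the message; the following parser is my own small sketch, not kubelet code:

    package main

    import (
        "fmt"
        "regexp"
    )

    // Matches klog headers such as:
    //   I1205 20:42:27.976303 4747 generic.go:334] ...
    var klogHeader = regexp.MustCompile(`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w.]+:\d+)\]`)

    func main() {
        line := `I1205 20:42:27.976303 4747 generic.go:334] "Generic (PLEG): container finished" exitCode=255`
        m := klogHeader.FindStringSubmatch(line)
        if m == nil {
            fmt.Println("not a klog line")
            return
        }
        fmt.Printf("severity=%s date=%s time=%s pid=%s source=%s\n",
            m[1], m[2], m[3], m[4], m[5])
    }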
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.003016 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.014910 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.023206 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032530 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032629 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0-hosts-file\") pod \"node-resolver-fql7t\" (UID: \"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\") " pod="openshift-dns/node-resolver-fql7t" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032654 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5rrf\" (UniqueName: \"kubernetes.io/projected/3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0-kube-api-access-v5rrf\") pod \"node-resolver-fql7t\" (UID: \"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\") " pod="openshift-dns/node-resolver-fql7t" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032701 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032790 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032805 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
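Each of these status_manager failures logs the patch it could not apply as an escaped JSON string inside err=..., which is hard to read inline. A small sketch for inspecting such a patch once unescaped; the fragment below is shortened and illustrative, not the full patch from the log:

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
    )

    func main() {
        // Shortened, illustrative fragment of a kubelet status patch
        // like the ones embedded (escaped) in the log lines above.
        patch := []byte(`{"metadata":{"uid":"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0"},"status":{"conditions":[{"type":"Ready","status":"False","reason":"ContainersNotReady"}]}}`)
        var out bytes.Buffer
        if err := json.Indent(&out, patch, "", "  "); err != nil {
            panic(err)
        }
        fmt.Println(out.String())
    }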
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032818 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032850 4747 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032865 4747 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032876 4747 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032887 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032917 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032931 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032943 4747 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032955 4747 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032967 4747 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.032995 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033009 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033021 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" 
DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033031 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033039 4747 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033047 4747 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033056 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033081 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033089 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033097 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033107 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033116 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033124 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033134 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033158 4747 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033167 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") 
on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033176 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033184 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033192 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033201 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033208 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033232 4747 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033244 4747 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033253 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033262 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033272 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033281 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033290 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033315 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033324 4747 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033333 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033340 4747 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033349 4747 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033359 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033368 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033391 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033399 4747 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033408 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033416 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033424 4747 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033432 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033440 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033463 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033471 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033480 4747 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033489 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033498 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033507 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033520 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033542 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033550 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033538 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0-hosts-file\") pod \"node-resolver-fql7t\" (UID: \"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\") " pod="openshift-dns/node-resolver-fql7t"
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033702 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033755 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033562 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033790 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033803 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033815 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033827 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033840 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033851 4747 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033864 4747 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033876 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033889 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033900 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033911 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033921 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033932 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033944 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033955 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033981 4747 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.033997 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034008 4747 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034018 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034031 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034042 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034052 4747 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034063 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034074 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034084 4747 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034095 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034106 4747 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034119 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034144 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034158 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034170 4747 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034180 4747 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034192 4747 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034203 4747 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034215 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034227 4747 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034239 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034251 4747 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\"
(UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034262 4747 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034274 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034285 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034297 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034309 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034322 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034333 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034344 4747 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034355 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034368 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034380 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034393 4747 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034404 4747 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034416 4747 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034426 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034437 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034464 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034476 4747 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034486 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034497 4747 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034510 4747 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034521 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034533 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034545 4747 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034557 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034573 4747 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034607 4747 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034618 4747 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034629 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034639 4747 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034650 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034662 4747 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034674 4747 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034701 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034713 4747 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034727 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034742 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034754 4747 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034765 4747 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034777 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034789 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034801 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034813 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034825 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034837 4747 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034851 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034863 4747 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034875 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034887 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034898 4747 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034909 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034921 4747 
reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034935 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034946 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034957 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034968 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034978 4747 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.034989 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035000 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035010 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035024 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035035 4747 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035047 4747 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035072 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035084 4747 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035094 4747 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035105 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035117 4747 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035127 4747 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035140 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035153 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035165 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035181 4747 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035197 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035208 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035218 4747 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035229 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035240 4747 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035251 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035262 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035274 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035285 4747 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035296 4747 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035307 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035319 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.035330 4747 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.056886 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.057248 4747 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.057336 4747 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.063184 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.063978 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.064010 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.064022 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.064038 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.064050 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:28Z","lastTransitionTime":"2025-12-05T20:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.064233 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5rrf\" (UniqueName: \"kubernetes.io/projected/3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0-kube-api-access-v5rrf\") pod \"node-resolver-fql7t\" (UID: \"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\") " pod="openshift-dns/node-resolver-fql7t" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.076877 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.083503 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.095398 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.098999 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.100618 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.102665 4747 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.105010 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.105045 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.105055 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.105071 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.105083 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:28Z","lastTransitionTime":"2025-12-05T20:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.106333 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.109659 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.115667 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.115917 4747 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.117436 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.121283 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.121319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.121332 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.121349 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.121364 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:28Z","lastTransitionTime":"2025-12-05T20:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.126862 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fql7t" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.133101 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.140682 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: W1205 20:42:28.140785 4747 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-5c1ba737fa1a23a63685d6d25de36d9e1200b879c17fed4359da375d3b0aa373 WatchSource:0}: Error finding container 5c1ba737fa1a23a63685d6d25de36d9e1200b879c17fed4359da375d3b0aa373: Status 404 returned error can't find the container with id 5c1ba737fa1a23a63685d6d25de36d9e1200b879c17fed4359da375d3b0aa373 Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.144917 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.144942 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.144967 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.144982 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.144991 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:28Z","lastTransitionTime":"2025-12-05T20:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.148202 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.155203 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.158520 4747 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: W1205 20:42:28.159947 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3daf47d5_9d28_4d4a_bbca_ad6c4bb9f3d0.slice/crio-3cc76c76ab37f97c59e07912e8709d9f0067545bba38c41676c086cf97d0a417 WatchSource:0}: Error finding container 3cc76c76ab37f97c59e07912e8709d9f0067545bba38c41676c086cf97d0a417: Status 404 returned error can't find the container with id 3cc76c76ab37f97c59e07912e8709d9f0067545bba38c41676c086cf97d0a417 Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.160063 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.160099 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.160109 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.160127 4747 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.160142 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:28Z","lastTransitionTime":"2025-12-05T20:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.169777 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.174946 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.175169 4747 
kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.178169 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.178203 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.178213 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.178231 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.178241 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:28Z","lastTransitionTime":"2025-12-05T20:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.182558 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.193541 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05
T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.276598 4747 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.277358 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.280071 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.280130 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.280152 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.280198 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.280218 4747 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:28Z","lastTransitionTime":"2025-12-05T20:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.382610 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.382649 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.382660 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.382676 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.382688 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:28Z","lastTransitionTime":"2025-12-05T20:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.438276 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.438396 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.438447 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:29.438433955 +0000 UTC m=+19.905741433 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.484829 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.484862 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.484871 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.484884 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.484893 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:28Z","lastTransitionTime":"2025-12-05T20:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.539331 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.539426 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.539454 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.539479 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.539553 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.539631 4747 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:29.539615977 +0000 UTC m=+20.006923465 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.539645 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.539674 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.539686 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.539739 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:29.539724379 +0000 UTC m=+20.007031867 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.539784 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.539815 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.539849 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.539938 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:29.539897263 +0000 UTC m=+20.007204801 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.540010 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:42:29.539976065 +0000 UTC m=+20.007283623 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.587181 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.587209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.587218 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.587231 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.587240 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:28Z","lastTransitionTime":"2025-12-05T20:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.689433 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.689490 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.689503 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.689522 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.689536 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:28Z","lastTransitionTime":"2025-12-05T20:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.791373 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.791402 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.791411 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.791425 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.791434 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:28Z","lastTransitionTime":"2025-12-05T20:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.839401 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:28 crc kubenswrapper[4747]: E1205 20:42:28.839511 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.893322 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.893359 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.893369 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.893382 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.893391 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:28Z","lastTransitionTime":"2025-12-05T20:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.966214 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-nm4fd"] Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.966509 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nm4fd" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.970749 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.971205 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kf4wd"] Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.971836 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.972175 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.972757 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.973395 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.973536 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.979309 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-7lblw"] Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.979565 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.980251 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.980646 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.980682 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.980766 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.980795 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-zcn6n"] Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.980895 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.980924 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.981243 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.981282 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.981385 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.982216 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.982292 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.983685 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.984188 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.984738 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.984763 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.985081 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.985375 4747 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.986217 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.986661 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fql7t" event={"ID":"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0","Type":"ContainerStarted","Data":"1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.986853 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fql7t" event={"ID":"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0","Type":"ContainerStarted","Data":"3cc76c76ab37f97c59e07912e8709d9f0067545bba38c41676c086cf97d0a417"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.988169 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.988209 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.988228 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5c1ba737fa1a23a63685d6d25de36d9e1200b879c17fed4359da375d3b0aa373"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.990439 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.990484 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d4a4bbe5ebed0f35f49b2dd75c29593859ae5e27a381008ab0d29bb332c46947"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.993950 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bc9452ba1a1aa408c4efe84c0c1ba6cccd2ca4e1bd79687633612d4a45000501"} Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.996805 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.996835 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.996845 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.996859 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 05 20:42:28 crc kubenswrapper[4747]: I1205 20:42:28.996869 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:28Z","lastTransitionTime":"2025-12-05T20:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.002238 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.014253 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.029364 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.040648 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.043712 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fnrw\" (UniqueName: \"kubernetes.io/projected/8e9d6586-09af-4144-8e5d-01ad9fab33d0-kube-api-access-8fnrw\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.043794 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.043846 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/85ba28a1-00e9-438e-9b47-6537f75121bb-rootfs\") pod \"machine-config-daemon-7lblw\" (UID: \"85ba28a1-00e9-438e-9b47-6537f75121bb\") " pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.043872 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-system-cni-dir\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.043894 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-etc-kubernetes\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.043939 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-var-lib-openvswitch\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.043964 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-ovnkube-config\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044019 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-run-multus-certs\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044040 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-systemd\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044061 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-ovn\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044107 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-cni-bin\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044131 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-cni-netd\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044171 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4881e707-c00c-4e49-8e30-a17719e80915-ovn-node-metrics-cert\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044196 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-multus-cni-dir\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044216 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-etc-openvswitch\") pod \"ovnkube-node-kf4wd\" (UID: 
\"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044260 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-run-netns\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044284 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-run-ovn-kubernetes\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044340 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-os-release\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044365 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-kubelet\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044387 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-multus-conf-dir\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044431 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-ovnkube-script-lib\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044458 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-run-k8s-cni-cncf-io\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044497 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-hostroot\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044517 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-openvswitch\") pod 
\"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044536 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-node-log\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044573 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-var-lib-cni-multus\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044609 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e9d6586-09af-4144-8e5d-01ad9fab33d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044671 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e9d6586-09af-4144-8e5d-01ad9fab33d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044692 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-env-overrides\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044715 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hrls\" (UniqueName: \"kubernetes.io/projected/85ba28a1-00e9-438e-9b47-6537f75121bb-kube-api-access-4hrls\") pod \"machine-config-daemon-7lblw\" (UID: \"85ba28a1-00e9-438e-9b47-6537f75121bb\") " pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044754 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-var-lib-kubelet\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044777 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e9d6586-09af-4144-8e5d-01ad9fab33d0-system-cni-dir\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044797 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85ba28a1-00e9-438e-9b47-6537f75121bb-mcd-auth-proxy-config\") pod \"machine-config-daemon-7lblw\" (UID: \"85ba28a1-00e9-438e-9b47-6537f75121bb\") " pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044840 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-multus-socket-dir-parent\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044864 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e9d6586-09af-4144-8e5d-01ad9fab33d0-cnibin\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044917 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-systemd-units\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.044939 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-slash\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.045000 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-log-socket\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.045023 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-cnibin\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.045064 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-run-netns\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.045088 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5j2m\" (UniqueName: \"kubernetes.io/projected/53f1e522-a732-4821-b7b0-6f1b6670c1d4-kube-api-access-b5j2m\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 
20:42:29.045113 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-var-lib-cni-bin\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.045154 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/53f1e522-a732-4821-b7b0-6f1b6670c1d4-multus-daemon-config\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.045195 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd6rn\" (UniqueName: \"kubernetes.io/projected/4881e707-c00c-4e49-8e30-a17719e80915-kube-api-access-vd6rn\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.045355 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/85ba28a1-00e9-438e-9b47-6537f75121bb-proxy-tls\") pod \"machine-config-daemon-7lblw\" (UID: \"85ba28a1-00e9-438e-9b47-6537f75121bb\") " pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.045411 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53f1e522-a732-4821-b7b0-6f1b6670c1d4-cni-binary-copy\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.045478 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e9d6586-09af-4144-8e5d-01ad9fab33d0-os-release\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.045512 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8e9d6586-09af-4144-8e5d-01ad9fab33d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.058929 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.072177 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.097359 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.098972 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.099033 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.099046 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.099063 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.099076 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:29Z","lastTransitionTime":"2025-12-05T20:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.108853 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.117742 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.129464 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.142696 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.145927 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/85ba28a1-00e9-438e-9b47-6537f75121bb-rootfs\") pod \"machine-config-daemon-7lblw\" (UID: \"85ba28a1-00e9-438e-9b47-6537f75121bb\") " pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.145955 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-system-cni-dir\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146012 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/85ba28a1-00e9-438e-9b47-6537f75121bb-rootfs\") pod \"machine-config-daemon-7lblw\" (UID: \"85ba28a1-00e9-438e-9b47-6537f75121bb\") " pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146034 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-etc-kubernetes\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146071 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fnrw\" (UniqueName: \"kubernetes.io/projected/8e9d6586-09af-4144-8e5d-01ad9fab33d0-kube-api-access-8fnrw\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146089 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-etc-kubernetes\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146092 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146138 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-run-multus-certs\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146184 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-system-cni-dir\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146213 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-systemd\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146221 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-run-multus-certs\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146242 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-systemd\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146243 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-var-lib-openvswitch\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146274 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-var-lib-openvswitch\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146194 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146308 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-ovnkube-config\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146343 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-multus-cni-dir\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147142 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-etc-openvswitch\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147081 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-ovnkube-config\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.146520 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-multus-cni-dir\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147203 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-etc-openvswitch\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147166 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-ovn\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147242 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-cni-bin\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147294 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-ovn\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147297 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-cni-bin\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147323 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-cni-netd\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147345 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4881e707-c00c-4e49-8e30-a17719e80915-ovn-node-metrics-cert\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147383 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-cni-netd\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147525 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-os-release\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147796 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-os-release\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147833 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-kubelet\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147855 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-run-netns\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147873 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-kubelet\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147879 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-run-ovn-kubernetes\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147908 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-run-ovn-kubernetes\") pod \"ovnkube-node-kf4wd\" (UID: 
\"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147921 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-run-k8s-cni-cncf-io\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147943 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-hostroot\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147946 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-run-k8s-cni-cncf-io\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147951 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-run-netns\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147966 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-multus-conf-dir\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147990 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-ovnkube-script-lib\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148012 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-var-lib-cni-multus\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.147989 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-hostroot\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148016 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-multus-conf-dir\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148038 4747 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e9d6586-09af-4144-8e5d-01ad9fab33d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148069 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-openvswitch\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148086 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-node-log\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148093 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-openvswitch\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148102 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hrls\" (UniqueName: \"kubernetes.io/projected/85ba28a1-00e9-438e-9b47-6537f75121bb-kube-api-access-4hrls\") pod \"machine-config-daemon-7lblw\" (UID: \"85ba28a1-00e9-438e-9b47-6537f75121bb\") " pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148069 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-var-lib-cni-multus\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148117 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-var-lib-kubelet\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148132 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e9d6586-09af-4144-8e5d-01ad9fab33d0-system-cni-dir\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148136 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-node-log\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148149 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/8e9d6586-09af-4144-8e5d-01ad9fab33d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148170 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-env-overrides\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148171 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-var-lib-kubelet\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148201 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85ba28a1-00e9-438e-9b47-6537f75121bb-mcd-auth-proxy-config\") pod \"machine-config-daemon-7lblw\" (UID: \"85ba28a1-00e9-438e-9b47-6537f75121bb\") " pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148217 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-multus-socket-dir-parent\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148232 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e9d6586-09af-4144-8e5d-01ad9fab33d0-cnibin\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148256 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-systemd-units\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148274 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-slash\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148287 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-cnibin\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148302 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-run-netns\") pod 
\"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148317 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5j2m\" (UniqueName: \"kubernetes.io/projected/53f1e522-a732-4821-b7b0-6f1b6670c1d4-kube-api-access-b5j2m\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148324 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e9d6586-09af-4144-8e5d-01ad9fab33d0-cnibin\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148332 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-log-socket\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148372 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/85ba28a1-00e9-438e-9b47-6537f75121bb-proxy-tls\") pod \"machine-config-daemon-7lblw\" (UID: \"85ba28a1-00e9-438e-9b47-6537f75121bb\") " pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148387 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53f1e522-a732-4821-b7b0-6f1b6670c1d4-cni-binary-copy\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148401 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-var-lib-cni-bin\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148417 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/53f1e522-a732-4821-b7b0-6f1b6670c1d4-multus-daemon-config\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148433 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd6rn\" (UniqueName: \"kubernetes.io/projected/4881e707-c00c-4e49-8e30-a17719e80915-kube-api-access-vd6rn\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148450 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e9d6586-09af-4144-8e5d-01ad9fab33d0-os-release\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " 
pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148445 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e9d6586-09af-4144-8e5d-01ad9fab33d0-system-cni-dir\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148478 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8e9d6586-09af-4144-8e5d-01ad9fab33d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148566 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e9d6586-09af-4144-8e5d-01ad9fab33d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148671 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-var-lib-cni-bin\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148681 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-ovnkube-script-lib\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148750 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e9d6586-09af-4144-8e5d-01ad9fab33d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148846 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-cnibin\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148850 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-env-overrides\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148849 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-host-run-netns\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: 
I1205 20:42:29.148892 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/53f1e522-a732-4821-b7b0-6f1b6670c1d4-multus-socket-dir-parent\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.148935 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-log-socket\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.149118 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-systemd-units\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.149124 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-slash\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.149243 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e9d6586-09af-4144-8e5d-01ad9fab33d0-os-release\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.149250 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85ba28a1-00e9-438e-9b47-6537f75121bb-mcd-auth-proxy-config\") pod \"machine-config-daemon-7lblw\" (UID: \"85ba28a1-00e9-438e-9b47-6537f75121bb\") " pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.149466 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/53f1e522-a732-4821-b7b0-6f1b6670c1d4-multus-daemon-config\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.149685 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53f1e522-a732-4821-b7b0-6f1b6670c1d4-cni-binary-copy\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.149689 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8e9d6586-09af-4144-8e5d-01ad9fab33d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.150891 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4881e707-c00c-4e49-8e30-a17719e80915-ovn-node-metrics-cert\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.152867 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/85ba28a1-00e9-438e-9b47-6537f75121bb-proxy-tls\") pod \"machine-config-daemon-7lblw\" (UID: \"85ba28a1-00e9-438e-9b47-6537f75121bb\") " pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.156265 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.167541 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5j2m\" (UniqueName: \"kubernetes.io/projected/53f1e522-a732-4821-b7b0-6f1b6670c1d4-kube-api-access-b5j2m\") pod \"multus-nm4fd\" (UID: \"53f1e522-a732-4821-b7b0-6f1b6670c1d4\") " pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.168839 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd6rn\" (UniqueName: \"kubernetes.io/projected/4881e707-c00c-4e49-8e30-a17719e80915-kube-api-access-vd6rn\") pod \"ovnkube-node-kf4wd\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.172663 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fnrw\" (UniqueName: \"kubernetes.io/projected/8e9d6586-09af-4144-8e5d-01ad9fab33d0-kube-api-access-8fnrw\") pod \"multus-additional-cni-plugins-zcn6n\" (UID: \"8e9d6586-09af-4144-8e5d-01ad9fab33d0\") " pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.177870 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hrls\" (UniqueName: \"kubernetes.io/projected/85ba28a1-00e9-438e-9b47-6537f75121bb-kube-api-access-4hrls\") pod \"machine-config-daemon-7lblw\" (UID: \"85ba28a1-00e9-438e-9b47-6537f75121bb\") " pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.182161 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.201646 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.201693 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.201702 4747 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.201718 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.201727 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:29Z","lastTransitionTime":"2025-12-05T20:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.211431 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\
"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"l
og-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.234384 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.251009 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.270423 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.303336 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nm4fd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.304470 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.304527 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.304541 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.304563 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.304592 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:29Z","lastTransitionTime":"2025-12-05T20:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.310572 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.319100 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:42:29 crc kubenswrapper[4747]: W1205 20:42:29.319416 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53f1e522_a732_4821_b7b0_6f1b6670c1d4.slice/crio-a84876f51889e790c66677008c10d5528eae96fec64b9dcc07637dda0f20df6f WatchSource:0}: Error finding container a84876f51889e790c66677008c10d5528eae96fec64b9dcc07637dda0f20df6f: Status 404 returned error can't find the container with id a84876f51889e790c66677008c10d5528eae96fec64b9dcc07637dda0f20df6f Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.318025 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints 
registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: W1205 20:42:29.323521 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4881e707_c00c_4e49_8e30_a17719e80915.slice/crio-7322703cc3c0fca5940d7da14e76bfd403fef39c1a8163aa44cfb4499972ec41 WatchSource:0}: Error finding container 7322703cc3c0fca5940d7da14e76bfd403fef39c1a8163aa44cfb4499972ec41: Status 404 returned error can't find the container with id 7322703cc3c0fca5940d7da14e76bfd403fef39c1a8163aa44cfb4499972ec41 Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.323963 4747 util.go:30] "No sandbox for pod can be found. 
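Aside on reading these records: every kubenswrapper line carries a klog header — severity letter (I/W/E/F), MMDD, wall-clock time with microseconds, PID, then source file:line, e.g. `I1205 20:42:29.303336 4747 util.go:30]`. When post-processing a dump like this, that header splits off with one regular expression; a minimal, illustrative Go sketch (not a kubelet component):

```go
package main

import (
	"fmt"
	"regexp"
)

// klogHeader matches the glog/klog prefix the kubelet emits:
//   Lmmdd hh:mm:ss.uuuuuu threadid file:line] message
var klogHeader = regexp.MustCompile(
	`^([IWEF])(\d{2})(\d{2}) (\d{2}:\d{2}:\d{2}\.\d{6})\s+(\d+)\s+([\w.-]+:\d+)\] (.*)$`)

func main() {
	line := `I1205 20:42:29.303336 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nm4fd"`
	m := klogHeader.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("not a klog-formatted line")
		return
	}
	fmt.Printf("severity=%s date=%s-%s time=%s pid=%s source=%s\nmessage=%s\n",
		m[1], m[2], m[3], m[4], m[5], m[6], m[7])
}
```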
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.340366 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: W1205 20:42:29.354769 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85ba28a1_00e9_438e_9b47_6537f75121bb.slice/crio-0df4295d1ac252ae6c232e12956b8a97fc65acf21239f34ca4eafa1342da390b WatchSource:0}: Error finding container 0df4295d1ac252ae6c232e12956b8a97fc65acf21239f34ca4eafa1342da390b: Status 404 returned error can't find the container with id 0df4295d1ac252ae6c232e12956b8a97fc65acf21239f34ca4eafa1342da390b Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.372028 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.412477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.412515 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.412525 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.412541 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.412550 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:29Z","lastTransitionTime":"2025-12-05T20:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.432370 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.444944 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.460093 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:29 crc kubenswrapper[4747]: E1205 20:42:29.460342 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.460341 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: E1205 20:42:29.460450 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:31.460430713 +0000 UTC m=+21.927738191 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.515444 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.515844 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.515856 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.515871 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.515882 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:29Z","lastTransitionTime":"2025-12-05T20:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.561419 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.561556 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.561611 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:29 crc kubenswrapper[4747]: E1205 20:42:29.561635 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:42:31.561606004 +0000 UTC m=+22.028913492 (durationBeforeRetry 2s). 
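The dominant failure in this window is the pod.network-node-identity.openshift.io webhook rejecting every status patch: its serving certificate expired 2025-08-24T17:21:41Z, months before the node clock's 2025-12-05T20:42:29Z. A minimal Go probe (a hypothetical diagnostic, not part of the kubelet) that reads the same NotBefore/NotAfter window the TLS handshake checks:

```go
package main

import (
	"crypto/tls"
	"fmt"
	"os"
	"time"
)

func main() {
	// The webhook endpoint every failed status patch names in the log.
	addr := "127.0.0.1:9743"

	// Skip chain verification so the handshake succeeds even with a
	// bad certificate; we only want to inspect the validity window.
	conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Fprintln(os.Stderr, "dial:", err)
		os.Exit(1)
	}
	defer conn.Close()

	leaf := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Println("subject:   ", leaf.Subject)
	fmt.Println("not before:", leaf.NotBefore.UTC().Format(time.RFC3339))
	fmt.Println("not after: ", leaf.NotAfter.UTC().Format(time.RFC3339))

	// The same comparison that yields "x509: certificate has expired
	// or is not yet valid: current time ... is after ...".
	switch {
	case now.After(leaf.NotAfter):
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), leaf.NotAfter.UTC().Format(time.RFC3339))
	case now.Before(leaf.NotBefore):
		fmt.Println("not yet valid")
	default:
		fmt.Println("certificate is within its validity window")
	}
}
```

This pattern — a CRC VM resumed long after its embedded certificates lapsed — normally clears itself once the cluster's certificate rotation catches up, which is why every status patch in this window fails identically in the meantime.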
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.561681 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:29 crc kubenswrapper[4747]: E1205 20:42:29.561694 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:42:29 crc kubenswrapper[4747]: E1205 20:42:29.561740 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:42:29 crc kubenswrapper[4747]: E1205 20:42:29.561768 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:42:29 crc kubenswrapper[4747]: E1205 20:42:29.561784 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:29 crc kubenswrapper[4747]: E1205 20:42:29.561749 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:31.561733357 +0000 UTC m=+22.029040925 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:42:29 crc kubenswrapper[4747]: E1205 20:42:29.561861 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:31.561844049 +0000 UTC m=+22.029151597 (durationBeforeRetry 2s). 
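The UnmountVolume.TearDown failure just above is a different shape: the CSI driver kubevirt.io.hostpath-provisioner simply has not re-registered with the kubelet since the restart, so the lookup in the registered-drivers list fails. Schematically (toy types and a hypothetical socket path, not the kubelet's actual implementation):

```go
package main

import (
	"fmt"
	"sync"
)

// csiRegistry is a toy version of the kubelet's registered-CSI-drivers
// map: plugins register name -> socket endpoint; teardown lookups fail
// until registration happens.
type csiRegistry struct {
	mu      sync.RWMutex
	drivers map[string]string // driver name -> unix socket endpoint
}

func (r *csiRegistry) register(name, endpoint string) {
	r.mu.Lock()
	defer r.mu.Unlock()
	if r.drivers == nil {
		r.drivers = map[string]string{}
	}
	r.drivers[name] = endpoint
}

func (r *csiRegistry) endpoint(name string) (string, error) {
	r.mu.RLock()
	defer r.mu.RUnlock()
	ep, ok := r.drivers[name]
	if !ok {
		return "", fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
	}
	return ep, nil
}

func main() {
	var reg csiRegistry
	if _, err := reg.endpoint("kubevirt.io.hostpath-provisioner"); err != nil {
		fmt.Println("before registration:", err) // mirrors the log error
	}
	reg.register("kubevirt.io.hostpath-provisioner",
		"/var/lib/kubelet/plugins/csi-hostpath/csi.sock") // hypothetical path
	ep, _ := reg.endpoint("kubevirt.io.hostpath-provisioner")
	fmt.Println("after registration:", ep)
}
```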
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:29 crc kubenswrapper[4747]: E1205 20:42:29.561866 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:42:29 crc kubenswrapper[4747]: E1205 20:42:29.561885 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:42:29 crc kubenswrapper[4747]: E1205 20:42:29.561900 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:29 crc kubenswrapper[4747]: E1205 20:42:29.561946 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:31.561936801 +0000 UTC m=+22.029244339 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.619219 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.619308 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.619320 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.619342 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.619356 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:29Z","lastTransitionTime":"2025-12-05T20:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.721481 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.721520 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.721531 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.721546 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.721558 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:29Z","lastTransitionTime":"2025-12-05T20:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.823892 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.823959 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.823972 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.824013 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.824027 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:29Z","lastTransitionTime":"2025-12-05T20:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.839218 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.839268 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:29 crc kubenswrapper[4747]: E1205 20:42:29.839351 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
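The repeating NodeNotReady condition has a single cause: the container runtime reports NetworkReady=false because /etc/kubernetes/cni/net.d/ contains no CNI configuration until ovnkube-node writes one. A directory check equivalent to that readiness test might look like the following sketch (.conf, .conflist, and .json are the extensions libcni accepts):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether a CNI conf dir contains at least one
// network configuration file.
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	dir := "/etc/kubernetes/cni/net.d" // the path from the log
	ok, err := hasCNIConfig(dir)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	if !ok {
		fmt.Println("no CNI configuration file in", dir, "- has your network provider started?")
	}
}
```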
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:42:29 crc kubenswrapper[4747]: E1205 20:42:29.839439 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.843423 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.844281 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.845179 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.845958 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.846700 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.847322 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.848077 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.848834 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.849592 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.850223 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.850905 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.854227 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.854993 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.855627 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.856831 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.857489 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.858778 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.859321 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.860012 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.861341 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.861942 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.863127 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.863890 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.865208 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.865834 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.866570 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.868420 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.869047 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.870670 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.871294 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.872572 4747 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.872758 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.873275 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.875433 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.876301 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.878486 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.880376 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.881268 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.883800 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.884899 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.887230 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.888180 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.888896 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.890152 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.890967 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.895731 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.896287 4747 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.897408 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.898402 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.899779 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.900347 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.900997 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.902049 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.902834 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.903063 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.903714 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.904315 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" 
path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.931983 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",
\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\
",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.933198 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.933226 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.933235 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.933249 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.933258 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:29Z","lastTransitionTime":"2025-12-05T20:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.951608 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.976652 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.984249 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.989050 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.989301 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.993650 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.998886 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73"} Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.998931 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd"} Dec 05 20:42:29 crc kubenswrapper[4747]: I1205 20:42:29.998940 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"0df4295d1ac252ae6c232e12956b8a97fc65acf21239f34ca4eafa1342da390b"} Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.000891 4747 generic.go:334] "Generic (PLEG): container finished" podID="4881e707-c00c-4e49-8e30-a17719e80915" containerID="496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8" exitCode=0 Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.000962 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerDied","Data":"496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8"} Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.001000 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerStarted","Data":"7322703cc3c0fca5940d7da14e76bfd403fef39c1a8163aa44cfb4499972ec41"} Dec 05 20:42:30 crc 
kubenswrapper[4747]: I1205 20:42:30.002472 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nm4fd" event={"ID":"53f1e522-a732-4821-b7b0-6f1b6670c1d4","Type":"ContainerStarted","Data":"b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d"} Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.002505 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nm4fd" event={"ID":"53f1e522-a732-4821-b7b0-6f1b6670c1d4","Type":"ContainerStarted","Data":"a84876f51889e790c66677008c10d5528eae96fec64b9dcc07637dda0f20df6f"} Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.005147 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.005529 4747 generic.go:334] "Generic (PLEG): container finished" podID="8e9d6586-09af-4144-8e5d-01ad9fab33d0" containerID="c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d" exitCode=0 Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.005571 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" event={"ID":"8e9d6586-09af-4144-8e5d-01ad9fab33d0","Type":"ContainerDied","Data":"c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d"} Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.005633 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" event={"ID":"8e9d6586-09af-4144-8e5d-01ad9fab33d0","Type":"ContainerStarted","Data":"38327e06cb555eafd778d8098f4eedace4f79e47c73a4e6e0e6c4077b8305468"} Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.021148 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.036354 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.036546 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.036560 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.036570 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.036608 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.036620 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:30Z","lastTransitionTime":"2025-12-05T20:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.049333 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.066538 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.079549 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.097021 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.109547 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.121143 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.135622 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.138825 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.138874 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.138883 4747 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.138898 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.138909 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:30Z","lastTransitionTime":"2025-12-05T20:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.149354 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc 
kubenswrapper[4747]: I1205 20:42:30.160265 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.175961 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.185918 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.195338 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.204880 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.213006 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.224835 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.241400 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.242168 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.242202 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.242213 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.242230 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.242242 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:30Z","lastTransitionTime":"2025-12-05T20:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.250775 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.344669 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.344924 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.344933 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.344948 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.344958 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:30Z","lastTransitionTime":"2025-12-05T20:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.447691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.447739 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.447755 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.447777 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.447789 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:30Z","lastTransitionTime":"2025-12-05T20:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.550939 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.550979 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.550989 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.551007 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.551016 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:30Z","lastTransitionTime":"2025-12-05T20:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.652861 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.653142 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.653156 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.653173 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.653185 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:30Z","lastTransitionTime":"2025-12-05T20:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.755576 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.755625 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.755635 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.755649 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.755660 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:30Z","lastTransitionTime":"2025-12-05T20:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.839173 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:30 crc kubenswrapper[4747]: E1205 20:42:30.839278 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.857863 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.857902 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.857911 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.857926 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.857934 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:30Z","lastTransitionTime":"2025-12-05T20:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.959641 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.959683 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.959693 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.959707 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:30 crc kubenswrapper[4747]: I1205 20:42:30.959716 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:30Z","lastTransitionTime":"2025-12-05T20:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.011063 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerStarted","Data":"485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.011107 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerStarted","Data":"e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.011117 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerStarted","Data":"9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.011127 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerStarted","Data":"513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.011136 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerStarted","Data":"effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.011145 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerStarted","Data":"83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.012871 4747 generic.go:334] "Generic (PLEG): container finished" podID="8e9d6586-09af-4144-8e5d-01ad9fab33d0" containerID="90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa" exitCode=0 Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.012947 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" event={"ID":"8e9d6586-09af-4144-8e5d-01ad9fab33d0","Type":"ContainerDied","Data":"90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.017570 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.029907 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.045403 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.062379 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.062422 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.062433 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.062450 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.062461 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:31Z","lastTransitionTime":"2025-12-05T20:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.065240 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.081773 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.094242 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
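
The status payloads in these records are JSON that has been quoted twice, once by the Go logger when it formatted the err string and once in the journal line itself, which is why every quote surfaces as \\\". A rough one-off sketch for recovering the structured patch from a copied fragment — the helper name and the two-pass unescape are assumptions of this sketch, not a general journald parser, and it only handles the quoting depth visible here:

    import codecs, json

    def decode_patch(fragment: str) -> dict:
        # `fragment` is the text between the outer \"...\" quotes of an err=
        # string, copied verbatim from the journal. Each unicode_escape pass
        # strips one layer of backslash quoting; two layers are visible above.
        once = codecs.decode(fragment, "unicode_escape")
        twice = codecs.decode(once, "unicode_escape")
        return json.loads(twice)

Applied to one of the patch bodies above, this yields the metadata.uid, conditions, and containerStatuses objects as plain dicts.
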
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.105988 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
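
Every one of these patches carries the same five-entry conditions list. A minimal helper mirroring the condition shape visible in the payloads (the field names are exactly those in the log; the function itself is illustrative):

    def pod_ready(conditions: list[dict]) -> bool:
        # True only when a condition of type "Ready" reports status "True";
        # the failing pods above all show ContainersNotReady instead.
        return any(c.get("type") == "Ready" and c.get("status") == "True"
                   for c in conditions)
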
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.118631 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.129669 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.140710 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
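
Every patch attempt in this section dies on the same webhook serving certificate, expired since 2025-08-24T17:21:41Z. A minimal sketch of the validity-window check that produces the "certificate has expired or is not yet valid" error, assuming the third-party cryptography package (>= 42 for the *_utc properties) and execution on the node itself so 127.0.0.1:9743 is reachable — this is not the kubelet's actual code path:

    import ssl
    from datetime import datetime, timezone
    from cryptography import x509

    def check_webhook_cert(host: str = "127.0.0.1", port: int = 9743) -> None:
        # Fetch the leaf certificate without verifying it; verification is
        # exactly what fails here, so a verifying client never gets this far.
        pem = ssl.get_server_certificate((host, port))
        cert = x509.load_pem_x509_certificate(pem.encode())
        now = datetime.now(timezone.utc)
        if now > cert.not_valid_after_utc:
            print(f"expired: current time {now:%Y-%m-%dT%H:%M:%SZ} "
                  f"is after {cert.not_valid_after_utc:%Y-%m-%dT%H:%M:%SZ}")
        elif now < cert.not_valid_before_utc:
            print("not yet valid")
        else:
            print("within validity window")
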
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.157299 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z 
is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.164513 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.164550 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.164563 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.164600 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.164613 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:31Z","lastTransitionTime":"2025-12-05T20:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.169308 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699
a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.181293 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.192132 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.203761 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.216148 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.237842 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.255663 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.266891 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.266941 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.266953 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.266971 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.266985 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:31Z","lastTransitionTime":"2025-12-05T20:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.271612 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.282116 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.296246 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.312447 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.322862 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.334508 4747 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.346213 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.356370 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.367364 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.369058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.369086 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.369094 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.369106 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.369115 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:31Z","lastTransitionTime":"2025-12-05T20:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.383513 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\
\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.402992 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.474303 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.474357 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.474374 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.474405 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.474421 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:31Z","lastTransitionTime":"2025-12-05T20:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.484338 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:31 crc kubenswrapper[4747]: E1205 20:42:31.484465 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:42:31 crc kubenswrapper[4747]: E1205 20:42:31.484527 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:35.484510857 +0000 UTC m=+25.951818355 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.577327 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.577384 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.577401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.577424 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.577440 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:31Z","lastTransitionTime":"2025-12-05T20:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.584672 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:42:31 crc kubenswrapper[4747]: E1205 20:42:31.584823 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:42:35.584790869 +0000 UTC m=+26.052098387 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.585015 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.585150 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:31 crc kubenswrapper[4747]: E1205 20:42:31.585164 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:42:31 crc kubenswrapper[4747]: E1205 20:42:31.585286 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:42:31 crc kubenswrapper[4747]: E1205 20:42:31.585309 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:31 crc kubenswrapper[4747]: E1205 20:42:31.585365 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:35.585349461 +0000 UTC m=+26.052656979 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.585249 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:31 crc kubenswrapper[4747]: E1205 20:42:31.585553 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:42:31 crc kubenswrapper[4747]: E1205 20:42:31.585696 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:35.585681918 +0000 UTC m=+26.052989406 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:42:31 crc kubenswrapper[4747]: E1205 20:42:31.585554 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:42:31 crc kubenswrapper[4747]: E1205 20:42:31.585869 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:42:31 crc kubenswrapper[4747]: E1205 20:42:31.585940 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:31 crc kubenswrapper[4747]: E1205 20:42:31.586037 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:35.586025606 +0000 UTC m=+26.053333094 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.657028 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-wbt7t"] Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.657620 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wbt7t" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.659658 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.660284 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.659839 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.660236 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.679040 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.679078 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.679089 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.679108 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.679121 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:31Z","lastTransitionTime":"2025-12-05T20:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.680499 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.694469 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.707308 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.718214 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.734452 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.747666 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift
-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.758944 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.770879 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.782043 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.782304 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.782317 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.782336 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.782347 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:31Z","lastTransitionTime":"2025-12-05T20:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.787946 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.788238 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw5kq\" (UniqueName: \"kubernetes.io/projected/92e0fc38-67be-4f7f-8fdb-187ce47fc0d9-kube-api-access-sw5kq\") pod \"node-ca-wbt7t\" (UID: \"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\") " pod="openshift-image-registry/node-ca-wbt7t" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.788318 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/92e0fc38-67be-4f7f-8fdb-187ce47fc0d9-host\") pod \"node-ca-wbt7t\" (UID: \"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\") " pod="openshift-image-registry/node-ca-wbt7t" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.788389 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/92e0fc38-67be-4f7f-8fdb-187ce47fc0d9-serviceca\") pod \"node-ca-wbt7t\" (UID: \"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\") " pod="openshift-image-registry/node-ca-wbt7t" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.800889 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.817240 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.828756 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.839435 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.839468 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:31 crc kubenswrapper[4747]: E1205 20:42:31.839570 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:42:31 crc kubenswrapper[4747]: E1205 20:42:31.839838 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.846961 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.864655 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z 
is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.875988 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:31Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.885005 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.885052 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.885061 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.885077 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.885086 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:31Z","lastTransitionTime":"2025-12-05T20:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.889801 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/92e0fc38-67be-4f7f-8fdb-187ce47fc0d9-serviceca\") pod \"node-ca-wbt7t\" (UID: \"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\") " pod="openshift-image-registry/node-ca-wbt7t" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.889865 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw5kq\" (UniqueName: \"kubernetes.io/projected/92e0fc38-67be-4f7f-8fdb-187ce47fc0d9-kube-api-access-sw5kq\") pod \"node-ca-wbt7t\" (UID: \"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\") " pod="openshift-image-registry/node-ca-wbt7t" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.889899 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/92e0fc38-67be-4f7f-8fdb-187ce47fc0d9-host\") pod \"node-ca-wbt7t\" (UID: \"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\") " pod="openshift-image-registry/node-ca-wbt7t" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.889962 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/92e0fc38-67be-4f7f-8fdb-187ce47fc0d9-host\") pod \"node-ca-wbt7t\" (UID: \"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\") " pod="openshift-image-registry/node-ca-wbt7t" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.897339 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/92e0fc38-67be-4f7f-8fdb-187ce47fc0d9-serviceca\") pod \"node-ca-wbt7t\" (UID: \"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\") " pod="openshift-image-registry/node-ca-wbt7t" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.909261 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw5kq\" (UniqueName: \"kubernetes.io/projected/92e0fc38-67be-4f7f-8fdb-187ce47fc0d9-kube-api-access-sw5kq\") pod \"node-ca-wbt7t\" (UID: \"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\") " pod="openshift-image-registry/node-ca-wbt7t" Dec 05 20:42:31 crc 
kubenswrapper[4747]: I1205 20:42:31.974087 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wbt7t" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.987753 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.987798 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.987810 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.987828 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:31 crc kubenswrapper[4747]: I1205 20:42:31.987841 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:31Z","lastTransitionTime":"2025-12-05T20:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.020949 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wbt7t" event={"ID":"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9","Type":"ContainerStarted","Data":"858371801b754b338dff16000893189975e84bdd77bfda6f58deb7e8bfa1763b"} Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.022865 4747 generic.go:334] "Generic (PLEG): container finished" podID="8e9d6586-09af-4144-8e5d-01ad9fab33d0" containerID="0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df" exitCode=0 Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.023011 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" event={"ID":"8e9d6586-09af-4144-8e5d-01ad9fab33d0","Type":"ContainerDied","Data":"0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df"} Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.042507 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.054981 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.068362 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.082160 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.093598 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.093674 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.093690 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.093715 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.093733 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:32Z","lastTransitionTime":"2025-12-05T20:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.111918 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.134000 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.152300 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.188776 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.196230 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.196261 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.196271 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.196286 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.196296 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:32Z","lastTransitionTime":"2025-12-05T20:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.217842 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.231176 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.252310 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.270511 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.285125 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.294772 4747 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.298159 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.298199 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.298209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.298226 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.298236 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:32Z","lastTransitionTime":"2025-12-05T20:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.307100 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:32Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.400834 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.401165 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.401174 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.401187 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.401196 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:32Z","lastTransitionTime":"2025-12-05T20:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.503178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.503228 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.503240 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.503260 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.503272 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:32Z","lastTransitionTime":"2025-12-05T20:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.606184 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.606234 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.606245 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.606260 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.606278 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:32Z","lastTransitionTime":"2025-12-05T20:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.708459 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.708513 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.708527 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.708546 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.708564 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:32Z","lastTransitionTime":"2025-12-05T20:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.810857 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.810892 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.810901 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.810915 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.810926 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:32Z","lastTransitionTime":"2025-12-05T20:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.839470 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:32 crc kubenswrapper[4747]: E1205 20:42:32.839641 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.913067 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.913132 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.913147 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.913194 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:32 crc kubenswrapper[4747]: I1205 20:42:32.913206 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:32Z","lastTransitionTime":"2025-12-05T20:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.015948 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.015987 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.016001 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.016019 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.016034 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:33Z","lastTransitionTime":"2025-12-05T20:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.030178 4747 generic.go:334] "Generic (PLEG): container finished" podID="8e9d6586-09af-4144-8e5d-01ad9fab33d0" containerID="948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2" exitCode=0 Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.030269 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" event={"ID":"8e9d6586-09af-4144-8e5d-01ad9fab33d0","Type":"ContainerDied","Data":"948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2"} Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.035135 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wbt7t" event={"ID":"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9","Type":"ContainerStarted","Data":"8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2"} Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.050723 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPa
th\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.067559 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.102365 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.117666 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.120328 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.120369 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.120391 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.120416 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.120434 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:33Z","lastTransitionTime":"2025-12-05T20:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.135943 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.155139 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.173692 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.189991 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.203644 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.223381 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.223423 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.223444 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.223463 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.223478 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:33Z","lastTransitionTime":"2025-12-05T20:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.224727 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3
d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.238987 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.250492 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.262854 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.274374 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.287283 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.301100 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.316265 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.326039 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.326072 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.326082 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.326095 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.326103 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:33Z","lastTransitionTime":"2025-12-05T20:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.328246 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.341178 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.364235 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z 
is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.379745 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.394202 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.411567 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.427558 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.429635 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.429687 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.429701 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.429722 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.429735 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:33Z","lastTransitionTime":"2025-12-05T20:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.441447 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.457188 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.474522 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnib
in\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.496365 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.511401 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.522955 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.531484 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.531527 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.531537 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.531550 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.531560 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:33Z","lastTransitionTime":"2025-12-05T20:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.634009 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.634046 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.634054 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.634067 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.634076 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:33Z","lastTransitionTime":"2025-12-05T20:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.736353 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.736391 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.736401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.736428 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.736437 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:33Z","lastTransitionTime":"2025-12-05T20:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.839162 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.839202 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.839213 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.839231 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.839244 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:33Z","lastTransitionTime":"2025-12-05T20:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.839820 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:33 crc kubenswrapper[4747]: E1205 20:42:33.839919 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.840592 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:33 crc kubenswrapper[4747]: E1205 20:42:33.840669 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.941741 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.941789 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.941801 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.941821 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:33 crc kubenswrapper[4747]: I1205 20:42:33.941835 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:33Z","lastTransitionTime":"2025-12-05T20:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.042324 4747 generic.go:334] "Generic (PLEG): container finished" podID="8e9d6586-09af-4144-8e5d-01ad9fab33d0" containerID="0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a" exitCode=0 Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.042420 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" event={"ID":"8e9d6586-09af-4144-8e5d-01ad9fab33d0","Type":"ContainerDied","Data":"0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a"} Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.043253 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.043397 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.043414 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.043431 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.043443 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:34Z","lastTransitionTime":"2025-12-05T20:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.047701 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerStarted","Data":"9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027"} Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.068610 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:34Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.086062 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:34Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.097511 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:34Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.111324 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:34Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.131426 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:34Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.142457 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:34Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.146649 4747 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.146683 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.146691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.146705 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.146715 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:34Z","lastTransitionTime":"2025-12-05T20:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.162524 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:34Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.174499 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:34Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.186410 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:34Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.199298 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:34Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.213451 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:34Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.226219 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:34Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.238469 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:34Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.249304 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.249372 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.249381 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.249397 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.249423 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:34Z","lastTransitionTime":"2025-12-05T20:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.252400 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:34Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.263799 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:34Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.353195 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.353616 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.353630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.353648 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.353660 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:34Z","lastTransitionTime":"2025-12-05T20:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.457398 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.457453 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.457470 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.457518 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.457534 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:34Z","lastTransitionTime":"2025-12-05T20:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.560821 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.560890 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.560914 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.560944 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.560964 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:34Z","lastTransitionTime":"2025-12-05T20:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.664068 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.664129 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.664146 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.664168 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.664186 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:34Z","lastTransitionTime":"2025-12-05T20:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.767745 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.767809 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.767825 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.767849 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.767866 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:34Z","lastTransitionTime":"2025-12-05T20:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.838851 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:34 crc kubenswrapper[4747]: E1205 20:42:34.839084 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.871286 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.871367 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.871386 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.871415 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.871434 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:34Z","lastTransitionTime":"2025-12-05T20:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.974074 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.974349 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.974415 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.974474 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:34 crc kubenswrapper[4747]: I1205 20:42:34.974536 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:34Z","lastTransitionTime":"2025-12-05T20:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.063223 4747 generic.go:334] "Generic (PLEG): container finished" podID="8e9d6586-09af-4144-8e5d-01ad9fab33d0" containerID="729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4" exitCode=0 Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.063306 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" event={"ID":"8e9d6586-09af-4144-8e5d-01ad9fab33d0","Type":"ContainerDied","Data":"729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4"} Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.077290 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.077364 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.077391 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.077423 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.077479 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:35Z","lastTransitionTime":"2025-12-05T20:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.102216 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.130278 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.147841 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.168161 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.179653 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.179686 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.179694 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.179710 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.179720 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:35Z","lastTransitionTime":"2025-12-05T20:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.189641 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2
a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.209529 4747 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.225029 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.241509 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.254513 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.268396 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.278181 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.286152 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.286189 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.286199 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.286214 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.286222 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:35Z","lastTransitionTime":"2025-12-05T20:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.289139 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.306431 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:35Z 
is after 2025-08-24T17:21:41Z" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.317617 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.326169 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:35Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.389189 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.389237 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.389248 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.389267 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.389280 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:35Z","lastTransitionTime":"2025-12-05T20:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.492663 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.492715 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.492730 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.492752 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.492769 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:35Z","lastTransitionTime":"2025-12-05T20:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.526256 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:35 crc kubenswrapper[4747]: E1205 20:42:35.526451 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:42:35 crc kubenswrapper[4747]: E1205 20:42:35.526577 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:43.526549561 +0000 UTC m=+33.993857109 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.595681 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.595723 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.595732 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.595747 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.595756 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:35Z","lastTransitionTime":"2025-12-05T20:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.627400 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.627640 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:35 crc kubenswrapper[4747]: E1205 20:42:35.627662 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:42:43.62762573 +0000 UTC m=+34.094933218 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.627710 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.627784 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:42:35 crc kubenswrapper[4747]: E1205 20:42:35.627838 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 20:42:35 crc kubenswrapper[4747]: E1205 20:42:35.627878 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 20:42:35 crc kubenswrapper[4747]: E1205 20:42:35.627895 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 20:42:35 crc kubenswrapper[4747]: E1205 20:42:35.627921 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 20:42:35 crc kubenswrapper[4747]: E1205 20:42:35.627966 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:43.627945707 +0000 UTC m=+34.095253205 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 20:42:35 crc kubenswrapper[4747]: E1205 20:42:35.627992 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:43.627981878 +0000 UTC m=+34.095289376 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 20:42:35 crc kubenswrapper[4747]: E1205 20:42:35.628033 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 20:42:35 crc kubenswrapper[4747]: E1205 20:42:35.628066 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 20:42:35 crc kubenswrapper[4747]: E1205 20:42:35.628088 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 20:42:35 crc kubenswrapper[4747]: E1205 20:42:35.628175 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:43.628151691 +0000 UTC m=+34.095459209 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.698221 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.698253 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.698262 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.698274 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.698282 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:35Z","lastTransitionTime":"2025-12-05T20:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.801923 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.802356 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.802375 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.802400 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.802416 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:35Z","lastTransitionTime":"2025-12-05T20:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.839429 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.839463 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:42:35 crc kubenswrapper[4747]: E1205 20:42:35.839662 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:42:35 crc kubenswrapper[4747]: E1205 20:42:35.839795 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.905620 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.905687 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.905700 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.905724 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:42:35 crc kubenswrapper[4747]: I1205 20:42:35.905742 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:35Z","lastTransitionTime":"2025-12-05T20:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.008161 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.008210 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.008224 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.008241 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.008254 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:36Z","lastTransitionTime":"2025-12-05T20:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.071807 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerStarted","Data":"062a97cc46d68b4e8401c19fe0cb697775c94746ad42e177f6c7eeda62a7ab02"} Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.072178 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.072203 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.080272 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" event={"ID":"8e9d6586-09af-4144-8e5d-01ad9fab33d0","Type":"ContainerStarted","Data":"9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836"} Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.104757 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.107836 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://062a97cc46d68b4e8401c19fe0cb697775c94746
ad42e177f6c7eeda62a7ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.108148 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.110279 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.110318 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.110330 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.110346 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.110358 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:36Z","lastTransitionTime":"2025-12-05T20:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.127723 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.142071 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.160198 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.172890 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.192105 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.207923 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 
2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.212748 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.212785 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.212799 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.212815 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.212827 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:36Z","lastTransitionTime":"2025-12-05T20:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.228354 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.260433 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.280174 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.299249 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.313472 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.315494 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.315554 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.315570 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.315611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.315630 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:36Z","lastTransitionTime":"2025-12-05T20:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.335365 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.354504 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.368321 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.381551 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.404407 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.1
26.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.418268 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.418324 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.418341 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.418362 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.418378 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:36Z","lastTransitionTime":"2025-12-05T20:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.435038 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.449775 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.471577 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.489142 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.511269 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.521250 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.521311 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.521329 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.521352 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.521372 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:36Z","lastTransitionTime":"2025-12-05T20:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.527400 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.550207 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.589406 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://062a97cc46d68b4e8401c19fe0cb697775c94746ad42e177f6c7eeda62a7ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z"
Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.609788 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.624991 4747 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.625065 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.625079 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.625099 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.625111 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:36Z","lastTransitionTime":"2025-12-05T20:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.628355 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.647429 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.662708 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.678847 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:36Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.728040 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.728085 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.728096 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.728112 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.728122 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:36Z","lastTransitionTime":"2025-12-05T20:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.830962 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.831034 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.831059 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.831090 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.831115 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:36Z","lastTransitionTime":"2025-12-05T20:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.839780 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:36 crc kubenswrapper[4747]: E1205 20:42:36.839969 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.934042 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.934143 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.934172 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.934205 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:36 crc kubenswrapper[4747]: I1205 20:42:36.934234 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:36Z","lastTransitionTime":"2025-12-05T20:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.038261 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.038326 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.038344 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.038371 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.038390 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:37Z","lastTransitionTime":"2025-12-05T20:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.084093 4747 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.141869 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.141916 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.141932 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.141958 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.141975 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:37Z","lastTransitionTime":"2025-12-05T20:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.244274 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.244315 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.244322 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.244337 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.244347 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:37Z","lastTransitionTime":"2025-12-05T20:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.347137 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.347174 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.347184 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.347198 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.347207 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:37Z","lastTransitionTime":"2025-12-05T20:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.449931 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.449965 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.449975 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.449990 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.450002 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:37Z","lastTransitionTime":"2025-12-05T20:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.553344 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.553400 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.553425 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.553457 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.553478 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:37Z","lastTransitionTime":"2025-12-05T20:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.657184 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.657223 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.657236 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.657253 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.657266 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:37Z","lastTransitionTime":"2025-12-05T20:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.760067 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.760114 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.760126 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.760144 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.760156 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:37Z","lastTransitionTime":"2025-12-05T20:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.839077 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.839172 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:37 crc kubenswrapper[4747]: E1205 20:42:37.839256 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:42:37 crc kubenswrapper[4747]: E1205 20:42:37.839326 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.863246 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.863280 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.863291 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.863310 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.863325 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:37Z","lastTransitionTime":"2025-12-05T20:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.966882 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.966952 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.966970 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.966996 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:37 crc kubenswrapper[4747]: I1205 20:42:37.967016 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:37Z","lastTransitionTime":"2025-12-05T20:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.070138 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.070186 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.070200 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.070234 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.070254 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:38Z","lastTransitionTime":"2025-12-05T20:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.089998 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovnkube-controller/0.log" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.093724 4747 generic.go:334] "Generic (PLEG): container finished" podID="4881e707-c00c-4e49-8e30-a17719e80915" containerID="062a97cc46d68b4e8401c19fe0cb697775c94746ad42e177f6c7eeda62a7ab02" exitCode=1 Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.093794 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerDied","Data":"062a97cc46d68b4e8401c19fe0cb697775c94746ad42e177f6c7eeda62a7ab02"} Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.094844 4747 scope.go:117] "RemoveContainer" containerID="062a97cc46d68b4e8401c19fe0cb697775c94746ad42e177f6c7eeda62a7ab02" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.131384 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.146889 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.169340 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.174356 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.174407 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.174422 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.174446 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.174463 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:38Z","lastTransitionTime":"2025-12-05T20:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.185265 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.201381 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.218210 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.232864 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.245590 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.257105 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.273107 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.277369 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.277401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.277411 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.277423 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.277434 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:38Z","lastTransitionTime":"2025-12-05T20:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.288568 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.304956 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.324953 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://062a97cc46d68b4e8401c19fe0cb697775c94746
ad42e177f6c7eeda62a7ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://062a97cc46d68b4e8401c19fe0cb697775c94746ad42e177f6c7eeda62a7ab02\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:37Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI1205 20:42:37.757495 6012 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:42:37.759047 6012 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:42:37.759097 6012 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:42:37.759116 6012 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:42:37.759123 6012 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:42:37.759139 6012 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:42:37.759145 6012 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:42:37.759146 6012 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:42:37.759166 6012 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:42:37.759179 6012 factory.go:656] Stopping watch factory\\\\nI1205 20:42:37.759199 6012 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:42:37.759193 6012 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:42:37.759191 6012 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:42:37.759213 6012 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.340533 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.350626 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.380343 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.380376 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.380385 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.380400 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.380411 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:38Z","lastTransitionTime":"2025-12-05T20:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.488663 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.488735 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.488748 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.488784 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.488795 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:38Z","lastTransitionTime":"2025-12-05T20:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.522145 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.522209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.522221 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.522241 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.522255 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:38Z","lastTransitionTime":"2025-12-05T20:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:38 crc kubenswrapper[4747]: E1205 20:42:38.553494 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.559153 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.559201 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.559215 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.559232 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.559242 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:38Z","lastTransitionTime":"2025-12-05T20:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:38 crc kubenswrapper[4747]: E1205 20:42:38.571087 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.576527 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.576590 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.576601 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.576619 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.576630 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:38Z","lastTransitionTime":"2025-12-05T20:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:38 crc kubenswrapper[4747]: E1205 20:42:38.588200 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.591773 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.591826 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.591837 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.591858 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.591870 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:38Z","lastTransitionTime":"2025-12-05T20:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:38 crc kubenswrapper[4747]: E1205 20:42:38.604198 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.609998 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.610054 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.610068 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.610086 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.610098 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:38Z","lastTransitionTime":"2025-12-05T20:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:38 crc kubenswrapper[4747]: E1205 20:42:38.621379 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:38 crc kubenswrapper[4747]: E1205 20:42:38.621504 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.623342 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.623376 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.623389 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.623402 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.623412 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:38Z","lastTransitionTime":"2025-12-05T20:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.725810 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.725851 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.725863 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.725880 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.725890 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:38Z","lastTransitionTime":"2025-12-05T20:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.828462 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.828510 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.828521 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.828539 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.828550 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:38Z","lastTransitionTime":"2025-12-05T20:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.838675 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:38 crc kubenswrapper[4747]: E1205 20:42:38.838799 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.931257 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.931314 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.931325 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.931352 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:38 crc kubenswrapper[4747]: I1205 20:42:38.931364 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:38Z","lastTransitionTime":"2025-12-05T20:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.034142 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.034208 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.034226 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.034251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.034268 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:39Z","lastTransitionTime":"2025-12-05T20:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.100153 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovnkube-controller/0.log" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.103562 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerStarted","Data":"d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e"} Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.103700 4747 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.121424 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.132749 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.137392 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.137437 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.137453 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.137474 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.137490 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:39Z","lastTransitionTime":"2025-12-05T20:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.149509 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.175444 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/r
un/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://062a97cc46d68b4e8401c19fe0cb697775c94746ad42e177f6c7eeda62a7ab02\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:37Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI1205 20:42:37.757495 6012 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:42:37.759047 6012 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:42:37.759097 6012 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:42:37.759116 6012 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:42:37.759123 6012 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:42:37.759139 6012 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:42:37.759145 6012 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:42:37.759146 6012 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:42:37.759166 6012 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:42:37.759179 6012 factory.go:656] Stopping watch factory\\\\nI1205 20:42:37.759199 6012 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:42:37.759193 6012 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 
20:42:37.759191 6012 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:42:37.759213 6012 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.189975 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.202675 4747 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.230974 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.239280 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.239337 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.239355 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.239380 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.239399 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:39Z","lastTransitionTime":"2025-12-05T20:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.250671 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.268558 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.292248 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.314798 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"start
edAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.1
68.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.335422 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13
e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" 
not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.342104 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.342158 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.342175 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.342196 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.342215 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:39Z","lastTransitionTime":"2025-12-05T20:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.352134 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.367659 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.383688 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.445020 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.445077 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.445095 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.445120 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.445173 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:39Z","lastTransitionTime":"2025-12-05T20:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.549344 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.549946 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.550040 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.550084 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.550111 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:39Z","lastTransitionTime":"2025-12-05T20:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.653643 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.653717 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.653738 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.653774 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.653811 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:39Z","lastTransitionTime":"2025-12-05T20:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.756046 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.756100 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.756122 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.756145 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.756161 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:39Z","lastTransitionTime":"2025-12-05T20:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.839768 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.839911 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:39 crc kubenswrapper[4747]: E1205 20:42:39.839981 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:42:39 crc kubenswrapper[4747]: E1205 20:42:39.840160 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.858994 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.859070 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.859089 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.859114 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.859035 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.859131 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:39Z","lastTransitionTime":"2025-12-05T20:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.876885 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.898926 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.917300 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.930931 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.942909 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.958946 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.966108 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.966150 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.966169 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.966191 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.966207 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:39Z","lastTransitionTime":"2025-12-05T20:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.968628 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:39 crc kubenswrapper[4747]: I1205 20:42:39.986601 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:39Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.002399 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad
aafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://062a97cc46d68b4e8401c19fe0cb697775c94746ad42e177f6c7eeda62a7ab02\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:37Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI1205 20:42:37.757495 6012 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:42:37.759047 6012 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:42:37.759097 6012 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:42:37.759116 6012 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:42:37.759123 6012 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:42:37.759139 6012 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:42:37.759145 6012 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:42:37.759146 6012 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:42:37.759166 6012 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:42:37.759179 6012 factory.go:656] Stopping watch factory\\\\nI1205 20:42:37.759199 6012 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:42:37.759193 6012 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:42:37.759191 6012 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:42:37.759213 6012 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.024630 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.045493 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.056686 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.069702 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.069757 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.069777 4747 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.069799 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.069815 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:40Z","lastTransitionTime":"2025-12-05T20:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.074560 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.085307 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.108878 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovnkube-controller/1.log" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.109567 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovnkube-controller/0.log" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.112303 4747 generic.go:334] "Generic (PLEG): container finished" podID="4881e707-c00c-4e49-8e30-a17719e80915" containerID="d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e" exitCode=1 Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.112338 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerDied","Data":"d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e"} Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.112392 4747 scope.go:117] "RemoveContainer" containerID="062a97cc46d68b4e8401c19fe0cb697775c94746ad42e177f6c7eeda62a7ab02" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.113619 4747 scope.go:117] "RemoveContainer" containerID="d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e" Dec 05 20:42:40 crc kubenswrapper[4747]: E1205 20:42:40.113968 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.134479 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference 
(falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.147379 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.161336 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.172119 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.172168 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.172185 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.172208 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.172224 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:40Z","lastTransitionTime":"2025-12-05T20:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.173996 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.187661 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.198225 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.211707 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.234980 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad
aafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://062a97cc46d68b4e8401c19fe0cb697775c94746ad42e177f6c7eeda62a7ab02\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:37Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI1205 20:42:37.757495 6012 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:42:37.759047 6012 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:42:37.759097 6012 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:42:37.759116 6012 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:42:37.759123 6012 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:42:37.759139 6012 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:42:37.759145 6012 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:42:37.759146 6012 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:42:37.759166 6012 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:42:37.759179 6012 factory.go:656] Stopping watch factory\\\\nI1205 20:42:37.759199 6012 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:42:37.759193 6012 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:42:37.759191 6012 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:42:37.759213 6012 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:39Z\\\",\\\"message\\\":\\\"place-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:42:38.929300 6136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z]\\\\nI1205 
20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.247092 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.259067 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.274534 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.274593 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.274606 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.274624 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.274638 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:40Z","lastTransitionTime":"2025-12-05T20:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.285937 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.297894 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.308061 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.318428 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.338506 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"start
edAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.1
68.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.378087 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.378129 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.378140 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.378158 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.378169 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:40Z","lastTransitionTime":"2025-12-05T20:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.481252 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.481291 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.481300 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.481314 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.481323 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:40Z","lastTransitionTime":"2025-12-05T20:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.585159 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.585237 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.585255 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.585280 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.585295 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:40Z","lastTransitionTime":"2025-12-05T20:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.688449 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.688558 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.688577 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.688639 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.688662 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:40Z","lastTransitionTime":"2025-12-05T20:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.791736 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.791785 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.791797 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.791816 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.791828 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:40Z","lastTransitionTime":"2025-12-05T20:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.838706 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:40 crc kubenswrapper[4747]: E1205 20:42:40.838903 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.895254 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.895327 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.895351 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.895379 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.895404 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:40Z","lastTransitionTime":"2025-12-05T20:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.999397 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.999437 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.999448 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.999462 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:40 crc kubenswrapper[4747]: I1205 20:42:40.999472 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:40Z","lastTransitionTime":"2025-12-05T20:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.007734 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7"] Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.008666 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.015639 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.015871 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.030450 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.046443 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.057164 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.074870 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.084255 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/14e1a6fd-3dc9-4ea2-b14f-afb176512c74-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2gqg7\" (UID: \"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.084457 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/14e1a6fd-3dc9-4ea2-b14f-afb176512c74-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2gqg7\" (UID: \"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.084512 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgkdm\" (UniqueName: \"kubernetes.io/projected/14e1a6fd-3dc9-4ea2-b14f-afb176512c74-kube-api-access-mgkdm\") pod \"ovnkube-control-plane-749d76644c-2gqg7\" (UID: \"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.084698 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/14e1a6fd-3dc9-4ea2-b14f-afb176512c74-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2gqg7\" (UID: \"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.085969 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.102445 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.102499 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.102516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.102535 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.102548 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:41Z","lastTransitionTime":"2025-12-05T20:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.103973 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"moun
tPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.118230 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovnkube-controller/1.log" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.124462 4747 scope.go:117] "RemoveContainer" containerID="d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e" Dec 05 20:42:41 crc kubenswrapper[4747]: E1205 20:42:41.124820 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.128147 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2ad
c9dbbf4fddf963262f18459e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://062a97cc46d68b4e8401c19fe0cb697775c94746ad42e177f6c7eeda62a7ab02\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:37Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI1205 20:42:37.757495 6012 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:42:37.759047 6012 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:42:37.759097 6012 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:42:37.759116 6012 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:42:37.759123 6012 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:42:37.759139 6012 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:42:37.759145 6012 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:42:37.759146 6012 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:42:37.759166 6012 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:42:37.759179 6012 factory.go:656] Stopping watch factory\\\\nI1205 20:42:37.759199 6012 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:42:37.759193 6012 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:42:37.759191 6012 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:42:37.759213 6012 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:39Z\\\",\\\"message\\\":\\\"place-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:42:38.929300 6136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed 
to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z]\\\\nI1205 20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.144542 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.165011 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.176917 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.185665 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/14e1a6fd-3dc9-4ea2-b14f-afb176512c74-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2gqg7\" (UID: \"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" Dec 05 20:42:41 crc 
kubenswrapper[4747]: I1205 20:42:41.185702 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgkdm\" (UniqueName: \"kubernetes.io/projected/14e1a6fd-3dc9-4ea2-b14f-afb176512c74-kube-api-access-mgkdm\") pod \"ovnkube-control-plane-749d76644c-2gqg7\" (UID: \"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.185734 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/14e1a6fd-3dc9-4ea2-b14f-afb176512c74-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2gqg7\" (UID: \"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.185796 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/14e1a6fd-3dc9-4ea2-b14f-afb176512c74-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2gqg7\" (UID: \"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.186655 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/14e1a6fd-3dc9-4ea2-b14f-afb176512c74-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2gqg7\" (UID: \"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.186885 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/14e1a6fd-3dc9-4ea2-b14f-afb176512c74-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2gqg7\" (UID: \"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.188386 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.194855 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/14e1a6fd-3dc9-4ea2-b14f-afb176512c74-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2gqg7\" (UID: \"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.199954 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.203406 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgkdm\" (UniqueName: \"kubernetes.io/projected/14e1a6fd-3dc9-4ea2-b14f-afb176512c74-kube-api-access-mgkdm\") pod \"ovnkube-control-plane-749d76644c-2gqg7\" (UID: \"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.209507 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.209629 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.209656 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.209721 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.209742 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:41Z","lastTransitionTime":"2025-12-05T20:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.218226 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.251344 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.282618 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.295386 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.306021 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.312494 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.312535 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.312545 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.312597 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.312609 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:41Z","lastTransitionTime":"2025-12-05T20:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.315225 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.324953 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.327750 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" Dec 05 20:42:41 crc kubenswrapper[4747]: W1205 20:42:41.338354 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14e1a6fd_3dc9_4ea2_b14f_afb176512c74.slice/crio-1551d98a80c8a52db0425b0cf54ea8977ba62ff07e36865c294238b8f665346f WatchSource:0}: Error finding container 1551d98a80c8a52db0425b0cf54ea8977ba62ff07e36865c294238b8f665346f: Status 404 returned error can't find the container with id 1551d98a80c8a52db0425b0cf54ea8977ba62ff07e36865c294238b8f665346f Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.345284 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2ad
c9dbbf4fddf963262f18459e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:39Z\\\",\\\"message\\\":\\\"place-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:42:38.929300 6136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z]\\\\nI1205 20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.366124 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.377346 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.392083 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.410660 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.414310 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.414339 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.414362 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.414376 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.414385 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:41Z","lastTransitionTime":"2025-12-05T20:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.423164 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.434555 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.447338 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.458806 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"start
edAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.1
68.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.474278 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13
e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" 
not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.489633 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.501774 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.513118 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:41Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.516943 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.516982 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.516995 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.517014 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.517025 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:41Z","lastTransitionTime":"2025-12-05T20:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.619972 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.620202 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.620265 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.620330 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.620407 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:41Z","lastTransitionTime":"2025-12-05T20:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.723348 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.723788 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.724037 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.724225 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.724410 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:41Z","lastTransitionTime":"2025-12-05T20:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.826443 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.827024 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.827117 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.827222 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.827314 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:41Z","lastTransitionTime":"2025-12-05T20:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.839223 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:41 crc kubenswrapper[4747]: E1205 20:42:41.839354 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.839430 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:41 crc kubenswrapper[4747]: E1205 20:42:41.839979 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.930126 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.930415 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.930487 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.930546 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:41 crc kubenswrapper[4747]: I1205 20:42:41.930625 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:41Z","lastTransitionTime":"2025-12-05T20:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.033160 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.033217 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.033230 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.033249 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.033261 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:42Z","lastTransitionTime":"2025-12-05T20:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.128320 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" event={"ID":"14e1a6fd-3dc9-4ea2-b14f-afb176512c74","Type":"ContainerStarted","Data":"6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732"} Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.128364 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" event={"ID":"14e1a6fd-3dc9-4ea2-b14f-afb176512c74","Type":"ContainerStarted","Data":"96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b"} Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.128374 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" event={"ID":"14e1a6fd-3dc9-4ea2-b14f-afb176512c74","Type":"ContainerStarted","Data":"1551d98a80c8a52db0425b0cf54ea8977ba62ff07e36865c294238b8f665346f"} Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.137763 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.137846 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.137878 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.137912 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.137935 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:42Z","lastTransitionTime":"2025-12-05T20:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.145115 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.165088 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.203752 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad
aafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:39Z\\\",\\\"message\\\":\\\"place-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:42:38.929300 6136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z]\\\\nI1205 
20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.219158 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.237433 4747 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.240945 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.240993 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.241005 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.241023 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.241034 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:42Z","lastTransitionTime":"2025-12-05T20:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.251408 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.263855 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.274436 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.284798 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.295596 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.313982 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"start
edAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.1
68.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.336086 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092
272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.343252 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.343293 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.343304 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.343319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.343331 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:42Z","lastTransitionTime":"2025-12-05T20:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
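
The Ready=False condition recorded above is the kubelet's network-readiness gate: the container runtime keeps reporting NetworkReady=false until at least one CNI network configuration appears in /etc/kubernetes/cni/net.d/, and at this point in the boot nothing has written one yet. A minimal Go sketch of that gate, assuming only the conf directory named in the message (the real kubelet/CRI-O path goes through libcni and also validates the parsed config, which is omitted here):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // networkReady loosely mirrors the check behind "NetworkReady=false":
    // report ready only if the conf dir holds at least one CNI config file.
    func networkReady(confDir string) (bool, error) {
        entries, err := os.ReadDir(confDir)
        if err != nil {
            return false, err
        }
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json": // extensions libcni scans for
                return true, nil
            }
        }
        return false, nil
    }

    func main() {
        ready, err := networkReady("/etc/kubernetes/cni/net.d")
        if err != nil || !ready {
            fmt.Println("container runtime network not ready: NetworkReady=false (no CNI configuration file)")
            return
        }
        fmt.Println("NetworkReady=true")
    }

Once ovn-kubernetes (or whichever network provider is in use) drops a config into that directory, the gate flips and the node can report Ready again.
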
Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.351763 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.369691 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.388239 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.407699 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.445891 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.445924 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.445932 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.445953 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.445963 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:42Z","lastTransitionTime":"2025-12-05T20:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
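
Every failed status patch in this stretch dies the same way: the kubelet's PATCH is routed through the pod.network-node-identity.openshift.io mutating webhook at https://127.0.0.1:9743, and the TLS handshake is rejected because the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-05. A short Go sketch of the validity-window check that is failing; the /etc/webhook-cert/tls.crt path is an assumption taken from the "webhook-cert" volumeMount visible in the network-node-identity pod status above, not something the log states directly:

    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "log"
        "os"
        "time"
    )

    func main() {
        // Assumed location of the webhook serving cert (from the pod's volumeMounts).
        data, err := os.ReadFile("/etc/webhook-cert/tls.crt")
        if err != nil {
            log.Fatal(err)
        }
        block, _ := pem.Decode(data)
        if block == nil {
            log.Fatal("no PEM block in tls.crt")
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            log.Fatal(err)
        }
        // The same window check crypto/x509 applies during verification; with
        // NotAfter = 2025-08-24T17:21:41Z and a clock of 2025-12-05 it must fail.
        if now := time.Now(); now.After(cert.NotAfter) || now.Before(cert.NotBefore) {
            fmt.Printf("x509: certificate has expired or is not yet valid: current time %s is after %s\n",
                now.UTC().Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
        } else {
            fmt.Println("certificate is inside its validity window")
        }
    }

Until that certificate is rotated (or the node clock corrected), every status patch will keep failing with the identical x509 error, which is why the same message repeats for each pod below.
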
Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.495218 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-dcr49"] Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.495844 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:42 crc kubenswrapper[4747]: E1205 20:42:42.495930 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.515717 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.537472 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\
":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.548601 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.548651 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.548664 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.548682 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.548693 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:42Z","lastTransitionTime":"2025-12-05T20:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.550286 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z"
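[Editor's note] Every "Failed to update status for pod" record in this capture fails the same way: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serves a certificate whose validity ended 2025-08-24T17:21:41Z while the node clock reads 2025-12-05T20:42:42Z, so each status patch is rejected before it reaches the API object. A minimal Go sketch of how to confirm the serving certificate's window (a hypothetical triage helper, not part of the cluster tooling; the address is taken from the log, and InsecureSkipVerify is used only so the already-invalid chain can be inspected):

    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
        "time"
    )

    func main() {
        // Dial the webhook endpoint from the log; skip verification so we
        // can read the certificate that normal verification rejects.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()
        now := time.Now()
        for _, cert := range conn.ConnectionState().PeerCertificates {
            // now.After(NotAfter) is the condition behind "certificate has
            // expired or is not yet valid" in the records above.
            fmt.Printf("subject=%s notBefore=%s notAfter=%s expired=%v\n",
                cert.Subject, cert.NotBefore.Format(time.RFC3339),
                cert.NotAfter.Format(time.RFC3339), now.After(cert.NotAfter))
        }
    }

Until that certificate is rotated (or the node clock corrected), the kubelet will keep logging the same x509 failure for every pod status it tries to patch.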
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.600879 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j44xn\" (UniqueName: \"kubernetes.io/projected/1e860ee9-69f5-44a1-b414-deab4f78dd0d-kube-api-access-j44xn\") pod \"network-metrics-daemon-dcr49\" (UID: \"1e860ee9-69f5-44a1-b414-deab4f78dd0d\") " pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.601000 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs\") pod \"network-metrics-daemon-dcr49\" (UID: \"1e860ee9-69f5-44a1-b414-deab4f78dd0d\") " pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.604178 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.605450 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.620443 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.639017 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.651957 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.652028 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.652052 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.652081 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.652105 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:42Z","lastTransitionTime":"2025-12-05T20:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.661451 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.684755 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.699320 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.701810 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j44xn\" (UniqueName: \"kubernetes.io/projected/1e860ee9-69f5-44a1-b414-deab4f78dd0d-kube-api-access-j44xn\") pod \"network-metrics-daemon-dcr49\" (UID: \"1e860ee9-69f5-44a1-b414-deab4f78dd0d\") " pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.701904 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs\") pod \"network-metrics-daemon-dcr49\" (UID: \"1e860ee9-69f5-44a1-b414-deab4f78dd0d\") " pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:42 crc kubenswrapper[4747]: E1205 20:42:42.702054 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:42:42 crc kubenswrapper[4747]: E1205 20:42:42.702140 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs podName:1e860ee9-69f5-44a1-b414-deab4f78dd0d nodeName:}" failed. No retries permitted until 2025-12-05 20:42:43.20211863 +0000 UTC m=+33.669426128 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs") pod "network-metrics-daemon-dcr49" (UID: "1e860ee9-69f5-44a1-b414-deab4f78dd0d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.720739 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"contai
nerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:39Z\\\",\\\"message\\\":\\\"place-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:42:38.929300 6136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z]\\\\nI1205 
20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.726515 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j44xn\" (UniqueName: \"kubernetes.io/projected/1e860ee9-69f5-44a1-b414-deab4f78dd0d-kube-api-access-j44xn\") pod \"network-metrics-daemon-dcr49\" (UID: \"1e860ee9-69f5-44a1-b414-deab4f78dd0d\") " pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.730700 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.740199 4747 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.751363 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 
20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.754786 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.754844 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.754862 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.754883 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.754900 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:42Z","lastTransitionTime":"2025-12-05T20:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.763436 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 
20:42:42.773649 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.786066 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.801127 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.818163 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.833670 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.838872 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:42 crc kubenswrapper[4747]: E1205 20:42:42.839030 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.845850 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.858634 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.858676 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.858692 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.858711 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.858724 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:42Z","lastTransitionTime":"2025-12-05T20:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.868318 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2
f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.881946 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.895914 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.906769 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.920507 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.931666 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.943216 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.961535 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.961607 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.961625 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.961645 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.961661 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:42Z","lastTransitionTime":"2025-12-05T20:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.965652 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2ad
c9dbbf4fddf963262f18459e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:39Z\\\",\\\"message\\\":\\\"place-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:42:38.929300 6136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z]\\\\nI1205 20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.978303 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.988431 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:42 crc kubenswrapper[4747]: I1205 20:42:42.999401 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:42Z is after 2025-08-24T17:21:41Z" Dec 05 
20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.010465 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:43Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.019872 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:43Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.064109 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.064143 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.064155 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.064172 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.064186 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:43Z","lastTransitionTime":"2025-12-05T20:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.167427 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.167489 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.167501 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.167521 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.167533 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:43Z","lastTransitionTime":"2025-12-05T20:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.206820 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs\") pod \"network-metrics-daemon-dcr49\" (UID: \"1e860ee9-69f5-44a1-b414-deab4f78dd0d\") " pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.207023 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.207132 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs podName:1e860ee9-69f5-44a1-b414-deab4f78dd0d nodeName:}" failed. No retries permitted until 2025-12-05 20:42:44.207103899 +0000 UTC m=+34.674411427 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs") pod "network-metrics-daemon-dcr49" (UID: "1e860ee9-69f5-44a1-b414-deab4f78dd0d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.270516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.270607 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.270626 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.270651 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.270673 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:43Z","lastTransitionTime":"2025-12-05T20:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.373515 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.373561 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.373573 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.373656 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.373681 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:43Z","lastTransitionTime":"2025-12-05T20:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.476737 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.476844 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.476868 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.476915 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.476947 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:43Z","lastTransitionTime":"2025-12-05T20:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.580078 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.580124 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.580135 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.580155 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.580167 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:43Z","lastTransitionTime":"2025-12-05T20:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.612012 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.612333 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.612473 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:59.612438059 +0000 UTC m=+50.079745587 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.684893 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.684995 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.685023 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.685054 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.685074 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:43Z","lastTransitionTime":"2025-12-05T20:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.713257 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.713446 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.713512 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:42:59.713476928 +0000 UTC m=+50.180784456 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.713671 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.713705 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.713727 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.713793 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:59.713774774 +0000 UTC m=+50.181082292 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.713717 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.713828 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.713852 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.713863 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.713870 4747 projected.go:194] Error preparing data for projected 
volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.713939 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.713993 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:59.713968618 +0000 UTC m=+50.181276136 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.714060 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:42:59.7140482 +0000 UTC m=+50.181355718 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.788000 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.788042 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.788052 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.788068 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.788081 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:43Z","lastTransitionTime":"2025-12-05T20:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.839278 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.839406 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.839419 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.839645 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.839740 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:42:43 crc kubenswrapper[4747]: E1205 20:42:43.839913 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.891156 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.891238 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.891264 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.891297 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.891332 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:43Z","lastTransitionTime":"2025-12-05T20:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.994523 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.994649 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.994675 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.994704 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:43 crc kubenswrapper[4747]: I1205 20:42:43.994727 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:43Z","lastTransitionTime":"2025-12-05T20:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.097275 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.097330 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.097347 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.097374 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.097393 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:44Z","lastTransitionTime":"2025-12-05T20:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.199877 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.199942 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.199960 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.199989 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.200009 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:44Z","lastTransitionTime":"2025-12-05T20:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.219438 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs\") pod \"network-metrics-daemon-dcr49\" (UID: \"1e860ee9-69f5-44a1-b414-deab4f78dd0d\") " pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:44 crc kubenswrapper[4747]: E1205 20:42:44.219634 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:42:44 crc kubenswrapper[4747]: E1205 20:42:44.219709 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs podName:1e860ee9-69f5-44a1-b414-deab4f78dd0d nodeName:}" failed. No retries permitted until 2025-12-05 20:42:46.219686573 +0000 UTC m=+36.686994091 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs") pod "network-metrics-daemon-dcr49" (UID: "1e860ee9-69f5-44a1-b414-deab4f78dd0d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.303300 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.303366 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.303390 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.303421 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.303462 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:44Z","lastTransitionTime":"2025-12-05T20:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.406814 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.406884 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.406902 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.406932 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.406951 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:44Z","lastTransitionTime":"2025-12-05T20:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.510734 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.510809 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.510837 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.510886 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.510910 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:44Z","lastTransitionTime":"2025-12-05T20:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.614225 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.614286 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.614310 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.614341 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.614363 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:44Z","lastTransitionTime":"2025-12-05T20:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.717404 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.717477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.717496 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.717522 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.717541 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:44Z","lastTransitionTime":"2025-12-05T20:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.819942 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.820004 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.820027 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.820055 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.820077 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:44Z","lastTransitionTime":"2025-12-05T20:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.839654 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:44 crc kubenswrapper[4747]: E1205 20:42:44.839839 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.924127 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.924199 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.924216 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.924243 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:44 crc kubenswrapper[4747]: I1205 20:42:44.924260 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:44Z","lastTransitionTime":"2025-12-05T20:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.027378 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.027449 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.027468 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.027494 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.027515 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:45Z","lastTransitionTime":"2025-12-05T20:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.130855 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.130917 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.130941 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.130968 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.130983 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:45Z","lastTransitionTime":"2025-12-05T20:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.233718 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.233786 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.233805 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.233831 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.233851 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:45Z","lastTransitionTime":"2025-12-05T20:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.337149 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.337229 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.337254 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.337283 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.337302 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:45Z","lastTransitionTime":"2025-12-05T20:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.436412 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.437854 4747 scope.go:117] "RemoveContainer" containerID="d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e" Dec 05 20:42:45 crc kubenswrapper[4747]: E1205 20:42:45.438163 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.440640 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.440691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.440713 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.440735 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.440754 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:45Z","lastTransitionTime":"2025-12-05T20:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.543287 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.543363 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.543387 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.543418 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.543442 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:45Z","lastTransitionTime":"2025-12-05T20:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.646456 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.646522 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.646543 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.646571 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.646626 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:45Z","lastTransitionTime":"2025-12-05T20:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.750148 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.750197 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.750212 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.750231 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.750243 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:45Z","lastTransitionTime":"2025-12-05T20:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.838881 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.838934 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:45 crc kubenswrapper[4747]: E1205 20:42:45.839033 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.839117 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:45 crc kubenswrapper[4747]: E1205 20:42:45.839125 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:42:45 crc kubenswrapper[4747]: E1205 20:42:45.839233 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.853060 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.853123 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.853142 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.853167 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.853184 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:45Z","lastTransitionTime":"2025-12-05T20:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.956602 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.956689 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.956707 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.956756 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:45 crc kubenswrapper[4747]: I1205 20:42:45.956773 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:45Z","lastTransitionTime":"2025-12-05T20:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.058857 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.058919 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.058937 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.058962 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.058980 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:46Z","lastTransitionTime":"2025-12-05T20:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.161942 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.162048 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.162067 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.162091 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.162103 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:46Z","lastTransitionTime":"2025-12-05T20:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.242432 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs\") pod \"network-metrics-daemon-dcr49\" (UID: \"1e860ee9-69f5-44a1-b414-deab4f78dd0d\") " pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:46 crc kubenswrapper[4747]: E1205 20:42:46.242680 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:42:46 crc kubenswrapper[4747]: E1205 20:42:46.242795 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs podName:1e860ee9-69f5-44a1-b414-deab4f78dd0d nodeName:}" failed. No retries permitted until 2025-12-05 20:42:50.242767645 +0000 UTC m=+40.710075143 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs") pod "network-metrics-daemon-dcr49" (UID: "1e860ee9-69f5-44a1-b414-deab4f78dd0d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.265137 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.265196 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.265209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.265226 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.265238 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:46Z","lastTransitionTime":"2025-12-05T20:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.367194 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.367242 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.367250 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.367265 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.367275 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:46Z","lastTransitionTime":"2025-12-05T20:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.470761 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.470803 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.470812 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.470826 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.470835 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:46Z","lastTransitionTime":"2025-12-05T20:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.574812 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.574883 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.574904 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.574926 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.574937 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:46Z","lastTransitionTime":"2025-12-05T20:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.678997 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.679093 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.679107 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.679131 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.679146 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:46Z","lastTransitionTime":"2025-12-05T20:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.787886 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.788072 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.788093 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.788138 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.788151 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:46Z","lastTransitionTime":"2025-12-05T20:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.838874 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:46 crc kubenswrapper[4747]: E1205 20:42:46.839030 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.892428 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.892502 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.892520 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.892554 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.892572 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:46Z","lastTransitionTime":"2025-12-05T20:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.995510 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.995565 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.995612 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.995637 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:46 crc kubenswrapper[4747]: I1205 20:42:46.995662 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:46Z","lastTransitionTime":"2025-12-05T20:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.099379 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.099444 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.099466 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.099494 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.099514 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:47Z","lastTransitionTime":"2025-12-05T20:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.203181 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.203556 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.203732 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.203867 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.204012 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:47Z","lastTransitionTime":"2025-12-05T20:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.307230 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.307312 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.307338 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.307389 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.307421 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:47Z","lastTransitionTime":"2025-12-05T20:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.411115 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.411470 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.411641 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.411809 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.411949 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:47Z","lastTransitionTime":"2025-12-05T20:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.515270 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.515716 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.515907 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.516153 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.516342 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:47Z","lastTransitionTime":"2025-12-05T20:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.619930 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.620014 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.620036 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.620063 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.620082 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:47Z","lastTransitionTime":"2025-12-05T20:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.723232 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.723311 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.723333 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.723358 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.723374 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:47Z","lastTransitionTime":"2025-12-05T20:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.826412 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.826469 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.826488 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.826511 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.826527 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:47Z","lastTransitionTime":"2025-12-05T20:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.838825 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.838937 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:47 crc kubenswrapper[4747]: E1205 20:42:47.839009 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.839047 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:47 crc kubenswrapper[4747]: E1205 20:42:47.839150 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:42:47 crc kubenswrapper[4747]: E1205 20:42:47.839270 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.929519 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.930429 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.930614 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.930767 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:47 crc kubenswrapper[4747]: I1205 20:42:47.930904 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:47Z","lastTransitionTime":"2025-12-05T20:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.033205 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.033269 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.033286 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.033310 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.033328 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:48Z","lastTransitionTime":"2025-12-05T20:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.136611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.137499 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.137617 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.137764 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.137862 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:48Z","lastTransitionTime":"2025-12-05T20:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.241391 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.241467 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.241501 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.241540 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.241562 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:48Z","lastTransitionTime":"2025-12-05T20:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.344633 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.344672 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.344683 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.344698 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.344709 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:48Z","lastTransitionTime":"2025-12-05T20:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.449339 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.449416 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.449441 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.449473 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.449501 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:48Z","lastTransitionTime":"2025-12-05T20:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.553058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.553119 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.553135 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.553158 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.553176 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:48Z","lastTransitionTime":"2025-12-05T20:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.656573 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.656655 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.656678 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.656709 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.656732 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:48Z","lastTransitionTime":"2025-12-05T20:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.759042 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.759172 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.759198 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.759223 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.759250 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:48Z","lastTransitionTime":"2025-12-05T20:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.839167 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:48 crc kubenswrapper[4747]: E1205 20:42:48.839387 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.862217 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.862270 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.862281 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.862304 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.862316 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:48Z","lastTransitionTime":"2025-12-05T20:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.928035 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.928081 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.928095 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.928115 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.928127 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:48Z","lastTransitionTime":"2025-12-05T20:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:48 crc kubenswrapper[4747]: E1205 20:42:48.942422 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:48Z is after 
2025-08-24T17:21:41Z" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.947012 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.947087 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.947107 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.947130 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.947148 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:48Z","lastTransitionTime":"2025-12-05T20:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:48 crc kubenswrapper[4747]: E1205 20:42:48.966690 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:48Z is after 
2025-08-24T17:21:41Z" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.972420 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.972461 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.972472 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.972488 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.972501 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:48Z","lastTransitionTime":"2025-12-05T20:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:48 crc kubenswrapper[4747]: E1205 20:42:48.993875 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:48Z is after 
2025-08-24T17:21:41Z" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.998847 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.998894 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.998906 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.998922 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:48 crc kubenswrapper[4747]: I1205 20:42:48.998933 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:48Z","lastTransitionTime":"2025-12-05T20:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:49 crc kubenswrapper[4747]: E1205 20:42:49.020147 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:49Z is after 
2025-08-24T17:21:41Z" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.025101 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.025133 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.025144 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.025159 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.025177 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:49Z","lastTransitionTime":"2025-12-05T20:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:49 crc kubenswrapper[4747]: E1205 20:42:49.046870 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:49Z is after 
2025-08-24T17:21:41Z" Dec 05 20:42:49 crc kubenswrapper[4747]: E1205 20:42:49.047023 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.049104 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.049141 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.049150 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.049164 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.049175 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:49Z","lastTransitionTime":"2025-12-05T20:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.153117 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.153204 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.153253 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.153278 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.153296 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:49Z","lastTransitionTime":"2025-12-05T20:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.255863 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.255909 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.255921 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.255939 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.255951 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:49Z","lastTransitionTime":"2025-12-05T20:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.358331 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.358386 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.358410 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.358430 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.358443 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:49Z","lastTransitionTime":"2025-12-05T20:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.461298 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.461361 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.461380 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.461408 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.461432 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:49Z","lastTransitionTime":"2025-12-05T20:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.564241 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.564302 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.564319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.564345 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.564363 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:49Z","lastTransitionTime":"2025-12-05T20:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.667885 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.668000 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.668018 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.668043 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.668061 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:49Z","lastTransitionTime":"2025-12-05T20:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.772136 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.772222 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.772248 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.772277 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.772301 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:49Z","lastTransitionTime":"2025-12-05T20:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.838960 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.839003 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.839150 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:49 crc kubenswrapper[4747]: E1205 20:42:49.839196 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:42:49 crc kubenswrapper[4747]: E1205 20:42:49.839351 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:42:49 crc kubenswrapper[4747]: E1205 20:42:49.839638 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.863727 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.875000 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.875091 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.875118 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.875155 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.875181 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:49Z","lastTransitionTime":"2025-12-05T20:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.888198 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.905927 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.940366 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.961607 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.977959 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.977990 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.977999 4747 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.978016 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.978026 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:49Z","lastTransitionTime":"2025-12-05T20:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:49 crc kubenswrapper[4747]: I1205 20:42:49.983475 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:49Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.004970 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.029239 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.048823 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.070476 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.080805 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.080875 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.080893 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.080919 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.080940 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:50Z","lastTransitionTime":"2025-12-05T20:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.095566 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:39Z\\\",\\\"message\\\":\\\"place-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:42:38.929300 6136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z]\\\\nI1205 20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting 
failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.114026 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.130170 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.148834 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:50Z is after 2025-08-24T17:21:41Z" Dec 05 
20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.172292 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.184134 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.184220 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.184239 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.184292 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.184310 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:50Z","lastTransitionTime":"2025-12-05T20:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.189982 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.212533 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.286743 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.286830 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.286848 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.287009 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs\") pod \"network-metrics-daemon-dcr49\" (UID: \"1e860ee9-69f5-44a1-b414-deab4f78dd0d\") " pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:50 crc kubenswrapper[4747]: E1205 20:42:50.287177 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:42:50 crc kubenswrapper[4747]: E1205 20:42:50.287246 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs podName:1e860ee9-69f5-44a1-b414-deab4f78dd0d nodeName:}" failed. No retries permitted until 2025-12-05 20:42:58.287229391 +0000 UTC m=+48.754536879 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs") pod "network-metrics-daemon-dcr49" (UID: "1e860ee9-69f5-44a1-b414-deab4f78dd0d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.287754 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.287815 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:50Z","lastTransitionTime":"2025-12-05T20:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.389872 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.389926 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.389944 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.389967 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.389987 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:50Z","lastTransitionTime":"2025-12-05T20:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.491722 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.491774 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.491785 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.491837 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.491852 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:50Z","lastTransitionTime":"2025-12-05T20:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.594025 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.594095 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.594111 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.594134 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.594152 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:50Z","lastTransitionTime":"2025-12-05T20:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.701073 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.701143 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.701169 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.701199 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.701232 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:50Z","lastTransitionTime":"2025-12-05T20:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.804215 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.804628 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.804777 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.804943 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.805070 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:50Z","lastTransitionTime":"2025-12-05T20:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.838945 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:50 crc kubenswrapper[4747]: E1205 20:42:50.839138 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.906974 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.907007 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.907016 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.907032 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:50 crc kubenswrapper[4747]: I1205 20:42:50.907041 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:50Z","lastTransitionTime":"2025-12-05T20:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.010145 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.010184 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.010193 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.010209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.010219 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:51Z","lastTransitionTime":"2025-12-05T20:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.113549 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.113601 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.113611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.113626 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.113638 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:51Z","lastTransitionTime":"2025-12-05T20:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.216628 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.216701 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.216721 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.216748 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.216766 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:51Z","lastTransitionTime":"2025-12-05T20:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.320050 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.320463 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.320535 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.320694 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.320797 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:51Z","lastTransitionTime":"2025-12-05T20:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.424207 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.424275 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.424293 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.424320 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.424339 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:51Z","lastTransitionTime":"2025-12-05T20:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.528290 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.528351 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.528370 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.528394 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.528415 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:51Z","lastTransitionTime":"2025-12-05T20:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.631641 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.631678 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.631691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.631708 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.631720 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:51Z","lastTransitionTime":"2025-12-05T20:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.735043 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.735084 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.735093 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.735108 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.735119 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:51Z","lastTransitionTime":"2025-12-05T20:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.838461 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.838530 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.838551 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.838642 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.838670 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:51Z","lastTransitionTime":"2025-12-05T20:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.838755 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.838755 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:51 crc kubenswrapper[4747]: E1205 20:42:51.838968 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.839034 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:51 crc kubenswrapper[4747]: E1205 20:42:51.839160 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:42:51 crc kubenswrapper[4747]: E1205 20:42:51.839275 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.942263 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.942328 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.942345 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.942369 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:51 crc kubenswrapper[4747]: I1205 20:42:51.942386 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:51Z","lastTransitionTime":"2025-12-05T20:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.045893 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.045954 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.045977 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.046006 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.046029 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:52Z","lastTransitionTime":"2025-12-05T20:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.149504 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.149617 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.149643 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.149672 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.149694 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:52Z","lastTransitionTime":"2025-12-05T20:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.252900 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.252992 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.253016 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.253050 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.253075 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:52Z","lastTransitionTime":"2025-12-05T20:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.355845 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.355885 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.355896 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.355914 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.355926 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:52Z","lastTransitionTime":"2025-12-05T20:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.458480 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.458545 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.458563 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.458641 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.458667 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:52Z","lastTransitionTime":"2025-12-05T20:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.561760 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.561788 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.561796 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.561808 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.561817 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:52Z","lastTransitionTime":"2025-12-05T20:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.664648 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.664690 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.664705 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.664720 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.664731 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:52Z","lastTransitionTime":"2025-12-05T20:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.766977 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.767012 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.767023 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.767039 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.767051 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:52Z","lastTransitionTime":"2025-12-05T20:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.839489 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:52 crc kubenswrapper[4747]: E1205 20:42:52.839749 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.869471 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.869504 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.869599 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.869617 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:52 crc kubenswrapper[4747]: I1205 20:42:52.869629 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:52Z","lastTransitionTime":"2025-12-05T20:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 20:42:53 crc kubenswrapper[4747]: I1205 20:42:53.839006 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49"
Dec 05 20:42:53 crc kubenswrapper[4747]: E1205 20:42:53.839173 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d"
Dec 05 20:42:53 crc kubenswrapper[4747]: I1205 20:42:53.839660 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:42:53 crc kubenswrapper[4747]: E1205 20:42:53.839734 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:42:53 crc kubenswrapper[4747]: I1205 20:42:53.839892 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:42:53 crc kubenswrapper[4747]: E1205 20:42:53.840077 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
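Every sandbox failure above reduces to the same precondition: the runtime wants at least one CNI config file in /etc/kubernetes/cni/net.d/. A hypothetical re-creation of that check in Python (the function name is illustrative; standard CNI loaders accept .conf, .conflist and .json files):

```python
import glob
import os

# Directory named in the kubelet errors above.
CNI_CONF_DIR = "/etc/kubernetes/cni/net.d/"

def cni_config_present(conf_dir: str = CNI_CONF_DIR) -> bool:
    # The network provider (OVN-Kubernetes on this node) drops a config
    # file here once it is up; until then NetworkReady stays false.
    patterns = ("*.conf", "*.conflist", "*.json")
    return any(glob.glob(os.path.join(conf_dir, p)) for p in patterns)

if not cni_config_present():
    print("NetworkReady=false: no CNI configuration file; "
          "new pod sandboxes will keep failing to sync")
```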
Dec 05 20:42:54 crc kubenswrapper[4747]: I1205 20:42:54.838691 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:42:54 crc kubenswrapper[4747]: E1205 20:42:54.838926 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:42:55 crc kubenswrapper[4747]: I1205 20:42:55.839420 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49"
Dec 05 20:42:55 crc kubenswrapper[4747]: I1205 20:42:55.839473 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:42:55 crc kubenswrapper[4747]: E1205 20:42:55.839744 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d"
Dec 05 20:42:55 crc kubenswrapper[4747]: I1205 20:42:55.839784 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:42:55 crc kubenswrapper[4747]: E1205 20:42:55.839873 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:42:55 crc kubenswrapper[4747]: E1205 20:42:55.840016 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:42:55 crc kubenswrapper[4747]: I1205 20:42:55.868511 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:55 crc kubenswrapper[4747]: I1205 20:42:55.868692 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:55 crc kubenswrapper[4747]: I1205 20:42:55.868723 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:55 crc kubenswrapper[4747]: I1205 20:42:55.868758 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:55 crc kubenswrapper[4747]: I1205 20:42:55.868783 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:55Z","lastTransitionTime":"2025-12-05T20:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:55 crc kubenswrapper[4747]: I1205 20:42:55.971460 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:55 crc kubenswrapper[4747]: I1205 20:42:55.971504 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:55 crc kubenswrapper[4747]: I1205 20:42:55.971512 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:55 crc kubenswrapper[4747]: I1205 20:42:55.971526 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:55 crc kubenswrapper[4747]: I1205 20:42:55.971538 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:55Z","lastTransitionTime":"2025-12-05T20:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.074744 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.074830 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.074855 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.074886 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.074908 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:56Z","lastTransitionTime":"2025-12-05T20:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.177546 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.177658 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.177682 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.177712 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.177736 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:56Z","lastTransitionTime":"2025-12-05T20:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.280414 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.280468 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.280488 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.280512 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.280530 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:56Z","lastTransitionTime":"2025-12-05T20:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.383843 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.383916 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.383935 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.383960 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.383978 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:56Z","lastTransitionTime":"2025-12-05T20:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.489071 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.489133 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.489151 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.489174 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.489194 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:56Z","lastTransitionTime":"2025-12-05T20:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.592802 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.592868 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.592887 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.592917 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.592936 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:56Z","lastTransitionTime":"2025-12-05T20:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.839626 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:42:56 crc kubenswrapper[4747]: E1205 20:42:56.839801 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.840963 4747 scope.go:117] "RemoveContainer" containerID="d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.905113 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.905459 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.905480 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.905507 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:56 crc kubenswrapper[4747]: I1205 20:42:56.905525 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:56Z","lastTransitionTime":"2025-12-05T20:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.009254 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.009326 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.009362 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.009465 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.009511 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:57Z","lastTransitionTime":"2025-12-05T20:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.186898 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovnkube-controller/1.log"
Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.190034 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerStarted","Data":"4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e"}
Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.190578 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd"
Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.211612 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z"
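The patch itself is well-formed; it is rejected because the webhook's serving certificate expired months before the node's clock. The failing step is the ordinary x509 validity-window comparison, sketched here with the timestamps taken from the error above:

```python
from datetime import datetime, timezone

# Values taken from the webhook failure above.
not_after = datetime(2025, 8, 24, 17, 21, 41, tzinfo=timezone.utc)  # cert NotAfter
now = datetime(2025, 12, 5, 20, 42, 57, tzinfo=timezone.utc)        # node clock

if now > not_after:
    # Same outcome as the TLS handshake to https://127.0.0.1:9743: every
    # status PATCH bounces until the certificate is rotated.
    print(f"x509: certificate has expired: current time "
          f"{now:%Y-%m-%dT%H:%M:%SZ} is after {not_after:%Y-%m-%dT%H:%M:%SZ}")
```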
Has your network provider started?"} Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.229652 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.245911 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.270142 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.288703 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.317297 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.317564 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.317617 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.317630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.317646 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.317657 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:57Z","lastTransitionTime":"2025-12-05T20:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.333450 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.357902 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.373415 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.386606 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.401767 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.417133 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.419790 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.419838 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.419847 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.419876 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.419888 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:57Z","lastTransitionTime":"2025-12-05T20:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.434314 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff
01da2526bfd84fbabcc7dd4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:39Z\\\",\\\"message\\\":\\\"place-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:42:38.929300 6136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z]\\\\nI1205 
20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.445139 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.456862 4747 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.468723 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 
20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.481806 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.522433 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.522512 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.522525 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.522547 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.522562 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:57Z","lastTransitionTime":"2025-12-05T20:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.624652 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.624718 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.624742 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.624771 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.624793 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:57Z","lastTransitionTime":"2025-12-05T20:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.727537 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.727628 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.727646 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.727666 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.727680 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:57Z","lastTransitionTime":"2025-12-05T20:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.830242 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.830295 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.830305 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.830321 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.830333 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:57Z","lastTransitionTime":"2025-12-05T20:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.839756 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.839770 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:57 crc kubenswrapper[4747]: E1205 20:42:57.839970 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.839775 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:57 crc kubenswrapper[4747]: E1205 20:42:57.840191 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:42:57 crc kubenswrapper[4747]: E1205 20:42:57.840220 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.932956 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.933020 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.933040 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.933065 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:57 crc kubenswrapper[4747]: I1205 20:42:57.933083 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:57Z","lastTransitionTime":"2025-12-05T20:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.037036 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.037084 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.037098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.037115 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.037130 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:58Z","lastTransitionTime":"2025-12-05T20:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.140635 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.140713 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.140740 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.140775 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.140799 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:58Z","lastTransitionTime":"2025-12-05T20:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.197047 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovnkube-controller/2.log" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.198416 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovnkube-controller/1.log" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.201528 4747 generic.go:334] "Generic (PLEG): container finished" podID="4881e707-c00c-4e49-8e30-a17719e80915" containerID="4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e" exitCode=1 Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.201573 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerDied","Data":"4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e"} Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.201645 4747 scope.go:117] "RemoveContainer" containerID="d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.202468 4747 scope.go:117] "RemoveContainer" containerID="4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e" Dec 05 20:42:58 crc kubenswrapper[4747]: E1205 20:42:58.202702 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.221137 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.233122 4747 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.243189 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.243246 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.243266 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.243288 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.243307 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:58Z","lastTransitionTime":"2025-12-05T20:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.244474 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.265407 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.280936 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.296770 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.329036 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad
aafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d519962a220124562427fde1594d25407b91b2adc9dbbf4fddf963262f18459e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:39Z\\\",\\\"message\\\":\\\"place-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 20:42:38.929300 6136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:38Z is after 2025-08-24T17:21:41Z]\\\\nI1205 
20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:57Z\\\",\\\"message\\\":\\\":mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750337 6357 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750937 6357 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.751014 6357 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-7lblw\\\\nI1205 
20:42:57.7510\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.347047 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.347124 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.347141 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.347168 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.347186 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:58Z","lastTransitionTime":"2025-12-05T20:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.358945 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.369118 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs\") pod \"network-metrics-daemon-dcr49\" (UID: \"1e860ee9-69f5-44a1-b414-deab4f78dd0d\") " pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:58 crc kubenswrapper[4747]: E1205 20:42:58.369325 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:42:58 crc kubenswrapper[4747]: E1205 20:42:58.369434 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs podName:1e860ee9-69f5-44a1-b414-deab4f78dd0d nodeName:}" failed. No retries permitted until 2025-12-05 20:43:14.369404457 +0000 UTC m=+64.836711985 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs") pod "network-metrics-daemon-dcr49" (UID: "1e860ee9-69f5-44a1-b414-deab4f78dd0d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.376701 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.410826 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.431244 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.450958 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.451334 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.451667 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.451688 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.451714 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.451733 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:58Z","lastTransitionTime":"2025-12-05T20:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.471353 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.493838 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.515397 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.536650 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.554356 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.554407 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.554424 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.554449 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.554467 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:58Z","lastTransitionTime":"2025-12-05T20:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.556946 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:58Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.657280 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.657342 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.657370 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.657402 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.657505 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:58Z","lastTransitionTime":"2025-12-05T20:42:58Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.760817 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.760875 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.760892 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.760916 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.760934 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:58Z","lastTransitionTime":"2025-12-05T20:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.839815 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:58 crc kubenswrapper[4747]: E1205 20:42:58.840028 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.863163 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.863206 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.863216 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.863233 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.863245 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:58Z","lastTransitionTime":"2025-12-05T20:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.966035 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.966078 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.966089 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.966102 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:58 crc kubenswrapper[4747]: I1205 20:42:58.966111 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:58Z","lastTransitionTime":"2025-12-05T20:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.069928 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.069984 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.070000 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.070026 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.070044 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:59Z","lastTransitionTime":"2025-12-05T20:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.173396 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.174375 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.174397 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.174420 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.174466 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:59Z","lastTransitionTime":"2025-12-05T20:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.206826 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovnkube-controller/2.log" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.210378 4747 scope.go:117] "RemoveContainer" containerID="4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e" Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.210510 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.226216 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.245901 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.265812 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.277699 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.277777 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.277802 4747 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.277837 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.277860 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:59Z","lastTransitionTime":"2025-12-05T20:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.287602 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.302500 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.304300 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.304573 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.304633 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.304665 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.304688 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:59Z","lastTransitionTime":"2025-12-05T20:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.320637 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.327008 4747 kubelet_node_status.go:585] "Error updating node status, will retry" 
err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329b
a568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.337241 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.337307 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.337330 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.337359 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.337381 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:59Z","lastTransitionTime":"2025-12-05T20:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.339326 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.355222 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.362797 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.362869 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.362889 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.362915 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.362934 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:59Z","lastTransitionTime":"2025-12-05T20:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.364861 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.383351 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"7
8337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.388319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.388367 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.388384 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.388408 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.388425 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:59Z","lastTransitionTime":"2025-12-05T20:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.389972 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.406219 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.407169 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.412059 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.412098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.412111 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.412128 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.412140 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:59Z","lastTransitionTime":"2025-12-05T20:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.422361 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.428947 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient 
memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\
\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\
":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.429112 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.430990 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.431025 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.431037 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.431058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.431074 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:59Z","lastTransitionTime":"2025-12-05T20:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.438788 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.455687 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.465704 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.476477 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.490865 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad
aafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:57Z\\\",\\\"message\\\":\\\":mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750337 6357 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750937 6357 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.751014 6357 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-machine-config-operator/machine-config-daemon-7lblw\\\\nI1205 20:42:57.7510\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.501138 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.533783 4747 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.533829 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.533838 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.533851 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.533861 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:59Z","lastTransitionTime":"2025-12-05T20:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.636059 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.636125 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.636148 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.636178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.636211 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:59Z","lastTransitionTime":"2025-12-05T20:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.683381 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.683610 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.683743 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:43:31.683716596 +0000 UTC m=+82.151024114 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.738356 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.738425 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.738447 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.738470 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.738488 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:59Z","lastTransitionTime":"2025-12-05T20:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.784527 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.784741 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.784801 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:43:31.784761755 +0000 UTC m=+82.252069273 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.784885 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.784902 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.784930 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.784948 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.784963 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.785013 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:43:31.78499158 +0000 UTC m=+82.252299098 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.785062 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.785187 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-05 20:43:31.785157764 +0000 UTC m=+82.252465372 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.785392 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.785433 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.785456 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.785540 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:43:31.785522322 +0000 UTC m=+82.252829840 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.839713 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.839960 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.839993 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.840498 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.840783 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:42:59 crc kubenswrapper[4747]: E1205 20:42:59.840991 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.842304 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.842357 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.842419 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.842452 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.842474 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:59Z","lastTransitionTime":"2025-12-05T20:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.864373 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.878649 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.897696 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.930193 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad
aafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:57Z\\\",\\\"message\\\":\\\":mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750337 6357 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750937 6357 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.751014 6357 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-machine-config-operator/machine-config-daemon-7lblw\\\\nI1205 20:42:57.7510\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.944885 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.944958 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.944974 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.945023 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.945041 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:42:59Z","lastTransitionTime":"2025-12-05T20:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.946396 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.966053 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:42:59 crc kubenswrapper[4747]: I1205 20:42:59.984785 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:42:59Z is after 2025-08-24T17:21:41Z" Dec 05 
20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.025165 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.041074 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:00Z is after 2025-08-24T17:21:41Z"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.048915 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.048963 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.048980 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.049003 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.049020 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:00Z","lastTransitionTime":"2025-12-05T20:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
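Every "Failed to update status for pod" entry in this stretch fails for the same terminal reason: the API server cannot complete its admission call to the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743/pod, because the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-05T20:43:00Z. The kubelet's patches are well-formed; they are rejected at the TLS handshake inside the API server's webhook client, and they will keep being rejected until the certificate is rotated. The Go sketch below is a standalone diagnostic probe, assuming only that the webhook is still listening on 127.0.0.1:9743 as the log shows; it is not how any OpenShift component talks to the webhook.

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

// Diagnostic sketch: dial the webhook endpoint named in the log and print
// its serving certificate's validity window. InsecureSkipVerify is deliberate
// and acceptable here only because the whole point is to read an expired
// certificate that normal verification (correctly) rejects.
func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
	fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
	if time.Now().After(cert.NotAfter) {
		fmt.Println("expired: webhook calls will fail exactly as logged")
	}
}

The webhook's own pod, network-node-identity-vrzqb, reports both of its containers Running in the status patch below, so the endpoint is up; only the certificate's validity window is at fault. In a CRC instance this pattern usually means the VM was started long after the bundled certificates lapsed, and admission-gated writes fail until internal certificate rotation catches up.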
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.055173 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:00Z is after 2025-08-24T17:21:41Z"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.070849 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.095107 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.109779 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.132910 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.150956 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.152615 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.152789 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.152836 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.152876 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.152897 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:00Z","lastTransitionTime":"2025-12-05T20:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.171501 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.192316 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.255131 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.255230 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.255254 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.255285 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.255310 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:00Z","lastTransitionTime":"2025-12-05T20:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.358049 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.358098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.358115 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.358139 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.358156 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:00Z","lastTransitionTime":"2025-12-05T20:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.461245 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.461334 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.461362 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.461397 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.461424 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:00Z","lastTransitionTime":"2025-12-05T20:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.565434 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.565497 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.565516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.565541 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.565558 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:00Z","lastTransitionTime":"2025-12-05T20:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.668831 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.668898 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.668921 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.668949 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.668967 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:00Z","lastTransitionTime":"2025-12-05T20:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.772578 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.772666 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.772683 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.772710 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.772733 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:00Z","lastTransitionTime":"2025-12-05T20:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.839577 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:00 crc kubenswrapper[4747]: E1205 20:43:00.839858 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.876144 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.876182 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.876192 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.876209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.876221 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:00Z","lastTransitionTime":"2025-12-05T20:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.978453 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.978530 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.978553 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.978637 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:00 crc kubenswrapper[4747]: I1205 20:43:00.978662 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:00Z","lastTransitionTime":"2025-12-05T20:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.081554 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.081610 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.081621 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.081655 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.081667 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:01Z","lastTransitionTime":"2025-12-05T20:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.184202 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.184249 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.184259 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.184276 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.184287 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:01Z","lastTransitionTime":"2025-12-05T20:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.287629 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.287696 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.287713 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.287738 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.287757 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:01Z","lastTransitionTime":"2025-12-05T20:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
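From 20:43:00 onward the capture settles into a five-line heartbeat repeated roughly every 100 ms: four "Recording event message for node" entries (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady) followed by the setters.go "Node became not ready" condition, with only the timestamps changing. When triaging a capture like this, folding that repetition makes the two real faults stand out. A small sketch that masks the volatile fields and counts distinct messages, assuming one journal entry per input line:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Log-folding sketch. The patterns below are assumptions fitted to the
// entries in this capture (journald timestamp, klog header, RFC 3339
// stamps), not a general journald parser.
var volatile = []*regexp.Regexp{
	regexp.MustCompile(`^\w{3} \d{2} \d{2}:\d{2}:\d{2} `),      // "Dec 05 20:43:01 "
	regexp.MustCompile(`[IWEF]\d{4} \d{2}:\d{2}:\d{2}\.\d+`),   // "I1205 20:43:01.081554"
	regexp.MustCompile(`\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z`), // condition timestamps
}

func mask(line string) string {
	for _, re := range volatile {
		line = re.ReplaceAllString(line, "*")
	}
	return line
}

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1<<20), 1<<20) // some entries here run to several KB
	counts := map[string]int{}
	var order []string
	for sc.Scan() {
		m := mask(sc.Text())
		if counts[m] == 0 {
			order = append(order, m)
		}
		counts[m]++
	}
	for _, m := range order {
		n := counts[m]
		if len(m) > 140 {
			m = m[:140] + "..."
		}
		fmt.Printf("%5dx %s\n", n, m)
	}
}

Fed this section with one entry per line, the dozens of heartbeat entries collapse to five counted lines, leaving the webhook and CNI errors as the visible residue.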
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.390816 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.390891 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.390909 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.390932 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.390951 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:01Z","lastTransitionTime":"2025-12-05T20:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.494317 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.494397 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.494418 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.494483 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.494503 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:01Z","lastTransitionTime":"2025-12-05T20:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.597483 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.597635 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.597669 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.597703 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.597729 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:01Z","lastTransitionTime":"2025-12-05T20:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.700450 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.700515 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.700537 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.700566 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.700618 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:01Z","lastTransitionTime":"2025-12-05T20:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.804046 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.804119 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.804143 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.804173 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.804197 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:01Z","lastTransitionTime":"2025-12-05T20:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.839006 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.839090 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.839137 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:01 crc kubenswrapper[4747]: E1205 20:43:01.839179 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:01 crc kubenswrapper[4747]: E1205 20:43:01.839296 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:01 crc kubenswrapper[4747]: E1205 20:43:01.839422 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.907229 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.907328 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.907348 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.907373 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:01 crc kubenswrapper[4747]: I1205 20:43:01.907394 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:01Z","lastTransitionTime":"2025-12-05T20:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:02 crc kubenswrapper[4747]: I1205 20:43:02.010293 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:02 crc kubenswrapper[4747]: I1205 20:43:02.010366 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:02 crc kubenswrapper[4747]: I1205 20:43:02.010394 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:02 crc kubenswrapper[4747]: I1205 20:43:02.010424 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:02 crc kubenswrapper[4747]: I1205 20:43:02.010448 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:02Z","lastTransitionTime":"2025-12-05T20:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[identical node-status blocks repeat at 20:43:02.113, 20:43:02.216, 20:43:02.320, 20:43:02.423, 20:43:02.525 and 20:43:02.629]
[identical node-status blocks repeat at 20:43:02.732 and 20:43:02.835]
Dec 05 20:43:02 crc kubenswrapper[4747]: I1205 20:43:02.839105 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:43:02 crc kubenswrapper[4747]: E1205 20:43:02.839262 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[node-status blocks repeat at 20:43:02.938, 20:43:03.040 and 20:43:03.143, with lastHeartbeatTime and lastTransitionTime advancing to 2025-12-05T20:43:03Z]
Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.231852 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.241110 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
[another node-status block follows at 20:43:03.245]
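[editor's note: the repeated NodeNotReady condition above turns on a single predicate, whether any CNI configuration file exists under /etc/kubernetes/cni/net.d/. A minimal illustrative sketch of a check of that shape follows; the directory is the one named in the log, while the accepted file suffixes are an assumption for illustration, not kubelet's actual implementation.]

    # Illustrative sketch only: approximates the "no CNI configuration file"
    # condition reported in the records above. The directory comes from the
    # log; the suffix list is an assumption, not kubelet's real logic.
    from pathlib import Path

    CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")  # path named in the log

    def cni_config_present(conf_dir: Path = CNI_CONF_DIR) -> bool:
        """Return True once at least one CNI config file exists in conf_dir."""
        if not conf_dir.is_dir():
            return False
        return any(p.is_file() and p.suffix in {".conf", ".conflist", ".json"}
                   for p in conf_dir.iterdir())

    if __name__ == "__main__":
        # While this prints False, the kubelet keeps setting Ready=False
        # and pod sandboxes for networked pods cannot be created.
        print("CNI config present:", cni_config_present())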
Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.254669 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{…}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" [elided status patch: Ready and ContainersReady True since 20:42:28; containers etcd, etcd-metrics, etcd-readyz, etcd-rev and etcdctl Running and ready since 20:42:13; init containers setup, etcd-ensure-env-vars and etcd-resources-copy Completed; podIP 192.168.126.11]
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.280244 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.291949 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.305292 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"start
edAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.1
68.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.320382 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.336367 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.347932 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.347962 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.347971 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.347986 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.347995 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:03Z","lastTransitionTime":"2025-12-05T20:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.348086 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.361940 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.375046 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.389744 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.403075 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.420010 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.442544 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad
aafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:57Z\\\",\\\"message\\\":\\\":mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750337 6357 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750937 6357 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.751014 6357 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-machine-config-operator/machine-config-daemon-7lblw\\\\nI1205 20:42:57.7510\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.451443 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.451506 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.451523 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.451549 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.451566 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:03Z","lastTransitionTime":"2025-12-05T20:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.460146 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.472522 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.486156 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:03Z is after 2025-08-24T17:21:41Z" Dec 05 
20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.554349 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.554397 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.554432 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.554454 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.554466 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:03Z","lastTransitionTime":"2025-12-05T20:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.657035 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.657074 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.657084 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.657100 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.657148 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:03Z","lastTransitionTime":"2025-12-05T20:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.759297 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.759365 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.759377 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.759395 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.759407 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:03Z","lastTransitionTime":"2025-12-05T20:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.839117 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.839250 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:03 crc kubenswrapper[4747]: E1205 20:43:03.839457 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.839507 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:03 crc kubenswrapper[4747]: E1205 20:43:03.839738 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:03 crc kubenswrapper[4747]: E1205 20:43:03.839877 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.862338 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.862423 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.862445 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.862471 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.862490 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:03Z","lastTransitionTime":"2025-12-05T20:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.965932 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.965980 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.965997 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.966188 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:03 crc kubenswrapper[4747]: I1205 20:43:03.966203 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:03Z","lastTransitionTime":"2025-12-05T20:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.068909 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.069009 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.069026 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.069049 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.069066 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:04Z","lastTransitionTime":"2025-12-05T20:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.172167 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.172232 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.172267 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.172317 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.172338 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:04Z","lastTransitionTime":"2025-12-05T20:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.274415 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.274498 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.274531 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.274559 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.274626 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:04Z","lastTransitionTime":"2025-12-05T20:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.378516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.378574 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.378644 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.378692 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.378722 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:04Z","lastTransitionTime":"2025-12-05T20:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.482546 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.482659 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.482690 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.482723 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.482743 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:04Z","lastTransitionTime":"2025-12-05T20:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.585926 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.585980 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.586002 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.586023 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.586040 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:04Z","lastTransitionTime":"2025-12-05T20:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.691037 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.691127 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.691153 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.691187 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.691222 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:04Z","lastTransitionTime":"2025-12-05T20:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.794111 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.794162 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.794175 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.794194 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.794209 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:04Z","lastTransitionTime":"2025-12-05T20:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.839019 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:04 crc kubenswrapper[4747]: E1205 20:43:04.839187 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.898153 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.898215 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.898232 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.898251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:04 crc kubenswrapper[4747]: I1205 20:43:04.898263 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:04Z","lastTransitionTime":"2025-12-05T20:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.001820 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.001881 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.001900 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.001924 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.001941 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:05Z","lastTransitionTime":"2025-12-05T20:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.104173 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.104217 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.104230 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.104247 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.104258 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:05Z","lastTransitionTime":"2025-12-05T20:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.206972 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.207021 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.207033 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.207051 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.207063 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:05Z","lastTransitionTime":"2025-12-05T20:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.309892 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.309955 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.309975 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.310001 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.310018 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:05Z","lastTransitionTime":"2025-12-05T20:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.412772 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.412874 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.412950 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.412986 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.413007 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:05Z","lastTransitionTime":"2025-12-05T20:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.515791 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.515842 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.515857 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.515881 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.515899 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:05Z","lastTransitionTime":"2025-12-05T20:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.618658 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.618747 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.618790 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.618827 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.618853 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:05Z","lastTransitionTime":"2025-12-05T20:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.721671 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.721711 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.721722 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.721740 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.721750 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:05Z","lastTransitionTime":"2025-12-05T20:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.825377 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.825435 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.825454 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.825477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.825496 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:05Z","lastTransitionTime":"2025-12-05T20:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.839471 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.839524 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:05 crc kubenswrapper[4747]: E1205 20:43:05.839704 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:05 crc kubenswrapper[4747]: I1205 20:43:05.839747 4747 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.030744 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.030834 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.030871 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.030905 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.030928 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:06Z","lastTransitionTime":"2025-12-05T20:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.754422 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.754484 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.754503 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.754527 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.754544 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:06Z","lastTransitionTime":"2025-12-05T20:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.839787 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:06 crc kubenswrapper[4747]: E1205 20:43:06.840030 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.858324 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.858397 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.858416 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.858445 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.858464 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:06Z","lastTransitionTime":"2025-12-05T20:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.961516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.961574 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.961598 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.961621 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:06 crc kubenswrapper[4747]: I1205 20:43:06.961634 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:06Z","lastTransitionTime":"2025-12-05T20:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.064150 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.064227 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.064247 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.064282 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.064307 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:07Z","lastTransitionTime":"2025-12-05T20:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.167743 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.167797 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.167812 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.167833 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.167850 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:07Z","lastTransitionTime":"2025-12-05T20:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.270115 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.270183 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.270211 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.270242 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.270264 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:07Z","lastTransitionTime":"2025-12-05T20:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.373121 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.373204 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.373223 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.373248 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.373269 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:07Z","lastTransitionTime":"2025-12-05T20:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.476286 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.476365 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.476390 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.476421 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.476445 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:07Z","lastTransitionTime":"2025-12-05T20:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.579688 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.579758 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.579776 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.579801 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.579865 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:07Z","lastTransitionTime":"2025-12-05T20:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.684368 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.684470 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.684501 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.684630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.684677 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:07Z","lastTransitionTime":"2025-12-05T20:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.788042 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.788137 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.788195 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.788229 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.788300 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:07Z","lastTransitionTime":"2025-12-05T20:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.839188 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.839270 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:07 crc kubenswrapper[4747]: E1205 20:43:07.839447 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.839465 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:07 crc kubenswrapper[4747]: E1205 20:43:07.839567 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:07 crc kubenswrapper[4747]: E1205 20:43:07.839683 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.892386 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.892474 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.892531 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.892565 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.892632 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:07Z","lastTransitionTime":"2025-12-05T20:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.995422 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.995487 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.995508 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.995531 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:07 crc kubenswrapper[4747]: I1205 20:43:07.995548 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:07Z","lastTransitionTime":"2025-12-05T20:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.100128 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.100244 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.100263 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.100327 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.100345 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:08Z","lastTransitionTime":"2025-12-05T20:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.204201 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.204298 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.204349 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.204380 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.204398 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:08Z","lastTransitionTime":"2025-12-05T20:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.307449 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.307648 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.307672 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.307728 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.307748 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:08Z","lastTransitionTime":"2025-12-05T20:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.410213 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.410288 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.410312 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.410341 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.410364 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:08Z","lastTransitionTime":"2025-12-05T20:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.512821 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.512869 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.512884 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.512905 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.512917 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:08Z","lastTransitionTime":"2025-12-05T20:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.615428 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.615473 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.615482 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.615499 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.615508 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:08Z","lastTransitionTime":"2025-12-05T20:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.718785 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.718899 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.718923 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.718953 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.718973 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:08Z","lastTransitionTime":"2025-12-05T20:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.822109 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.822186 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.822209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.822239 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.822262 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:08Z","lastTransitionTime":"2025-12-05T20:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.839784 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:08 crc kubenswrapper[4747]: E1205 20:43:08.839977 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.925105 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.925190 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.925209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.925238 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:08 crc kubenswrapper[4747]: I1205 20:43:08.925260 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:08Z","lastTransitionTime":"2025-12-05T20:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.028731 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.028827 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.028851 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.028875 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.028891 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:09Z","lastTransitionTime":"2025-12-05T20:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.132537 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.132690 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.132714 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.132748 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.132767 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:09Z","lastTransitionTime":"2025-12-05T20:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.235975 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.236056 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.236078 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.236107 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.236128 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:09Z","lastTransitionTime":"2025-12-05T20:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.340551 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.340620 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.340638 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.340660 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.340676 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:09Z","lastTransitionTime":"2025-12-05T20:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.435914 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.435974 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.435990 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.436014 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.436029 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:09Z","lastTransitionTime":"2025-12-05T20:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:09 crc kubenswrapper[4747]: E1205 20:43:09.456718 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.461866 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.461924 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.461942 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.461965 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.461982 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:09Z","lastTransitionTime":"2025-12-05T20:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:09 crc kubenswrapper[4747]: E1205 20:43:09.481782 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.485828 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.485889 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
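[Editor's note, not part of the captured log: every "Error updating node status" entry in this capture fails for the same reason, an expired serving certificate on the network-node-identity webhook at 127.0.0.1:9743 (notAfter 2025-08-24T17:21:41Z, roughly three months before these entries). A minimal sketch for confirming the expiry from the node itself, assuming Python 3 and the third-party "cryptography" package are available there:

import socket
import ssl

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743  # webhook endpoint named in the log entries above

ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE  # fetch the certificate without trusting it

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)  # raw DER bytes of the served cert

cert = x509.load_der_x509_certificate(der)
print("notBefore:", cert.not_valid_before)
print("notAfter: ", cert.not_valid_after)  # the log implies 2025-08-24 17:21:41

End of editor's note.]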
event="NodeHasNoDiskPressure" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.485906 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.485931 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.485951 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:09Z","lastTransitionTime":"2025-12-05T20:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:09 crc kubenswrapper[4747]: E1205 20:43:09.500712 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.506069 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.506112 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.506122 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.506140 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.506150 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:09Z","lastTransitionTime":"2025-12-05T20:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:09 crc kubenswrapper[4747]: E1205 20:43:09.523869 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.528612 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.528919 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.528928 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.528942 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.528953 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:09Z","lastTransitionTime":"2025-12-05T20:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:09 crc kubenswrapper[4747]: E1205 20:43:09.546862 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:09 crc kubenswrapper[4747]: E1205 20:43:09.546987 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.548270 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.548323 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.548337 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.548353 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.548364 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:09Z","lastTransitionTime":"2025-12-05T20:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.651284 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.651361 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.651386 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.651416 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.651438 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:09Z","lastTransitionTime":"2025-12-05T20:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.754679 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.754721 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.754731 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.754749 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.754760 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:09Z","lastTransitionTime":"2025-12-05T20:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.839241 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.839339 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:09 crc kubenswrapper[4747]: E1205 20:43:09.839395 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:09 crc kubenswrapper[4747]: E1205 20:43:09.839526 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.839687 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:09 crc kubenswrapper[4747]: E1205 20:43:09.839816 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.857422 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.857531 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.857555 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.857653 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.857686 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:09Z","lastTransitionTime":"2025-12-05T20:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.862268 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.874373 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.885509 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.897964 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.913369 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"start
edAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.1
68.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.923966 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.937387 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.951294 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.960279 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.960331 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.960340 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.960354 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.960365 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:09Z","lastTransitionTime":"2025-12-05T20:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.967464 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.979112 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:09 crc kubenswrapper[4747]: I1205 20:43:09.994343 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:09Z is after 2025-08-24T17:21:41Z" Dec 05 
20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.009634 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.021855 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.037566 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.058948 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad
aafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:57Z\\\",\\\"message\\\":\\\":mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750337 6357 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750937 6357 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.751014 6357 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-machine-config-operator/machine-config-daemon-7lblw\\\\nI1205 20:42:57.7510\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.062622 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.062682 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.062696 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.062716 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.062729 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:10Z","lastTransitionTime":"2025-12-05T20:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.076141 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.088824 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.100260 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dbfca38-f67c-4fe9-8ddc-91968241cb96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0522976ef766cb6000f9319c27956583b93806739df3751a7ed5c3a41622a342\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f58411b662e4ecb7dc98f386e073bd4a4e64d8aed10843e0db0f98c11f5a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c54373ee85013b735406d77d1ff5cfb1bbe2e2850a11fee14473795893a426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.165246 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.165299 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.165313 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.165335 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.165349 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:10Z","lastTransitionTime":"2025-12-05T20:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[... node-status block repeats at ~100 ms intervals (only timestamps advance): 20:43:10.268, .371, .474, .578, .680, .782 ...]
Dec 05 20:43:10 crc kubenswrapper[4747]: I1205 20:43:10.839068 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:43:10 crc kubenswrapper[4747]: E1205 20:43:10.839255 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[... node-status block repeats at ~100 ms intervals: 20:43:10.885 through 20:43:11.815 ...]
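Every "Node became not ready" entry in this stretch carries the same root cause: no CNI network config exists yet in /etc/kubernetes/cni/net.d/, and it will appear only once the OVN-Kubernetes pods come up and write it. A rough sketch of the readiness check behind that message, assuming libcni-style discovery of *.conf/*.conflist/*.json files in the directory named by the log (an illustration, not the runtime's actual code path):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // directory named in the log message
        var configs []string
        for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
            matches, _ := filepath.Glob(filepath.Join(dir, pattern))
            configs = append(configs, matches...)
        }
        if len(configs) == 0 {
            // This is the condition reported as NetworkReady=false above.
            fmt.Fprintln(os.Stderr, "no CNI configuration file in", dir, "- network plugin not ready")
            os.Exit(1)
        }
        fmt.Println("found CNI configs:", configs)
    }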
Dec 05 20:43:11 crc kubenswrapper[4747]: I1205 20:43:11.839344 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:43:11 crc kubenswrapper[4747]: I1205 20:43:11.839393 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49"
Dec 05 20:43:11 crc kubenswrapper[4747]: E1205 20:43:11.839507 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:43:11 crc kubenswrapper[4747]: I1205 20:43:11.839571 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:43:11 crc kubenswrapper[4747]: E1205 20:43:11.839667 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d"
Dec 05 20:43:11 crc kubenswrapper[4747]: E1205 20:43:11.839771 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[... node-status block repeats at ~100 ms intervals: 20:43:11.918 through 20:43:12.745 ...]
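The "No sandbox for pod can be found" / "Error syncing pod, skipping" pairs all follow from the same condition: the kubelet wants to create a fresh pod sandbox, but sandbox creation needs a ready network plugin, so each sync is aborted and retried on the next pass. An illustrative sketch of that decision (hypothetical names, not kubelet source):

    package main

    import (
        "errors"
        "fmt"
    )

    type podState struct {
        name         string
        sandboxAlive bool
    }

    var errNetworkNotReady = errors.New("network is not ready: no CNI configuration file in /etc/kubernetes/cni/net.d/")

    func syncPod(p podState, networkReady bool) error {
        if !p.sandboxAlive {
            fmt.Printf("No sandbox for pod can be found. Need to start a new one: %s\n", p.name)
            if !networkReady {
                // Sandbox creation would fail without a network; skip and retry later.
                return errNetworkNotReady
            }
            // Otherwise: create the sandbox, then start the pod's containers.
        }
        return nil
    }

    func main() {
        p := podState{name: "openshift-network-diagnostics/network-check-target-xd92c"}
        if err := syncPod(p, false); err != nil {
            fmt.Println("Error syncing pod, skipping:", err)
        }
    }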
Dec 05 20:43:12 crc kubenswrapper[4747]: I1205 20:43:12.839272 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:43:12 crc kubenswrapper[4747]: E1205 20:43:12.839512 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[... node-status block repeats at ~100 ms intervals: 20:43:12.848 through 20:43:13.773 ...]
Dec 05 20:43:13 crc kubenswrapper[4747]: I1205 20:43:13.838765 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:43:13 crc kubenswrapper[4747]: I1205 20:43:13.839025 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:43:13 crc kubenswrapper[4747]: I1205 20:43:13.838874 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49"
Dec 05 20:43:13 crc kubenswrapper[4747]: E1205 20:43:13.839496 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:43:13 crc kubenswrapper[4747]: E1205 20:43:13.839630 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:43:13 crc kubenswrapper[4747]: E1205 20:43:13.839710 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d"
Dec 05 20:43:13 crc kubenswrapper[4747]: I1205 20:43:13.839905 4747 scope.go:117] "RemoveContainer" containerID="4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e"
Dec 05 20:43:13 crc kubenswrapper[4747]: E1205 20:43:13.840146 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915"
[... node-status block repeats at ~100 ms intervals: 20:43:13.875 through 20:43:14.391 ...]
Dec 05 20:43:14 crc kubenswrapper[4747]: I1205 20:43:14.445108 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs\") pod \"network-metrics-daemon-dcr49\" (UID: \"1e860ee9-69f5-44a1-b414-deab4f78dd0d\") " pod="openshift-multus/network-metrics-daemon-dcr49"
Dec 05 20:43:14 crc kubenswrapper[4747]: E1205 20:43:14.445363 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 05 20:43:14 crc kubenswrapper[4747]: E1205 20:43:14.445442 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs podName:1e860ee9-69f5-44a1-b414-deab4f78dd0d nodeName:}" failed. No retries permitted until 2025-12-05 20:43:46.445406287 +0000 UTC m=+96.912713785 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs") pod "network-metrics-daemon-dcr49" (UID: "1e860ee9-69f5-44a1-b414-deab4f78dd0d") : object "openshift-multus"/"metrics-daemon-secret" not registered
[... node-status block repeats at ~100 ms intervals: 20:43:14.494 through 20:43:14.803 ...]
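The metrics-certs mount fails because the metrics-daemon-secret object is not yet registered with the kubelet's object cache, and each failure pushes the next attempt out on an exponential back-off; "durationBeforeRetry 32s" is consistent with a 500 ms base delay doubled on each of six prior failures (500ms x 2^6 = 32s). The base, factor, and cap below are assumptions chosen to match that observation, not values confirmed from kubelet's volume manager:

    package main

    import (
        "fmt"
        "time"
    )

    // durationBeforeRetry sketches an exponential back-off for failed volume
    // operations: 500ms base, doubled per failure, capped (assumed) at ~2m.
    func durationBeforeRetry(failures int) time.Duration {
        const (
            initial = 500 * time.Millisecond
            max     = 2 * time.Minute
        )
        d := initial
        for i := 1; i < failures; i++ {
            d *= 2
            if d > max {
                return max
            }
        }
        return d
    }

    func main() {
        for f := 1; f <= 8; f++ {
            fmt.Printf("failure %d -> wait %s\n", f, durationBeforeRetry(f))
        }
        // failure 7 -> wait 32s, matching the metrics-certs error above
    }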
Dec 05 20:43:14 crc kubenswrapper[4747]: I1205 20:43:14.839708 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:43:14 crc kubenswrapper[4747]: E1205 20:43:14.839928 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[... node-status block repeats at ~100 ms intervals: 20:43:14.906 through 20:43:15.214 ...]
Has your network provider started?"} Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.111071 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.111126 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.111138 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.111156 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.111169 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:15Z","lastTransitionTime":"2025-12-05T20:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.214062 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.214131 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.214144 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.214166 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.214179 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:15Z","lastTransitionTime":"2025-12-05T20:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.316453 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.316902 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.316915 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.316930 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.316939 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:15Z","lastTransitionTime":"2025-12-05T20:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.420053 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.420111 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.420129 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.420151 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.420165 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:15Z","lastTransitionTime":"2025-12-05T20:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.523160 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.523207 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.523217 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.523235 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.523246 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:15Z","lastTransitionTime":"2025-12-05T20:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.625888 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.625926 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.625936 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.625954 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.625965 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:15Z","lastTransitionTime":"2025-12-05T20:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.729384 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.729436 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.729447 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.729466 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.729477 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:15Z","lastTransitionTime":"2025-12-05T20:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.832066 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.832143 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.832168 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.832205 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.832232 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:15Z","lastTransitionTime":"2025-12-05T20:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.839408 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.839460 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.839420 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:15 crc kubenswrapper[4747]: E1205 20:43:15.839658 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:15 crc kubenswrapper[4747]: E1205 20:43:15.839689 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:15 crc kubenswrapper[4747]: E1205 20:43:15.839798 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.935136 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.935176 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.935188 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.935203 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:15 crc kubenswrapper[4747]: I1205 20:43:15.935215 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:15Z","lastTransitionTime":"2025-12-05T20:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.037293 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.037331 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.037370 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.037387 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.037399 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:16Z","lastTransitionTime":"2025-12-05T20:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.139966 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.140006 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.140014 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.140029 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.140040 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:16Z","lastTransitionTime":"2025-12-05T20:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.242280 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.242342 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.242353 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.242370 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.242379 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:16Z","lastTransitionTime":"2025-12-05T20:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.269391 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nm4fd_53f1e522-a732-4821-b7b0-6f1b6670c1d4/kube-multus/0.log" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.269650 4747 generic.go:334] "Generic (PLEG): container finished" podID="53f1e522-a732-4821-b7b0-6f1b6670c1d4" containerID="b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d" exitCode=1 Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.269682 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nm4fd" event={"ID":"53f1e522-a732-4821-b7b0-6f1b6670c1d4","Type":"ContainerDied","Data":"b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d"} Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.270441 4747 scope.go:117] "RemoveContainer" containerID="b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.288605 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.302072 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.312752 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.323836 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.333558 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.344132 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.344165 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.344175 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.344191 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.344204 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:16Z","lastTransitionTime":"2025-12-05T20:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.348418 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.358815 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.369568 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:43:15Z\\\",\\\"message\\\":\\\"2025-12-05T20:42:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5caea8b1-fddc-44f3-baa0-e068adfa916f\\\\n2025-12-05T20:42:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5caea8b1-fddc-44f3-baa0-e068adfa916f to /host/opt/cni/bin/\\\\n2025-12-05T20:42:30Z [verbose] multus-daemon started\\\\n2025-12-05T20:42:30Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:43:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.388386 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:57Z\\\",\\\"message\\\":\\\":mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750337 6357 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750937 6357 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.751014 6357 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-7lblw\\\\nI1205 20:42:57.7510\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.398075 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.408037 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.418077 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dbfca38-f67c-4fe9-8ddc-91968241cb96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0522976ef766cb6000f9319c27956583b93806739df3751a7ed5c3a41622a342\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f58411b662e4ecb7dc98f386e073bd4a4e64d8aed10843e0db0f98c11f5a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c54373ee85013b735406d77d1ff5cfb1bbe2e2850a11fee14473795893a426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.435301 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.447183 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.447223 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.447232 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.447250 4747 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.447261 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:16Z","lastTransitionTime":"2025-12-05T20:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.447789 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799
488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.459033 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.475874 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.490459 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34
Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.500741 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:16Z is after 2025-08-24T17:21:41Z"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.549296 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.549616 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.549751 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.549866 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.549956 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:16Z","lastTransitionTime":"2025-12-05T20:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.653143 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.653191 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.653209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.653231 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.653247 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:16Z","lastTransitionTime":"2025-12-05T20:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.756857 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.756903 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.756919 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.756942 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.756959 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:16Z","lastTransitionTime":"2025-12-05T20:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.839118 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:43:16 crc kubenswrapper[4747]: E1205 20:43:16.839278 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.858962 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.858993 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.859005 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.859019 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.859029 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:16Z","lastTransitionTime":"2025-12-05T20:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.961609 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.961655 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.961666 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.961681 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:16 crc kubenswrapper[4747]: I1205 20:43:16.961691 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:16Z","lastTransitionTime":"2025-12-05T20:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.063730 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.063763 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.063774 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.063792 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.063805 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:17Z","lastTransitionTime":"2025-12-05T20:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.165695 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.165722 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.165734 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.165750 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.165763 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:17Z","lastTransitionTime":"2025-12-05T20:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.267778 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.267849 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.267873 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.267896 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.267912 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:17Z","lastTransitionTime":"2025-12-05T20:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.278467 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nm4fd_53f1e522-a732-4821-b7b0-6f1b6670c1d4/kube-multus/0.log"
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.278548 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nm4fd" event={"ID":"53f1e522-a732-4821-b7b0-6f1b6670c1d4","Type":"ContainerStarted","Data":"a87e5fe95941f4d6b97fdfa08ba6fcbfb2caca21d727ec4777f0e288c9797cd5"}
Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.298846 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff
01da2526bfd84fbabcc7dd4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:57Z\\\",\\\"message\\\":\\\":mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750337 6357 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750937 6357 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.751014 6357 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-7lblw\\\\nI1205 20:42:57.7510\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.311501 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.322410 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.333920 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 
20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.347533 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.358848 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.369704 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.369775 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.369787 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.369803 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.369815 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:17Z","lastTransitionTime":"2025-12-05T20:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.372878 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a87e5fe95941f4d6b97fdfa08ba6fcbfb2caca21d727ec4777f0e288c9797cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:43:15Z\\\",\\\"message\\\":\\\"2025-12-05T20:42:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5caea8b1-fddc-44f3-baa0-e068adfa916f\\\\n2025-12-05T20:42:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5caea8b1-fddc-44f3-baa0-e068adfa916f to /host/opt/cni/bin/\\\\n2025-12-05T20:42:30Z [verbose] multus-daemon started\\\\n2025-12-05T20:42:30Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:43:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:43:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.384743 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dbfca38-f67c-4fe9-8ddc-91968241cb96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0522976ef766cb6000f9319c27956583b93806739df3751a7ed5c3a41622a342\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f58411b662e4ecb7dc98f386e073bd4a4e64d8aed10843e0db0f98c11f5a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c54373ee85013b735406d77d1ff5cfb1bbe2e2850a11fee14473795893a426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.400222 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.416601 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\
\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.427750 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.446078 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.459519 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.471928 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.471962 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.471973 4747 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.471990 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.472010 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:17Z","lastTransitionTime":"2025-12-05T20:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.476188 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.486555 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.504641 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.517040 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.533556 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:17Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.573890 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.573920 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.573928 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.573940 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.573949 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:17Z","lastTransitionTime":"2025-12-05T20:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.676182 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.676226 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.676236 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.676251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.676261 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:17Z","lastTransitionTime":"2025-12-05T20:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.778854 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.778928 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.778950 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.778975 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.778994 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:17Z","lastTransitionTime":"2025-12-05T20:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.839863 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.839890 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:17 crc kubenswrapper[4747]: E1205 20:43:17.840063 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.840060 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:17 crc kubenswrapper[4747]: E1205 20:43:17.840285 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:17 crc kubenswrapper[4747]: E1205 20:43:17.840482 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.881473 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.881516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.881529 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.881601 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.881621 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:17Z","lastTransitionTime":"2025-12-05T20:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.983963 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.984004 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.984013 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.984030 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:17 crc kubenswrapper[4747]: I1205 20:43:17.984039 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:17Z","lastTransitionTime":"2025-12-05T20:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.086622 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.086669 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.086679 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.086695 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.086708 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:18Z","lastTransitionTime":"2025-12-05T20:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.189352 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.189408 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.189421 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.189436 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.189449 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:18Z","lastTransitionTime":"2025-12-05T20:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.291391 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.291426 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.291435 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.291449 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.291459 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:18Z","lastTransitionTime":"2025-12-05T20:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.394087 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.394131 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.394144 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.394163 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.394177 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:18Z","lastTransitionTime":"2025-12-05T20:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.497431 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.497484 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.497545 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.497566 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.497622 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:18Z","lastTransitionTime":"2025-12-05T20:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.601765 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.601831 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.601854 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.601885 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.601923 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:18Z","lastTransitionTime":"2025-12-05T20:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.712445 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.712495 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.712512 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.712533 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.712550 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:18Z","lastTransitionTime":"2025-12-05T20:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.815959 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.816011 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.816023 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.816088 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.816104 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:18Z","lastTransitionTime":"2025-12-05T20:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.839763 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:18 crc kubenswrapper[4747]: E1205 20:43:18.839978 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.918814 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.918877 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.918896 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.918923 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:18 crc kubenswrapper[4747]: I1205 20:43:18.918945 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:18Z","lastTransitionTime":"2025-12-05T20:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.021663 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.021721 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.021739 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.021764 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.021790 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:19Z","lastTransitionTime":"2025-12-05T20:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.124920 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.125010 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.125040 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.125070 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.125092 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:19Z","lastTransitionTime":"2025-12-05T20:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.228075 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.228152 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.228175 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.228204 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.228223 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:19Z","lastTransitionTime":"2025-12-05T20:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.331571 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.331639 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.331650 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.331666 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.331681 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:19Z","lastTransitionTime":"2025-12-05T20:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.434477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.434531 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.434550 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.434573 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.434615 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:19Z","lastTransitionTime":"2025-12-05T20:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.536942 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.536998 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.537015 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.537038 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.537054 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:19Z","lastTransitionTime":"2025-12-05T20:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.639729 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.639794 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.639815 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.639840 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.639858 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:19Z","lastTransitionTime":"2025-12-05T20:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.742460 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.742509 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.742519 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.742537 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.742549 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:19Z","lastTransitionTime":"2025-12-05T20:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.838929 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.839031 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:19 crc kubenswrapper[4747]: E1205 20:43:19.839147 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.839200 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:19 crc kubenswrapper[4747]: E1205 20:43:19.839324 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:19 crc kubenswrapper[4747]: E1205 20:43:19.839425 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.850023 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.850063 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.850075 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.850094 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.850105 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:19Z","lastTransitionTime":"2025-12-05T20:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.855319 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-05T20:43:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.869058 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.887092 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a87e5fe95941f4d6b97fdfa08ba6fcbfb2caca21d727ec4777f0e288c9797cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:43:15Z\\\",\\\"message\\\":\\\"2025-12-05T20:42:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5caea8b1-fddc-44f3-baa0-e068adfa916f\\\\n2025-12-05T20:42:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5caea8b1-fddc-44f3-baa0-e068adfa916f to /host/opt/cni/bin/\\\\n2025-12-05T20:42:30Z [verbose] multus-daemon started\\\\n2025-12-05T20:42:30Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:43:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:43:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.906182 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:57Z\\\",\\\"message\\\":\\\":mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750337 6357 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750937 6357 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.751014 6357 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-7lblw\\\\nI1205 20:42:57.7510\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.922300 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.933507 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.942443 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.942484 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.942497 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.942513 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.942523 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:19Z","lastTransitionTime":"2025-12-05T20:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.948895 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:19 crc kubenswrapper[4747]: E1205 20:43:19.957996 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.961937 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.961977 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.961992 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.962010 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.962022 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:19Z","lastTransitionTime":"2025-12-05T20:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.962875 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dbfca38-f67c-4fe9-8ddc-91968241cb96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0522976ef766cb6000f9319c27956583b93806739df3751a7ed5c3a41622a342\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f58411b662e4ecb7dc98f386e073bd4a4e64d8aed10843e0db0f98c11f5a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c54373ee85013b735406d77d1ff5cfb1bbe2e2850a11fee14473795893a426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:19 crc kubenswrapper[4747]: E1205 20:43:19.978973 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"7
8337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.982654 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.982704 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.982715 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.982731 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.982760 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:19Z","lastTransitionTime":"2025-12-05T20:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:19 crc kubenswrapper[4747]: I1205 20:43:19.989968 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",
\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o
://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:19Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:20 crc kubenswrapper[4747]: E1205 20:43:20.003909 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.005982 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.007659 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.007763 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.007829 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.007905 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 
20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.007976 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:20Z","lastTransitionTime":"2025-12-05T20:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.021149 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:20 crc kubenswrapper[4747]: E1205 20:43:20.025712 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.028970 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.029055 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.029116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.029189 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.029263 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:20Z","lastTransitionTime":"2025-12-05T20:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.038873 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:20 crc kubenswrapper[4747]: E1205 20:43:20.046856 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:20 crc kubenswrapper[4747]: E1205 20:43:20.047182 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.048972 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.049068 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.049131 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.049195 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.049258 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:20Z","lastTransitionTime":"2025-12-05T20:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.060009 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.071594 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.088629 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.119705 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.144179 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.152067 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.152100 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.152109 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.152127 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.152136 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:20Z","lastTransitionTime":"2025-12-05T20:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.165823 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.254473 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.254497 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.254505 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.254518 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.254526 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:20Z","lastTransitionTime":"2025-12-05T20:43:20Z","reason":"KubeletNotReady","message":"container 
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.357074 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.357116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.357125 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.357140 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.357152 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:20Z","lastTransitionTime":"2025-12-05T20:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.459907 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.459969 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.459987 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.460012 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.460030 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:20Z","lastTransitionTime":"2025-12-05T20:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.562573 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.562631 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.562638 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.562653 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.562663 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:20Z","lastTransitionTime":"2025-12-05T20:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.666063 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.666331 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.666420 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.666504 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.666571 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:20Z","lastTransitionTime":"2025-12-05T20:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.769074 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.769441 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.769623 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.769783 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.769918 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:20Z","lastTransitionTime":"2025-12-05T20:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.839661 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:43:20 crc kubenswrapper[4747]: E1205 20:43:20.839799 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.873284 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.873355 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.873371 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.873399 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.873420 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:20Z","lastTransitionTime":"2025-12-05T20:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.976557 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.976987 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.977146 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.977294 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:20 crc kubenswrapper[4747]: I1205 20:43:20.977434 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:20Z","lastTransitionTime":"2025-12-05T20:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.081954 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.082023 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.082044 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.082070 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.082094 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:21Z","lastTransitionTime":"2025-12-05T20:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.184889 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.184965 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.184985 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.185014 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.185032 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:21Z","lastTransitionTime":"2025-12-05T20:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.288423 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.288490 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.288507 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.288536 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.288554 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:21Z","lastTransitionTime":"2025-12-05T20:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.391397 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.391482 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.391499 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.391524 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.391541 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:21Z","lastTransitionTime":"2025-12-05T20:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.495190 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.495302 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.495325 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.495354 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.495377 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:21Z","lastTransitionTime":"2025-12-05T20:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.597811 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.597846 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.597860 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.598098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.598126 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:21Z","lastTransitionTime":"2025-12-05T20:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.700630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.700668 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.700678 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.700695 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.700707 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:21Z","lastTransitionTime":"2025-12-05T20:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.803632 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.803681 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.803693 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.803713 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.803727 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:21Z","lastTransitionTime":"2025-12-05T20:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.839650 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.839761 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:43:21 crc kubenswrapper[4747]: E1205 20:43:21.839822 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.839650 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:43:21 crc kubenswrapper[4747]: E1205 20:43:21.839972 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:43:21 crc kubenswrapper[4747]: E1205 20:43:21.840026 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.906569 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.906677 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.906699 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.906726 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:21 crc kubenswrapper[4747]: I1205 20:43:21.906744 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:21Z","lastTransitionTime":"2025-12-05T20:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.010207 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.010260 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.010274 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.010293 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.010306 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:22Z","lastTransitionTime":"2025-12-05T20:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
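Every "Node became not ready" record in this loop carries the same KubeletNotReady message: the kubelet finds no CNI configuration under /etc/kubernetes/cni/net.d/, so the Ready condition stays False and pod sandboxes cannot be created. On an OpenShift node that file is normally written by the network plugin's own pods once they start, so its absence is a symptom of the stalled network provider rather than an independent fault. A small sketch, a hypothetical helper that mirrors the kubelet's check (the directory path is the one named in the log message):

import json
import os

CNI_DIR = "/etc/kubernetes/cni/net.d"  # directory named in the kubelet message

# The kubelet treats the network as ready once a usable .conf/.conflist/.json
# file appears here; report what is (or is not) present.
names = sorted(os.listdir(CNI_DIR)) if os.path.isdir(CNI_DIR) else []
configs = [n for n in names if n.endswith((".conf", ".conflist", ".json"))]

if not configs:
    print(f"no CNI configuration file in {CNI_DIR} -- NetworkReady stays false")
for name in configs:
    with open(os.path.join(CNI_DIR, name)) as f:
        doc = json.load(f)
    print(f"{name}: name={doc.get('name')} cniVersion={doc.get('cniVersion')}")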
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.113720 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.113788 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.113812 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.113843 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.113867 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:22Z","lastTransitionTime":"2025-12-05T20:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.217262 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.217318 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.217335 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.217360 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.217380 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:22Z","lastTransitionTime":"2025-12-05T20:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.320718 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.320792 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.320810 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.320837 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.320854 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:22Z","lastTransitionTime":"2025-12-05T20:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.424073 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.424127 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.424144 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.424168 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.424186 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:22Z","lastTransitionTime":"2025-12-05T20:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.526048 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.526091 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.526104 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.526119 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.526127 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:22Z","lastTransitionTime":"2025-12-05T20:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.629180 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.629219 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.629229 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.629244 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.629254 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:22Z","lastTransitionTime":"2025-12-05T20:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.732116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.732186 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.732199 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.732223 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.732234 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:22Z","lastTransitionTime":"2025-12-05T20:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.835203 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.835243 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.835252 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.835268 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.835278 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:22Z","lastTransitionTime":"2025-12-05T20:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.839669 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:43:22 crc kubenswrapper[4747]: E1205 20:43:22.839834 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.938772 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.938844 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.938866 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.938893 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:22 crc kubenswrapper[4747]: I1205 20:43:22.938910 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:22Z","lastTransitionTime":"2025-12-05T20:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.041215 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.041257 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.041270 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.041285 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.041296 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:23Z","lastTransitionTime":"2025-12-05T20:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.144700 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.144757 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.144774 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.144799 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.144816 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:23Z","lastTransitionTime":"2025-12-05T20:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.247895 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.247952 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.247973 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.248000 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.248026 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:23Z","lastTransitionTime":"2025-12-05T20:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.351014 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.351063 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.351080 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.351103 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.351119 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:23Z","lastTransitionTime":"2025-12-05T20:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.454056 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.454114 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.454126 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.454143 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.454186 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:23Z","lastTransitionTime":"2025-12-05T20:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.557020 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.557093 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.557110 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.557129 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.557144 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:23Z","lastTransitionTime":"2025-12-05T20:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.659883 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.659960 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.659985 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.660018 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.660041 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:23Z","lastTransitionTime":"2025-12-05T20:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.764103 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.764469 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.764640 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.764797 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.764919 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:23Z","lastTransitionTime":"2025-12-05T20:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.839877 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:43:23 crc kubenswrapper[4747]: E1205 20:43:23.840327 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.840132 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49"
Dec 05 20:43:23 crc kubenswrapper[4747]: E1205 20:43:23.840831 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.840047 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:43:23 crc kubenswrapper[4747]: E1205 20:43:23.841149 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.868075 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.868139 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.868156 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.868180 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.868197 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:23Z","lastTransitionTime":"2025-12-05T20:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.970641 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.970702 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.970722 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.970749 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:23 crc kubenswrapper[4747]: I1205 20:43:23.970766 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:23Z","lastTransitionTime":"2025-12-05T20:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.072836 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.073012 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.073086 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.073159 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.073228 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:24Z","lastTransitionTime":"2025-12-05T20:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.174985 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.175164 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.175245 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.175324 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.175393 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:24Z","lastTransitionTime":"2025-12-05T20:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.277410 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.277862 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.278004 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.278174 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.278309 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:24Z","lastTransitionTime":"2025-12-05T20:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.382034 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.382369 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.382499 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.382657 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.382797 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:24Z","lastTransitionTime":"2025-12-05T20:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.485763 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.485808 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.485821 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.485836 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.485845 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:24Z","lastTransitionTime":"2025-12-05T20:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.589141 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.589209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.589282 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.589313 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.589334 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:24Z","lastTransitionTime":"2025-12-05T20:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.691683 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.691746 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.691765 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.691790 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.691817 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:24Z","lastTransitionTime":"2025-12-05T20:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.795875 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.796070 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.796110 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.796147 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.796176 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:24Z","lastTransitionTime":"2025-12-05T20:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.839752 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:43:24 crc kubenswrapper[4747]: E1205 20:43:24.839941 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
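The same five-record burst (four node events plus one condition write) repeats here roughly every 100 ms, which makes the journal hard to scan for anything else. A throwaway filter, hypothetical and not part of the node, that reads journal text on stdin and collapses the noise into per-second counts:

import collections
import re
import sys

# Group kubelet records by second and message kind. Hypothetical usage:
#   journalctl -u kubelet | python3 summarize_notready.py
pattern = re.compile(
    r'^(\w{3} \d{2} \d{2}:\d{2}:\d{2}) .*?'
    r'(NodeNotReady|Node became not ready|Error syncing pod|No sandbox for pod)'
)
counts = collections.Counter()
for line in sys.stdin:
    m = pattern.search(line)
    if m:
        counts[(m.group(1), m.group(2))] += 1

for (stamp, kind), n in sorted(counts.items()):
    print(f"{stamp}  {kind}: {n}")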
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.900253 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.900377 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.900448 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.900519 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:24 crc kubenswrapper[4747]: I1205 20:43:24.900546 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:24Z","lastTransitionTime":"2025-12-05T20:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.003374 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.003446 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.003470 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.003500 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.003518 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:25Z","lastTransitionTime":"2025-12-05T20:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.106341 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.106410 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.106431 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.106454 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.106471 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:25Z","lastTransitionTime":"2025-12-05T20:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.209624 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.209690 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.209711 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.209735 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.209751 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:25Z","lastTransitionTime":"2025-12-05T20:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.312546 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.312648 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.312672 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.312717 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.312738 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:25Z","lastTransitionTime":"2025-12-05T20:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.415788 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.415872 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.415909 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.415949 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.415985 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:25Z","lastTransitionTime":"2025-12-05T20:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.519075 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.519138 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.519160 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.519189 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.519211 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:25Z","lastTransitionTime":"2025-12-05T20:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.622318 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.622390 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.622416 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.622445 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.622466 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:25Z","lastTransitionTime":"2025-12-05T20:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.725822 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.725891 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.725909 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.725933 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.725953 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:25Z","lastTransitionTime":"2025-12-05T20:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.829323 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.829393 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.829420 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.829449 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.829473 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:25Z","lastTransitionTime":"2025-12-05T20:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.839134 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.839193 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:25 crc kubenswrapper[4747]: E1205 20:43:25.839281 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.839337 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:25 crc kubenswrapper[4747]: E1205 20:43:25.839423 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:25 crc kubenswrapper[4747]: E1205 20:43:25.839879 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.937222 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.937288 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.937307 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.937351 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:25 crc kubenswrapper[4747]: I1205 20:43:25.937370 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:25Z","lastTransitionTime":"2025-12-05T20:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.041482 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.042503 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.042720 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.042942 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.043152 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:26Z","lastTransitionTime":"2025-12-05T20:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.145881 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.145918 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.145927 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.145943 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.145953 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:26Z","lastTransitionTime":"2025-12-05T20:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.248715 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.248763 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.248778 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.248798 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.248812 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:26Z","lastTransitionTime":"2025-12-05T20:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.350690 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.350759 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.350778 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.350804 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.350823 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:26Z","lastTransitionTime":"2025-12-05T20:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.453458 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.453510 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.453529 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.453549 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.453560 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:26Z","lastTransitionTime":"2025-12-05T20:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.557052 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.557233 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.557255 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.557282 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.557301 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:26Z","lastTransitionTime":"2025-12-05T20:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.659866 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.659904 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.659917 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.659932 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.659943 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:26Z","lastTransitionTime":"2025-12-05T20:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.762445 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.762514 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.762539 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.762568 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.762619 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:26Z","lastTransitionTime":"2025-12-05T20:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.839854 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:26 crc kubenswrapper[4747]: E1205 20:43:26.840154 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.841510 4747 scope.go:117] "RemoveContainer" containerID="4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.865316 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.865386 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.865408 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.865435 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.865458 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:26Z","lastTransitionTime":"2025-12-05T20:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.969661 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.970133 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.970146 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.970166 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:26 crc kubenswrapper[4747]: I1205 20:43:26.970254 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:26Z","lastTransitionTime":"2025-12-05T20:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.073539 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.073630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.073648 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.073672 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.073690 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:27Z","lastTransitionTime":"2025-12-05T20:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.176708 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.176769 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.176792 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.176821 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.176844 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:27Z","lastTransitionTime":"2025-12-05T20:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.279866 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.279923 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.279942 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.279965 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.279985 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:27Z","lastTransitionTime":"2025-12-05T20:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.382761 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.382819 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.382838 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.382862 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.382878 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:27Z","lastTransitionTime":"2025-12-05T20:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.486174 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.486251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.486273 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.486305 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.486329 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:27Z","lastTransitionTime":"2025-12-05T20:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.589569 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.589657 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.589685 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.589703 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.589714 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:27Z","lastTransitionTime":"2025-12-05T20:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.693513 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.693573 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.693617 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.693644 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.693663 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:27Z","lastTransitionTime":"2025-12-05T20:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.796975 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.797018 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.797028 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.797045 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.797059 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:27Z","lastTransitionTime":"2025-12-05T20:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.839739 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.839806 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.839775 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:27 crc kubenswrapper[4747]: E1205 20:43:27.839927 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:27 crc kubenswrapper[4747]: E1205 20:43:27.840106 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:27 crc kubenswrapper[4747]: E1205 20:43:27.840271 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.900176 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.900232 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.900244 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.900267 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:27 crc kubenswrapper[4747]: I1205 20:43:27.900281 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:27Z","lastTransitionTime":"2025-12-05T20:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.003020 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.003062 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.003072 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.003088 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.003099 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:28Z","lastTransitionTime":"2025-12-05T20:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.134508 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.135171 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.135194 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.135215 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.135229 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:28Z","lastTransitionTime":"2025-12-05T20:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.237652 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.237716 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.237744 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.237773 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.237794 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:28Z","lastTransitionTime":"2025-12-05T20:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.318032 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovnkube-controller/2.log" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.321275 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerStarted","Data":"dd7d05384fe0bfefedb945df13f483fc5367a892cfff74488f7eb9ffc7116a7b"} Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.321938 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.343003 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dbfca38-f67c-4fe9-8ddc-91968241cb96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0522976ef766cb6000f9319c27956583b93806739df3751a7ed5c3a41622a342\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f58411b662e4ecb7dc98f386e073bd4a4e64d8aed10843e0db0f98c11f5a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c54373ee85013b735406d77d1ff5cfb1bb
e2e2850a11fee14473795893a426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.345267 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.345363 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.345378 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.345401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.345415 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:28Z","lastTransitionTime":"2025-12-05T20:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.360748 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.380112 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.396965 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.413574 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"start
edAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.1
68.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.426980 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.448557 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.448627 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.448639 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.448659 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.448552 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a07afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.448672 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:28Z","lastTransitionTime":"2025-12-05T20:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.461539 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.478283 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.490257 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.507112 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.522344 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.536909 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a87e5fe95941f4d6b97fdfa08ba6fcbfb2caca21d727ec4777f0e288c9797cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:43:15Z\\\",\\\"message\\\":\\\"2025-12-05T20:42:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5caea8b1-fddc-44f3-baa0-e068adfa916f\\\\n2025-12-05T20:42:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5caea8b1-fddc-44f3-baa0-e068adfa916f to /host/opt/cni/bin/\\\\n2025-12-05T20:42:30Z [verbose] multus-daemon started\\\\n2025-12-05T20:42:30Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:43:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:43:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.552008 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.552090 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.552109 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.552134 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.552154 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:28Z","lastTransitionTime":"2025-12-05T20:43:28Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.558083 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7d05384fe0bfefedb945df13f483fc5367a892cfff74488f7eb9ffc7116a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:57Z\\\",\\\"message\\\":\\\":mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750337 6357 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750937 6357 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.751014 6357 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-7lblw\\\\nI1205 
20:42:57.7510\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\
":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.572202 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.582717 4747 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.594326 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 
20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.612487 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:28Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.654510 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.654569 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.654611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.654633 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.654648 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:28Z","lastTransitionTime":"2025-12-05T20:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.757271 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.757307 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.757320 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.757337 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.757349 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:28Z","lastTransitionTime":"2025-12-05T20:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.839569 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:28 crc kubenswrapper[4747]: E1205 20:43:28.839812 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.860817 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.860868 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.860882 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.860904 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.860922 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:28Z","lastTransitionTime":"2025-12-05T20:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.964006 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.964071 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.964098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.964130 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:28 crc kubenswrapper[4747]: I1205 20:43:28.964154 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:28Z","lastTransitionTime":"2025-12-05T20:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.067205 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.067265 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.067291 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.067319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.067562 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:29Z","lastTransitionTime":"2025-12-05T20:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.170421 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.170511 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.170533 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.170566 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.170622 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:29Z","lastTransitionTime":"2025-12-05T20:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.273792 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.273874 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.273898 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.273929 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.273952 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:29Z","lastTransitionTime":"2025-12-05T20:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.327226 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovnkube-controller/3.log" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.327748 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovnkube-controller/2.log" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.330197 4747 generic.go:334] "Generic (PLEG): container finished" podID="4881e707-c00c-4e49-8e30-a17719e80915" containerID="dd7d05384fe0bfefedb945df13f483fc5367a892cfff74488f7eb9ffc7116a7b" exitCode=1 Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.330234 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerDied","Data":"dd7d05384fe0bfefedb945df13f483fc5367a892cfff74488f7eb9ffc7116a7b"} Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.330269 4747 scope.go:117] "RemoveContainer" containerID="4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.330978 4747 scope.go:117] "RemoveContainer" containerID="dd7d05384fe0bfefedb945df13f483fc5367a892cfff74488f7eb9ffc7116a7b" Dec 05 20:43:29 crc kubenswrapper[4747]: E1205 20:43:29.331142 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.351393 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dbfca38-f67c-4fe9-8ddc-91968241cb96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0522976ef766cb6000f9319c27956583b93806739df3751a7ed5c3a41622a342\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f58411b662e4ecb7dc98f386e073bd4a4e64d8aed10843e0db0f98c11f5a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c54373ee85013b735406d77d1ff5cfb1bbe2e2850a11fee14473795893a426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.365149 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.376736 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.376769 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.376780 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.376796 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.376806 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:29Z","lastTransitionTime":"2025-12-05T20:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.382627 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.392488 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.421204 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.436207 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.452107 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.466732 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.479030 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.479077 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.479092 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.479112 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.479125 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:29Z","lastTransitionTime":"2025-12-05T20:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.482410 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.500065 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.515380 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.537868 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7d05384fe0bfefedb945df13f483fc5367a892
cfff74488f7eb9ffc7116a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:57Z\\\",\\\"message\\\":\\\":mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750337 6357 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750937 6357 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.751014 6357 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-7lblw\\\\nI1205 20:42:57.7510\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7d05384fe0bfefedb945df13f483fc5367a892cfff74488f7eb9ffc7116a7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:43:28Z\\\",\\\"message\\\":\\\" 6720 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 20:43:28.363060 6720 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:43:28.363133 6720 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:43:28.363158 6720 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:43:28.363216 6720 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:43:28.363232 6720 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:43:28.363237 6720 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:43:28.363246 6720 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:43:28.363307 6720 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:43:28.363321 6720 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:43:28.363375 6720 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:43:28.363397 6720 handler.go:208] Removed *v1.NetworkPolicy event 
handler 4\\\\nI1205 20:43:28.363443 6720 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:43:28.363472 6720 factory.go:656] Stopping watch factory\\\\nI1205 20:43:28.363613 6720 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:43:28.363530 6720 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:43:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.552058 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.565114 4747 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.577743 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 
20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.581607 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.581652 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.581667 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.581688 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.581701 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:29Z","lastTransitionTime":"2025-12-05T20:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.590436 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 
20:43:29.601021 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.612173 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a87e5fe95941f4d6b97fdfa08ba6fcbfb2caca21d727ec4777f0e288c9797cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:43:15Z\\\",\\\"message\\\":\\\"2025-12-05T20:42:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5caea8b1-fddc-44f3-baa0-e068adfa916f\\\\n2025-12-05T20:42:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5caea8b1-fddc-44f3-baa0-e068adfa916f to /host/opt/cni/bin/\\\\n2025-12-05T20:42:30Z [verbose] multus-daemon started\\\\n2025-12-05T20:42:30Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:43:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:43:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z"
Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.683748 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.683784 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.683793 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.683806 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
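Every "Failed to update status for pod" record in this stretch dies the same way: the kubelet's status PATCH has to clear the pod.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743/pod, and that webhook's serving certificate expired on 2025-08-24T17:21:41Z, more than three months before the node clock of 2025-12-05T20:43:29Z. A minimal diagnostic sketch for confirming the expiry, assuming Python 3 is available on the node (the host and port are taken from the records above; the script itself is illustrative and appears nowhere in the journal):

import socket
import ssl

# Endpoint copied from the failed webhook calls above; it is loopback-only,
# so this has to run on the node itself.
HOST, PORT = "127.0.0.1", 9743

# The serving certificate is expired, so verification must be disabled just
# to retrieve it; this is a read-only diagnostic, not a trusted connection.
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

with socket.create_connection((HOST, PORT), timeout=5) as raw:
    with ctx.wrap_socket(raw) as tls:
        der = tls.getpeercert(binary_form=True)

# Pipe the printed PEM through `openssl x509 -noout -dates`; notAfter should
# match the 2025-08-24T17:21:41Z cited in every x509 error in this journal.
print(ssl.DER_cert_to_PEM_cert(der))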
Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.683814 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:29Z","lastTransitionTime":"2025-12-05T20:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.786999 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.787081 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.787105 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.787137 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.787161 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:29Z","lastTransitionTime":"2025-12-05T20:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.839794 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.839885 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:43:29 crc kubenswrapper[4747]: E1205 20:43:29.840129 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.840225 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49"
Dec 05 20:43:29 crc kubenswrapper[4747]: E1205 20:43:29.840411 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
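The surrounding records show a second, independent failure: the node keeps reporting NotReady because no CNI configuration exists in /etc/kubernetes/cni/net.d/. kube-multus already restarted once (the exit code 1 termination above) after timing out on the ovn-kubernetes readiness file /host/run/multus/cni/net.d/10-ovn-kubernetes.conf, and until a CNI config appears, sandbox-less pods such as network-check-target-xd92c stay stuck in ContainerCreating. When triaging a dump like this one, it helps to tally the failed status patches per pod and the expiry each one cites; a single notAfter value means one certificate explains them all. A sketch, assuming Python 3.8+ and that the journal was saved one record per line as kubelet.log (the filename and the helper itself are hypothetical, not part of this log):

import re
from collections import Counter

# Hypothetical triage helper for a journalctl dump of this log, one record
# per line; neither the filename nor this script appears in the journal.
POD = re.compile(r'"Failed to update status for pod" pod="([^"]+)"')
EXPIRY = re.compile(r'current time \S+ is after (\S+)"')

pods, expiries = Counter(), Counter()
with open("kubelet.log", encoding="utf-8") as fh:
    for record in fh:
        if (m := POD.search(record)):
            pods[m.group(1)] += 1
        if (m := EXPIRY.search(record)):
            expiries[m.group(1)] += 1

print(pods.most_common())  # which pods keep failing the status patch
print(dict(expiries))      # a single key here implicates one webhook cert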
pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.859481 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dbfca38-f67c-4fe9-8ddc-91968241cb96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0522976ef766cb6000f9319c27956583b93806739df3751a7ed5c3a41622a342\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f58411b662e4ecb7dc98f386e073bd4a4e64d8aed10843e0db0f98c11f5a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c54373ee85013b735406d77d1ff5cfb1bbe2e2850a11fee14473795893a426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.880899 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.890142 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.890287 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.890308 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.890333 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.890394 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:29Z","lastTransitionTime":"2025-12-05T20:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.918176 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.935314 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.967816 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.981467 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.994030 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.994069 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.994100 4747 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.994118 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.994128 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:29Z","lastTransitionTime":"2025-12-05T20:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:29 crc kubenswrapper[4747]: I1205 20:43:29.994650 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:29Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.007625 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.022348 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.033530 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.048142 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.079866 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7d05384fe0bfefedb945df13f483fc5367a892
cfff74488f7eb9ffc7116a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb21195a3e02c5b18dfc7ec0938f661b9afe9ff01da2526bfd84fbabcc7dd4e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:42:57Z\\\",\\\"message\\\":\\\":mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750337 6357 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.750937 6357 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:42:57.751014 6357 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-7lblw\\\\nI1205 20:42:57.7510\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7d05384fe0bfefedb945df13f483fc5367a892cfff74488f7eb9ffc7116a7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:43:28Z\\\",\\\"message\\\":\\\" 6720 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 20:43:28.363060 6720 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:43:28.363133 6720 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:43:28.363158 6720 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:43:28.363216 6720 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:43:28.363232 6720 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:43:28.363237 6720 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:43:28.363246 6720 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:43:28.363307 6720 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:43:28.363321 6720 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:43:28.363375 6720 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:43:28.363397 6720 handler.go:208] Removed *v1.NetworkPolicy event 
handler 4\\\\nI1205 20:43:28.363443 6720 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:43:28.363472 6720 factory.go:656] Stopping watch factory\\\\nI1205 20:43:28.363613 6720 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:43:28.363530 6720 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:43:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.096244 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.099373 4747 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.099425 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.099441 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.099465 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.099479 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:30Z","lastTransitionTime":"2025-12-05T20:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.114281 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.129417 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.146313 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.161519 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.180788 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a87e5fe95941f4d6b97fdfa08ba6fcbfb2caca21d727ec4777f0e288c9797cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:43:15Z\\\",\\\"message\\\":\\\"2025-12-05T20:42:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5caea8b1-fddc-44f3-baa0-e068adfa916f\\\\n2025-12-05T20:42:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5caea8b1-fddc-44f3-baa0-e068adfa916f to /host/opt/cni/bin/\\\\n2025-12-05T20:42:30Z [verbose] multus-daemon started\\\\n2025-12-05T20:42:30Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:43:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:43:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.202736 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.202820 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.202844 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.202878 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.202900 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:30Z","lastTransitionTime":"2025-12-05T20:43:30Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.240878 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.240984 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.241015 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.241043 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.241062 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:30Z","lastTransitionTime":"2025-12-05T20:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:30 crc kubenswrapper[4747]: E1205 20:43:30.266443 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.273420 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.273527 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.273547 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.273670 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.273691 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:30Z","lastTransitionTime":"2025-12-05T20:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:30 crc kubenswrapper[4747]: E1205 20:43:30.294072 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.300488 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.300536 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.300548 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.300565 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.300597 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:30Z","lastTransitionTime":"2025-12-05T20:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:30 crc kubenswrapper[4747]: E1205 20:43:30.316006 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.320505 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.320553 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.320570 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.320615 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.320634 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:30Z","lastTransitionTime":"2025-12-05T20:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:30 crc kubenswrapper[4747]: E1205 20:43:30.335359 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.336774 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovnkube-controller/3.log" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.339917 4747 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.339982 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.339995 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.340010 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.340022 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:30Z","lastTransitionTime":"2025-12-05T20:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.340467 4747 scope.go:117] "RemoveContainer" containerID="dd7d05384fe0bfefedb945df13f483fc5367a892cfff74488f7eb9ffc7116a7b" Dec 05 20:43:30 crc kubenswrapper[4747]: E1205 20:43:30.340680 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" Dec 05 20:43:30 crc kubenswrapper[4747]: E1205 20:43:30.353998 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56302b96-a896-482d-b012-96ecc8d111e7\\\",\\\"systemUUID\\\":\\\"78337cb5-96e9-4698-b089-53cf0e34a059\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: E1205 20:43:30.354468 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.357178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.357238 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.357255 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.357279 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.357295 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:30Z","lastTransitionTime":"2025-12-05T20:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.357527 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dbfca38-f67c-4fe9-8ddc-91968241cb96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0522976ef766cb6000f9319c27956583b93806739df3751a7ed5c3a41622a342\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f58411b662e4ecb7dc98f386e073bd4a4e64d8aed10843e0db0f98c11f5a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c54373ee85013b735406d77d1ff5cfb1bbe2e2850a11fee14473795893a426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97fec1c79cc217aaa2f6be124433c7efb3bf9f5584b2773b0baeb747ef4ff0ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.379734 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e9d6586-09af-4144-8e5d-01ad9fab33d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f9ef21727b510e0bbb7b8d8e3128ce5eda54b615cd047bbe7b3574ef117f836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97043a2137a85c2c6582cb6fe88dbffa7223b107dc34fc94a56ff6c13b9838d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90c044d942e49029c2a42a1b1285c78bc96a266a7fec00f5bd1eb64de3999daa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c4055d3666d576e9ee7ebe2596c4f7f7b5852c2660bbfa6b27681781d0251df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948d9622772454d68253cf12a900ac6eac1160c4096a934b9518d530e0b15ac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd7a038b9ad83d7377ae4bf3e23d0698a7c76357da58652585b04bd7a07ea7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://729d1510659c7cc9a839ff264781d68677fe5f507535d71261c0d5a2e4ebf4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zcn6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.393097 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dcr49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e860ee9-69f5-44a1-b414-deab4f78dd0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j44xn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dcr49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.417692 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6d0cee3-1e9f-4ba4-a1c9-8729b03f3d5a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a630644c5368eadaddb1d645be14c552075b285c4d91ef6b16b777fa5e9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aa4ea74d209fe95495dc3e2f454fe6e8402861e8e82b9dfd73fde7a0ee4db77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb68036702123836e864e568e40d8f4bacefb01c2578ff82a0569045524e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://673a0dd6b5f081e7db0f2d9bf047065fbb9f7a0
7afc46015ffddcc707b682393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc561d58c189785a016342733e84c02b52febd411506356a820bf8ce465ee37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff42697861fe54e287f4c06ddfcc84084cab93adafb9dbe0dc17c326514bff99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61927b52857a06fd6ad291ebf46b78c52a1026a2ac33560388748c55bdf97557\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc47c47daacfe140847cb3b9756933f3e2e975d129ae9ae290ad8cca2046d3e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.434350 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45352ea0-a197-4cb5-b623-7fde887ddba0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa2a47a58f8218d2d7f2a570fbe6235dd876056f783c4a0b7cd5806e945fbd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8837860f74bf82e0fdc185f0b43cee7c839d6c643998d8d4d1b6a6203fd605f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a11fe8848fee821ca99fcdf0c04542e027a02e46ba1dcaaa61a4f30497d8b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.451611 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.460036 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.460089 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.460104 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.460124 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.460165 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:30Z","lastTransitionTime":"2025-12-05T20:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.469535 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37c49a8a8b481e22ef699d794883d60851afe43d5aaf0b36ce620b1dca72a5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdf6ca766e926ce5fb45ec6993b32b778c989c07593536c2c33c354aa02a5a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.490357 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaa31191-71db-4021-96c8-0080ef901122\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW1205 20:42:27.446600 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1205 20:42:27.446812 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:42:27.449024 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1603772465/tls.crt::/tmp/serving-cert-1603772465/tls.key\\\\\\\"\\\\nI1205 20:42:27.847921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:42:27.850248 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:42:27.850266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:42:27.850284 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:42:27.850289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:42:27.854888 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 20:42:27.854909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854914 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:42:27.854918 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:42:27.854921 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:42:27.854924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:42:27.854926 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 20:42:27.855075 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 20:42:27.856798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.508436 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb339f156ff1e1bdbef9d0b244b8bc38d2c3501fe7529696f644d4d8e58acc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.528536 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.546377 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.563389 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.563748 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.563816 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.563939 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.564019 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:30Z","lastTransitionTime":"2025-12-05T20:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.563661 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85ba28a1-00e9-438e-9b47-6537f75121bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d57fd7f87d7dbc7a965bea457e92e7df37e45e8f5523c215433848cd2ff42b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4hrls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7lblw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.579212 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wbt7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92e0fc38-67be-4f7f-8fdb-187ce47fc0d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e966bcda353b0701a3c271efce6fcb2783c904ea2febbdde2a81c8974f695e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sw5kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wbt7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.598203 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14e1a6fd-3dc9-4ea2-b14f-afb176512c74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96b5b3e0cfe780147ee2a1267a467bebc75750757546d6775273ba912fb6fe0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e014d2bbaf3feedafca433102af5615176385e65d92196c9fde5a6bb45f4732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgkdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gqg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 
20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.620391 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de9cd47a97bdc83cf65c4902e6332ffbafafcc178a664f112e2be083881d5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.635718 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fql7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf47d5-9d28-4d4a-bbca-ad6c4bb9f3d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc40786766a124cf65051a1c67cf805518ad4c53152a780cf209ce57a737588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5rrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fql7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.657181 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nm4fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f1e522-a732-4821-b7b0-6f1b6670c1d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:43:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a87e5fe95941f4d6b97fdfa08ba6fcbfb2caca21d727ec4777f0e288c9797cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:43:15Z\\\",\\\"message\\\":\\\"2025-12-05T20:42:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5caea8b1-fddc-44f3-baa0-e068adfa916f\\\\n2025-12-05T20:42:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5caea8b1-fddc-44f3-baa0-e068adfa916f to /host/opt/cni/bin/\\\\n2025-12-05T20:42:30Z [verbose] multus-daemon started\\\\n2025-12-05T20:42:30Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:43:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:43:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5j2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nm4fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.667477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.667525 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.667541 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.667560 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.667575 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:30Z","lastTransitionTime":"2025-12-05T20:43:30Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.681773 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4881e707-c00c-4e49-8e30-a17719e80915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:42:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7d05384fe0bfefedb945df13f483fc5367a892cfff74488f7eb9ffc7116a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7d05384fe0bfefedb945df13f483fc5367a892cfff74488f7eb9ffc7116a7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:43:28Z\\\",\\\"message\\\":\\\" 6720 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 20:43:28.363060 6720 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:43:28.363133 6720 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:43:28.363158 6720 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:43:28.363216 6720 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:43:28.363232 6720 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:43:28.363237 6720 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:43:28.363246 6720 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:43:28.363307 6720 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:43:28.363321 6720 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:43:28.363375 6720 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:43:28.363397 6720 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:43:28.363443 6720 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:43:28.363472 6720 factory.go:656] Stopping watch factory\\\\nI1205 20:43:28.363613 6720 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:43:28.363530 6720 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 
20:43:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:43:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6rn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:42:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kf4wd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:43:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.772664 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.772747 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.772797 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.772834 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.772859 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:30Z","lastTransitionTime":"2025-12-05T20:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.839702 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:30 crc kubenswrapper[4747]: E1205 20:43:30.839922 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.876700 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.876771 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.876787 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.876812 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.876828 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:30Z","lastTransitionTime":"2025-12-05T20:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.979896 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.979972 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.979996 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.980031 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:30 crc kubenswrapper[4747]: I1205 20:43:30.980054 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:30Z","lastTransitionTime":"2025-12-05T20:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.082746 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.082813 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.082836 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.082865 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.082886 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:31Z","lastTransitionTime":"2025-12-05T20:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.186265 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.186329 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.186343 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.186364 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.186374 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:31Z","lastTransitionTime":"2025-12-05T20:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.289183 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.289676 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.289701 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.289733 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.289758 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:31Z","lastTransitionTime":"2025-12-05T20:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.392843 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.392924 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.392947 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.392979 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.393003 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:31Z","lastTransitionTime":"2025-12-05T20:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.495950 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.496004 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.496023 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.496049 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.496071 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:31Z","lastTransitionTime":"2025-12-05T20:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.599045 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.599096 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.599113 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.599139 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.599158 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:31Z","lastTransitionTime":"2025-12-05T20:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.701662 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.701720 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.701735 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.701753 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.701768 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:31Z","lastTransitionTime":"2025-12-05T20:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.741635 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 20:43:31.741779 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 20:43:31.741845 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:44:35.741828052 +0000 UTC m=+146.209135550 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.804984 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.805026 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.805037 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.805054 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.805066 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:31Z","lastTransitionTime":"2025-12-05T20:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.839462 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 20:43:31.839673 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.839938 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.840001 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 20:43:31.840104 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 20:43:31.840319 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
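The nestedpendingoperations entries above show the kubelet's per-volume retry gate: after repeated failures, the operation is parked and no retries are permitted for the logged duration (here 1m4s). A minimal sketch of that doubling-with-a-cap backoff follows; the constants are assumptions (commonly cited kubelet defaults of a 500 ms initial delay, factor 2, capped near 2m2s), not values read from this log:

    # Sketch of exponential per-operation backoff, as suggested by the
    # "durationBeforeRetry 1m4s" fields above. The constants are assumed
    # defaults, not taken from this log.
    from datetime import datetime, timedelta

    INITIAL_DELAY = timedelta(milliseconds=500)  # assumed initial backoff
    MAX_DELAY = timedelta(minutes=2, seconds=2)  # assumed cap

    class OperationBackoff:
        def __init__(self):
            self.delay = timedelta(0)
            self.retry_after = datetime.min

        def record_failure(self, now):
            # Double the delay on every failure, up to the cap.
            self.delay = min(self.delay * 2 if self.delay else INITIAL_DELAY, MAX_DELAY)
            self.retry_after = now + self.delay

        def may_retry(self, now):
            return now >= self.retry_after

    backoff = OperationBackoff()
    now = datetime(2025, 12, 5, 20, 43, 31)
    for _ in range(8):       # eight consecutive failures
        backoff.record_failure(now)
    print(backoff.delay)     # 0:01:04 -- matches the 1m4s in the log

Under these assumed constants, eight consecutive failures of the same volume operation land exactly on the 64-second hold seen above.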
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.842196 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.842374 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 20:43:31.842403 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:35.842371439 +0000 UTC m=+146.309678967 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.842452 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.842525 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 20:43:31.842559 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 20:43:31.842626 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 20:43:31.842653 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 
20:43:31.842765 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:44:35.842705297 +0000 UTC m=+146.310012875 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 20:43:31.842873 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 20:43:31.842903 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 20:43:31.842924 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 20:43:31.842973 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:44:35.842958753 +0000 UTC m=+146.310266271 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 20:43:31.843059 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:43:31 crc kubenswrapper[4747]: E1205 20:43:31.843135 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:44:35.843112907 +0000 UTC m=+146.310420405 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.907464 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.907519 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.907542 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.907572 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:31 crc kubenswrapper[4747]: I1205 20:43:31.907639 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:31Z","lastTransitionTime":"2025-12-05T20:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.011260 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.011331 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.011347 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.011371 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.011387 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:32Z","lastTransitionTime":"2025-12-05T20:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
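The repeated object "..." not registered failures above come from the kubelet layer that caches secrets and configmaps per pod: lookups succeed only for objects the manager has been told a pod needs, so until the pod is (re)registered every GetSecret/GetConfigMap call fails and the mount is requeued. A hypothetical register-before-get cache illustrating the pattern; the class and method names below are invented for illustration and are not kubelet APIs:

    # Hypothetical illustration of a register-before-get object manager,
    # the pattern behind the "object ... not registered" errors above.
    # Class and method names are invented; they are not kubelet APIs.
    class ObjectManager:
        def __init__(self):
            self._registered = {}   # (namespace, name) -> cached object

        def register_pod_objects(self, namespace, names, fetch):
            # Called when a pod is admitted: remember which objects it needs.
            for name in names:
                self._registered[(namespace, name)] = fetch(namespace, name)

        def get(self, namespace, name):
            key = (namespace, name)
            if key not in self._registered:
                # This is the failure mode seen in the log above.
                raise KeyError(f'object "{namespace}"/"{name}" not registered')
            return self._registered[key]

    mgr = ObjectManager()
    try:
        mgr.get("openshift-network-diagnostics", "kube-root-ca.crt")
    except KeyError as err:
        print(err.args[0])  # object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered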
Has your network provider started?"} Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.114527 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.114611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.114671 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.114704 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.114722 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:32Z","lastTransitionTime":"2025-12-05T20:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.218079 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.218134 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.218154 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.218176 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.218194 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:32Z","lastTransitionTime":"2025-12-05T20:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.320735 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.320793 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.320810 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.320832 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.320850 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:32Z","lastTransitionTime":"2025-12-05T20:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.423615 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.423655 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.423664 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.423679 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.423691 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:32Z","lastTransitionTime":"2025-12-05T20:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.526443 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.526536 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.526560 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.526622 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.526660 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:32Z","lastTransitionTime":"2025-12-05T20:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.630148 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.630204 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.630239 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.630273 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.630294 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:32Z","lastTransitionTime":"2025-12-05T20:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.733370 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.733431 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.733446 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.733469 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.733485 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:32Z","lastTransitionTime":"2025-12-05T20:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.836131 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.836201 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.836218 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.836243 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.836265 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:32Z","lastTransitionTime":"2025-12-05T20:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.839437 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:32 crc kubenswrapper[4747]: E1205 20:43:32.839627 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.939244 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.939421 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.939467 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.939540 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:32 crc kubenswrapper[4747]: I1205 20:43:32.939566 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:32Z","lastTransitionTime":"2025-12-05T20:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.043196 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.043252 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.043271 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.043294 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.043311 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:33Z","lastTransitionTime":"2025-12-05T20:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.146375 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.146444 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.146464 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.146490 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.146510 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:33Z","lastTransitionTime":"2025-12-05T20:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.249555 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.249708 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.249726 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.249750 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.249766 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:33Z","lastTransitionTime":"2025-12-05T20:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.358147 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.358202 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.358250 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.358280 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.358302 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:33Z","lastTransitionTime":"2025-12-05T20:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.461145 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.461209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.461227 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.461250 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.461269 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:33Z","lastTransitionTime":"2025-12-05T20:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.564359 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.564451 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.564477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.564914 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.564934 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:33Z","lastTransitionTime":"2025-12-05T20:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.667660 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.667739 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.667762 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.667796 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.667824 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:33Z","lastTransitionTime":"2025-12-05T20:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.771431 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.771507 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.771529 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.771554 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.771573 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:33Z","lastTransitionTime":"2025-12-05T20:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.838916 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.838950 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.839232 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:33 crc kubenswrapper[4747]: E1205 20:43:33.839234 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:33 crc kubenswrapper[4747]: E1205 20:43:33.839705 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:33 crc kubenswrapper[4747]: E1205 20:43:33.839844 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.874776 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.874829 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.874847 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.874867 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.874884 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:33Z","lastTransitionTime":"2025-12-05T20:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.977162 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.977209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.977230 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.977252 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:33 crc kubenswrapper[4747]: I1205 20:43:33.977266 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:33Z","lastTransitionTime":"2025-12-05T20:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.079910 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.079967 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.079988 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.080012 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.080030 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:34Z","lastTransitionTime":"2025-12-05T20:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.184342 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.184425 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.184451 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.184478 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.184500 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:34Z","lastTransitionTime":"2025-12-05T20:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.288132 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.288179 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.288195 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.288218 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.288238 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:34Z","lastTransitionTime":"2025-12-05T20:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.391640 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.391725 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.391757 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.391786 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.391807 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:34Z","lastTransitionTime":"2025-12-05T20:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.494991 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.495060 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.495087 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.495120 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.495145 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:34Z","lastTransitionTime":"2025-12-05T20:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.598540 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.598649 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.598669 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.598697 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.598715 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:34Z","lastTransitionTime":"2025-12-05T20:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.701383 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.701458 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.701478 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.701504 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.701522 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:34Z","lastTransitionTime":"2025-12-05T20:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.804972 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.805080 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.805100 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.805126 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.805145 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:34Z","lastTransitionTime":"2025-12-05T20:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.839055 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:34 crc kubenswrapper[4747]: E1205 20:43:34.839416 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.908763 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.908835 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.908852 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.908877 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:34 crc kubenswrapper[4747]: I1205 20:43:34.908893 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:34Z","lastTransitionTime":"2025-12-05T20:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.012653 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.012718 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.012735 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.012761 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.012778 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:35Z","lastTransitionTime":"2025-12-05T20:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.115786 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.115839 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.115856 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.115883 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.115900 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:35Z","lastTransitionTime":"2025-12-05T20:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.219218 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.219279 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.219297 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.219323 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.219341 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:35Z","lastTransitionTime":"2025-12-05T20:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.323172 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.323249 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.323272 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.323300 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.323319 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:35Z","lastTransitionTime":"2025-12-05T20:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.425977 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.426041 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.426061 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.426090 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.426110 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:35Z","lastTransitionTime":"2025-12-05T20:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.529464 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.529576 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.529630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.529656 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.529673 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:35Z","lastTransitionTime":"2025-12-05T20:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.632567 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.632664 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.632686 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.632713 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.632730 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:35Z","lastTransitionTime":"2025-12-05T20:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.838902 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.838949 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49"
Dec 05 20:43:35 crc kubenswrapper[4747]: E1205 20:43:35.839067 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:43:35 crc kubenswrapper[4747]: E1205 20:43:35.839215 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d"
Dec 05 20:43:35 crc kubenswrapper[4747]: I1205 20:43:35.839269 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:43:35 crc kubenswrapper[4747]: E1205 20:43:35.839382 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:43:36 crc kubenswrapper[4747]: I1205 20:43:36.839036 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:43:36 crc kubenswrapper[4747]: E1205 20:43:36.839255 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:43:37 crc kubenswrapper[4747]: I1205 20:43:37.839024 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49"
Dec 05 20:43:37 crc kubenswrapper[4747]: I1205 20:43:37.839071 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:43:37 crc kubenswrapper[4747]: E1205 20:43:37.839207 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d"
Dec 05 20:43:37 crc kubenswrapper[4747]: I1205 20:43:37.839245 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:43:37 crc kubenswrapper[4747]: E1205 20:43:37.839399 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:43:37 crc kubenswrapper[4747]: E1205 20:43:37.839653 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:43:38 crc kubenswrapper[4747]: I1205 20:43:38.839779 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:43:38 crc kubenswrapper[4747]: E1205 20:43:38.840003 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:43:39 crc kubenswrapper[4747]: I1205 20:43:39.839875 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:43:39 crc kubenswrapper[4747]: I1205 20:43:39.839937 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:43:39 crc kubenswrapper[4747]: I1205 20:43:39.839997 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49"
Dec 05 20:43:39 crc kubenswrapper[4747]: E1205 20:43:39.840278 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d"
Dec 05 20:43:39 crc kubenswrapper[4747]: E1205 20:43:39.840424 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:43:39 crc kubenswrapper[4747]: E1205 20:43:39.840549 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:43:39 crc kubenswrapper[4747]: I1205 20:43:39.871946 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.871917875 podStartE2EDuration="1m12.871917875s" podCreationTimestamp="2025-12-05 20:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:43:39.871676189 +0000 UTC m=+90.338983727" watchObservedRunningTime="2025-12-05 20:43:39.871917875 +0000 UTC m=+90.339225393"
Dec 05 20:43:39 crc kubenswrapper[4747]: I1205 20:43:39.947504 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wbt7t" podStartSLOduration=71.947480817 podStartE2EDuration="1m11.947480817s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:43:39.94718064 +0000 UTC m=+90.414488148" watchObservedRunningTime="2025-12-05 20:43:39.947480817 +0000 UTC m=+90.414788305"
Dec 05 20:43:39 crc kubenswrapper[4747]: I1205 20:43:39.963708 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gqg7" podStartSLOduration=70.963683253 podStartE2EDuration="1m10.963683253s" podCreationTimestamp="2025-12-05 20:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:43:39.963414736 +0000 UTC m=+90.430722224" watchObservedRunningTime="2025-12-05 20:43:39.963683253 +0000 UTC m=+90.430990741"
Dec 05 20:43:39 crc kubenswrapper[4747]: I1205 20:43:39.990791 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fql7t" podStartSLOduration=72.990772199 podStartE2EDuration="1m12.990772199s" podCreationTimestamp="2025-12-05 20:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:43:39.990606125 +0000 UTC m=+90.457913633" watchObservedRunningTime="2025-12-05 20:43:39.990772199 +0000 UTC m=+90.458079687"
Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.041096 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nm4fd" podStartSLOduration=72.041076819 podStartE2EDuration="1m12.041076819s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:43:40.006882063 +0000 UTC m=+90.474189561" watchObservedRunningTime="2025-12-05 20:43:40.041076819 +0000 UTC m=+90.508384307"
Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.068813 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podStartSLOduration=72.06879452 podStartE2EDuration="1m12.06879452s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:43:40.055303658 +0000 UTC m=+90.522611146" watchObservedRunningTime="2025-12-05 20:43:40.06879452 +0000 UTC m=+90.536102018"
Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.080355 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=37.080328865 podStartE2EDuration="37.080328865s" podCreationTimestamp="2025-12-05 20:43:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:43:40.068611265 +0000 UTC m=+90.535918763" watchObservedRunningTime="2025-12-05 20:43:40.080328865 +0000 UTC m=+90.547636353"
Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.103911 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=72.103896267 podStartE2EDuration="1m12.103896267s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:43:40.103764464 +0000 UTC m=+90.571071952" watchObservedRunningTime="2025-12-05 20:43:40.103896267 +0000 UTC m=+90.571203755"
Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.134105 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=71.134084817 podStartE2EDuration="1m11.134084817s" podCreationTimestamp="2025-12-05 20:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:43:40.118915255 +0000 UTC m=+90.586222743" watchObservedRunningTime="2025-12-05 20:43:40.134084817 +0000 UTC m=+90.601392305"
Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.163558 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zcn6n" podStartSLOduration=72.163529479 podStartE2EDuration="1m12.163529479s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:43:40.162612567 +0000 UTC m=+90.629920055" watchObservedRunningTime="2025-12-05 20:43:40.163529479 +0000 UTC m=+90.630836967"
Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.223031 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.223069 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.223079 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.223092 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.223103 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:40Z","lastTransitionTime":"2025-12-05T20:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.429467 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.429530 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.429541 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.429560 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.429573 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:40Z","lastTransitionTime":"2025-12-05T20:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.533367 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.533424 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.533443 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.533467 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.533484 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:40Z","lastTransitionTime":"2025-12-05T20:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.637274 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.637357 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.637380 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.637407 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.637426 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:40Z","lastTransitionTime":"2025-12-05T20:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.740281 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.740342 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.740359 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.740384 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.740403 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:43:40Z","lastTransitionTime":"2025-12-05T20:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.804146 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt"] Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.804948 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.808029 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.812736 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.812910 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.813485 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.839082 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:40 crc kubenswrapper[4747]: E1205 20:43:40.839256 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
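
The podStartSLOduration figures recorded by pod_startup_latency_tracker above are the wall-clock gap between podCreationTimestamp and watchObservedRunningTime: for kube-apiserver-crc, 20:43:39.871917875 - 20:42:27 = 72.871917875s, i.e. the logged 1m12.871917875s. A minimal Python sketch of that arithmetic, with both timestamps copied from the entry above (Python's datetime only carries microseconds, so the nanosecond tail is truncated):

    from datetime import datetime, timezone

    # Timestamps from the pod_startup_latency_tracker entry for
    # openshift-kube-apiserver/kube-apiserver-crc above.
    created  = datetime(2025, 12, 5, 20, 42, 27, tzinfo=timezone.utc)
    observed = datetime(2025, 12, 5, 20, 43, 39, 871917, tzinfo=timezone.utc)

    slo = (observed - created).total_seconds()
    print(f"podStartSLOduration ~= {slo:.6f}s")  # 72.871917, logged as 72.871917875

firstStartedPulling/lastFinishedPulling are the Go zero time (0001-01-01) for these pods, presumably because the images were already present locally, so no pull interval enters the calculation.
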
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.947866 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9d9488e8-91d7-4dc9-a333-e736c6a17fd9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bkhqt\" (UID: \"9d9488e8-91d7-4dc9-a333-e736c6a17fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.947917 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d9488e8-91d7-4dc9-a333-e736c6a17fd9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bkhqt\" (UID: \"9d9488e8-91d7-4dc9-a333-e736c6a17fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.947941 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d9488e8-91d7-4dc9-a333-e736c6a17fd9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bkhqt\" (UID: \"9d9488e8-91d7-4dc9-a333-e736c6a17fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.948055 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d9488e8-91d7-4dc9-a333-e736c6a17fd9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bkhqt\" (UID: \"9d9488e8-91d7-4dc9-a333-e736c6a17fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:40 crc kubenswrapper[4747]: I1205 20:43:40.948083 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9d9488e8-91d7-4dc9-a333-e736c6a17fd9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bkhqt\" (UID: \"9d9488e8-91d7-4dc9-a333-e736c6a17fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.049188 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9d9488e8-91d7-4dc9-a333-e736c6a17fd9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bkhqt\" (UID: \"9d9488e8-91d7-4dc9-a333-e736c6a17fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.049240 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d9488e8-91d7-4dc9-a333-e736c6a17fd9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bkhqt\" (UID: \"9d9488e8-91d7-4dc9-a333-e736c6a17fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.049269 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d9488e8-91d7-4dc9-a333-e736c6a17fd9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bkhqt\" (UID: 
\"9d9488e8-91d7-4dc9-a333-e736c6a17fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.049344 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d9488e8-91d7-4dc9-a333-e736c6a17fd9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bkhqt\" (UID: \"9d9488e8-91d7-4dc9-a333-e736c6a17fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.049369 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9d9488e8-91d7-4dc9-a333-e736c6a17fd9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bkhqt\" (UID: \"9d9488e8-91d7-4dc9-a333-e736c6a17fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.049454 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9d9488e8-91d7-4dc9-a333-e736c6a17fd9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bkhqt\" (UID: \"9d9488e8-91d7-4dc9-a333-e736c6a17fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.049502 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9d9488e8-91d7-4dc9-a333-e736c6a17fd9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bkhqt\" (UID: \"9d9488e8-91d7-4dc9-a333-e736c6a17fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.050466 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d9488e8-91d7-4dc9-a333-e736c6a17fd9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bkhqt\" (UID: \"9d9488e8-91d7-4dc9-a333-e736c6a17fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.062733 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d9488e8-91d7-4dc9-a333-e736c6a17fd9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bkhqt\" (UID: \"9d9488e8-91d7-4dc9-a333-e736c6a17fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.073477 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d9488e8-91d7-4dc9-a333-e736c6a17fd9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bkhqt\" (UID: \"9d9488e8-91d7-4dc9-a333-e736c6a17fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.140766 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" Dec 05 20:43:41 crc kubenswrapper[4747]: W1205 20:43:41.162244 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d9488e8_91d7_4dc9_a333_e736c6a17fd9.slice/crio-1027b5fa35983d300f6141beefe2e8336a2ee038584e0f0b6d5c2fc4a2a1eb08 WatchSource:0}: Error finding container 1027b5fa35983d300f6141beefe2e8336a2ee038584e0f0b6d5c2fc4a2a1eb08: Status 404 returned error can't find the container with id 1027b5fa35983d300f6141beefe2e8336a2ee038584e0f0b6d5c2fc4a2a1eb08 Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.404036 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" event={"ID":"9d9488e8-91d7-4dc9-a333-e736c6a17fd9","Type":"ContainerStarted","Data":"9fb410c103af9c948bfcafe1fb9fb7e7f0e16bf3031a3c9c024d8c3aa97b6dbd"} Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.404129 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" event={"ID":"9d9488e8-91d7-4dc9-a333-e736c6a17fd9","Type":"ContainerStarted","Data":"1027b5fa35983d300f6141beefe2e8336a2ee038584e0f0b6d5c2fc4a2a1eb08"} Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.417440 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bkhqt" podStartSLOduration=73.417414753 podStartE2EDuration="1m13.417414753s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:43:41.416705536 +0000 UTC m=+91.884013034" watchObservedRunningTime="2025-12-05 20:43:41.417414753 +0000 UTC m=+91.884722251" Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.839286 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.839366 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:41 crc kubenswrapper[4747]: E1205 20:43:41.839922 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.839950 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:41 crc kubenswrapper[4747]: E1205 20:43:41.840105 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:41 crc kubenswrapper[4747]: E1205 20:43:41.840761 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:41 crc kubenswrapper[4747]: I1205 20:43:41.853897 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 05 20:43:42 crc kubenswrapper[4747]: I1205 20:43:42.838594 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:42 crc kubenswrapper[4747]: E1205 20:43:42.838723 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:43 crc kubenswrapper[4747]: I1205 20:43:43.839227 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:43 crc kubenswrapper[4747]: I1205 20:43:43.839347 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:43 crc kubenswrapper[4747]: I1205 20:43:43.839666 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:43 crc kubenswrapper[4747]: E1205 20:43:43.839834 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:43 crc kubenswrapper[4747]: E1205 20:43:43.839944 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:43 crc kubenswrapper[4747]: I1205 20:43:43.840188 4747 scope.go:117] "RemoveContainer" containerID="dd7d05384fe0bfefedb945df13f483fc5367a892cfff74488f7eb9ffc7116a7b" Dec 05 20:43:43 crc kubenswrapper[4747]: E1205 20:43:43.840182 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:43 crc kubenswrapper[4747]: E1205 20:43:43.840487 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" Dec 05 20:43:44 crc kubenswrapper[4747]: I1205 20:43:44.838930 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:44 crc kubenswrapper[4747]: E1205 20:43:44.839079 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:45 crc kubenswrapper[4747]: I1205 20:43:45.838904 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:45 crc kubenswrapper[4747]: I1205 20:43:45.839067 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:45 crc kubenswrapper[4747]: I1205 20:43:45.839292 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:45 crc kubenswrapper[4747]: E1205 20:43:45.839271 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:45 crc kubenswrapper[4747]: E1205 20:43:45.839467 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:45 crc kubenswrapper[4747]: E1205 20:43:45.839702 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:46 crc kubenswrapper[4747]: I1205 20:43:46.515386 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs\") pod \"network-metrics-daemon-dcr49\" (UID: \"1e860ee9-69f5-44a1-b414-deab4f78dd0d\") " pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:46 crc kubenswrapper[4747]: E1205 20:43:46.515924 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:43:46 crc kubenswrapper[4747]: E1205 20:43:46.516076 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs podName:1e860ee9-69f5-44a1-b414-deab4f78dd0d nodeName:}" failed. No retries permitted until 2025-12-05 20:44:50.51603179 +0000 UTC m=+160.983339308 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs") pod "network-metrics-daemon-dcr49" (UID: "1e860ee9-69f5-44a1-b414-deab4f78dd0d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:43:46 crc kubenswrapper[4747]: I1205 20:43:46.839472 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:46 crc kubenswrapper[4747]: E1205 20:43:46.839689 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:47 crc kubenswrapper[4747]: I1205 20:43:47.838844 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:47 crc kubenswrapper[4747]: I1205 20:43:47.838980 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:47 crc kubenswrapper[4747]: E1205 20:43:47.839043 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:47 crc kubenswrapper[4747]: I1205 20:43:47.839162 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:47 crc kubenswrapper[4747]: E1205 20:43:47.839220 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:47 crc kubenswrapper[4747]: E1205 20:43:47.839417 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:48 crc kubenswrapper[4747]: I1205 20:43:48.839429 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:48 crc kubenswrapper[4747]: E1205 20:43:48.839633 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:49 crc kubenswrapper[4747]: I1205 20:43:49.838926 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:49 crc kubenswrapper[4747]: I1205 20:43:49.838975 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:49 crc kubenswrapper[4747]: E1205 20:43:49.841329 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:49 crc kubenswrapper[4747]: E1205 20:43:49.841756 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:49 crc kubenswrapper[4747]: I1205 20:43:49.841368 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:49 crc kubenswrapper[4747]: E1205 20:43:49.842901 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:49 crc kubenswrapper[4747]: I1205 20:43:49.859978 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=8.85995012 podStartE2EDuration="8.85995012s" podCreationTimestamp="2025-12-05 20:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:43:49.858535256 +0000 UTC m=+100.325842814" watchObservedRunningTime="2025-12-05 20:43:49.85995012 +0000 UTC m=+100.327257638" Dec 05 20:43:50 crc kubenswrapper[4747]: I1205 20:43:50.838767 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:50 crc kubenswrapper[4747]: E1205 20:43:50.839280 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:51 crc kubenswrapper[4747]: I1205 20:43:51.839181 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:51 crc kubenswrapper[4747]: I1205 20:43:51.839205 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:51 crc kubenswrapper[4747]: I1205 20:43:51.839625 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:51 crc kubenswrapper[4747]: E1205 20:43:51.839720 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:51 crc kubenswrapper[4747]: E1205 20:43:51.839940 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:51 crc kubenswrapper[4747]: E1205 20:43:51.840193 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:52 crc kubenswrapper[4747]: I1205 20:43:52.839536 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:52 crc kubenswrapper[4747]: E1205 20:43:52.839985 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:53 crc kubenswrapper[4747]: I1205 20:43:53.839681 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:53 crc kubenswrapper[4747]: I1205 20:43:53.839759 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:53 crc kubenswrapper[4747]: E1205 20:43:53.839825 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:53 crc kubenswrapper[4747]: E1205 20:43:53.839982 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:53 crc kubenswrapper[4747]: I1205 20:43:53.839699 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:53 crc kubenswrapper[4747]: E1205 20:43:53.840118 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:54 crc kubenswrapper[4747]: I1205 20:43:54.839494 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:54 crc kubenswrapper[4747]: E1205 20:43:54.839785 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:54 crc kubenswrapper[4747]: I1205 20:43:54.841166 4747 scope.go:117] "RemoveContainer" containerID="dd7d05384fe0bfefedb945df13f483fc5367a892cfff74488f7eb9ffc7116a7b" Dec 05 20:43:54 crc kubenswrapper[4747]: E1205 20:43:54.841472 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kf4wd_openshift-ovn-kubernetes(4881e707-c00c-4e49-8e30-a17719e80915)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" Dec 05 20:43:55 crc kubenswrapper[4747]: I1205 20:43:55.839682 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:55 crc kubenswrapper[4747]: I1205 20:43:55.839780 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:55 crc kubenswrapper[4747]: E1205 20:43:55.839990 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:55 crc kubenswrapper[4747]: E1205 20:43:55.840085 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:55 crc kubenswrapper[4747]: I1205 20:43:55.839744 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:55 crc kubenswrapper[4747]: E1205 20:43:55.840938 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:56 crc kubenswrapper[4747]: I1205 20:43:56.839443 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:56 crc kubenswrapper[4747]: E1205 20:43:56.839716 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:57 crc kubenswrapper[4747]: I1205 20:43:57.839344 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:57 crc kubenswrapper[4747]: I1205 20:43:57.839385 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:57 crc kubenswrapper[4747]: E1205 20:43:57.839481 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:57 crc kubenswrapper[4747]: E1205 20:43:57.839671 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:57 crc kubenswrapper[4747]: I1205 20:43:57.840660 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:57 crc kubenswrapper[4747]: E1205 20:43:57.840958 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:43:58 crc kubenswrapper[4747]: I1205 20:43:58.838853 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:43:58 crc kubenswrapper[4747]: E1205 20:43:58.839290 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:43:59 crc kubenswrapper[4747]: I1205 20:43:59.839487 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:43:59 crc kubenswrapper[4747]: I1205 20:43:59.839556 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:43:59 crc kubenswrapper[4747]: E1205 20:43:59.840861 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:43:59 crc kubenswrapper[4747]: I1205 20:43:59.841093 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:43:59 crc kubenswrapper[4747]: E1205 20:43:59.841161 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:43:59 crc kubenswrapper[4747]: E1205 20:43:59.841314 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:44:00 crc kubenswrapper[4747]: I1205 20:44:00.839748 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:44:00 crc kubenswrapper[4747]: E1205 20:44:00.839939 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:44:01 crc kubenswrapper[4747]: I1205 20:44:01.839752 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:44:01 crc kubenswrapper[4747]: I1205 20:44:01.839859 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:44:01 crc kubenswrapper[4747]: E1205 20:44:01.839956 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:44:01 crc kubenswrapper[4747]: I1205 20:44:01.839889 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:44:01 crc kubenswrapper[4747]: E1205 20:44:01.840179 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:44:01 crc kubenswrapper[4747]: E1205 20:44:01.840319 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:44:02 crc kubenswrapper[4747]: I1205 20:44:02.478895 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nm4fd_53f1e522-a732-4821-b7b0-6f1b6670c1d4/kube-multus/1.log" Dec 05 20:44:02 crc kubenswrapper[4747]: I1205 20:44:02.480878 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nm4fd_53f1e522-a732-4821-b7b0-6f1b6670c1d4/kube-multus/0.log" Dec 05 20:44:02 crc kubenswrapper[4747]: I1205 20:44:02.480943 4747 generic.go:334] "Generic (PLEG): container finished" podID="53f1e522-a732-4821-b7b0-6f1b6670c1d4" containerID="a87e5fe95941f4d6b97fdfa08ba6fcbfb2caca21d727ec4777f0e288c9797cd5" exitCode=1 Dec 05 20:44:02 crc kubenswrapper[4747]: I1205 20:44:02.480974 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nm4fd" event={"ID":"53f1e522-a732-4821-b7b0-6f1b6670c1d4","Type":"ContainerDied","Data":"a87e5fe95941f4d6b97fdfa08ba6fcbfb2caca21d727ec4777f0e288c9797cd5"} Dec 05 20:44:02 crc kubenswrapper[4747]: I1205 20:44:02.481018 4747 scope.go:117] "RemoveContainer" containerID="b2d23f881feafa04e51af875149a61cb8ac691236339c0150a2715c9e057c22d" Dec 05 20:44:02 crc kubenswrapper[4747]: I1205 20:44:02.481622 4747 scope.go:117] "RemoveContainer" containerID="a87e5fe95941f4d6b97fdfa08ba6fcbfb2caca21d727ec4777f0e288c9797cd5" Dec 05 20:44:02 crc kubenswrapper[4747]: E1205 20:44:02.481895 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-nm4fd_openshift-multus(53f1e522-a732-4821-b7b0-6f1b6670c1d4)\"" pod="openshift-multus/multus-nm4fd" podUID="53f1e522-a732-4821-b7b0-6f1b6670c1d4" Dec 05 20:44:02 crc kubenswrapper[4747]: I1205 20:44:02.839026 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:44:02 crc kubenswrapper[4747]: E1205 20:44:02.839376 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:44:03 crc kubenswrapper[4747]: I1205 20:44:03.485843 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nm4fd_53f1e522-a732-4821-b7b0-6f1b6670c1d4/kube-multus/1.log" Dec 05 20:44:03 crc kubenswrapper[4747]: I1205 20:44:03.839190 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:44:03 crc kubenswrapper[4747]: I1205 20:44:03.839266 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:44:03 crc kubenswrapper[4747]: E1205 20:44:03.839350 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:44:03 crc kubenswrapper[4747]: I1205 20:44:03.839487 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:44:03 crc kubenswrapper[4747]: E1205 20:44:03.839541 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:44:03 crc kubenswrapper[4747]: E1205 20:44:03.839718 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:44:04 crc kubenswrapper[4747]: I1205 20:44:04.839743 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:44:04 crc kubenswrapper[4747]: E1205 20:44:04.839979 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:44:05 crc kubenswrapper[4747]: I1205 20:44:05.839503 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:44:05 crc kubenswrapper[4747]: I1205 20:44:05.839628 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:44:05 crc kubenswrapper[4747]: E1205 20:44:05.839759 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:44:05 crc kubenswrapper[4747]: I1205 20:44:05.839801 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:44:05 crc kubenswrapper[4747]: E1205 20:44:05.839948 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:44:05 crc kubenswrapper[4747]: E1205 20:44:05.840146 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:44:06 crc kubenswrapper[4747]: I1205 20:44:06.839498 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:44:06 crc kubenswrapper[4747]: E1205 20:44:06.839758 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:44:07 crc kubenswrapper[4747]: I1205 20:44:07.838850 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:44:07 crc kubenswrapper[4747]: E1205 20:44:07.839027 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:44:07 crc kubenswrapper[4747]: I1205 20:44:07.839115 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:44:07 crc kubenswrapper[4747]: I1205 20:44:07.839153 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:44:07 crc kubenswrapper[4747]: E1205 20:44:07.839498 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:44:07 crc kubenswrapper[4747]: E1205 20:44:07.839741 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:44:08 crc kubenswrapper[4747]: I1205 20:44:08.838863 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:44:08 crc kubenswrapper[4747]: E1205 20:44:08.839065 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:44:08 crc kubenswrapper[4747]: I1205 20:44:08.840141 4747 scope.go:117] "RemoveContainer" containerID="dd7d05384fe0bfefedb945df13f483fc5367a892cfff74488f7eb9ffc7116a7b" Dec 05 20:44:09 crc kubenswrapper[4747]: I1205 20:44:09.510337 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovnkube-controller/3.log" Dec 05 20:44:09 crc kubenswrapper[4747]: I1205 20:44:09.512909 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerStarted","Data":"200fa1a5e1618c9bcb5bd2cc0fd00f7b6fe607d20dc0c625cd3747514d0c0dab"} Dec 05 20:44:09 crc kubenswrapper[4747]: I1205 20:44:09.513733 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:44:09 crc kubenswrapper[4747]: I1205 20:44:09.551219 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podStartSLOduration=101.551196966 podStartE2EDuration="1m41.551196966s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:09.550735885 +0000 UTC m=+120.018043413" watchObservedRunningTime="2025-12-05 20:44:09.551196966 +0000 UTC m=+120.018504474" Dec 05 20:44:09 crc kubenswrapper[4747]: I1205 20:44:09.674709 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dcr49"] Dec 05 20:44:09 crc kubenswrapper[4747]: I1205 20:44:09.674833 4747 util.go:30] "No sandbox for pod can be found. 
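The "Observed pod startup duration" record for ovnkube-node-kf4wd can be checked by hand: podStartE2EDuration is just the observed running time minus podCreationTimestamp, and the zeroed firstStartedPulling/lastFinishedPulling values indicate that no image pull was recorded for this pod. A quick verification of that arithmetic, using timestamps copied from the record:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Values taken from the podStartSLOduration record above.
	created, _ := time.Parse(time.RFC3339, "2025-12-05T20:42:28Z")
	running, _ := time.Parse(time.RFC3339Nano, "2025-12-05T20:44:09.551196966Z")

	// watchObservedRunningTime - podCreationTimestamp = podStartE2EDuration.
	fmt.Println(running.Sub(created)) // 1m41.551196966s, as logged
}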
Dec 05 20:44:09 crc kubenswrapper[4747]: I1205 20:44:09.674833 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49"
Dec 05 20:44:09 crc kubenswrapper[4747]: E1205 20:44:09.674943 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d"
Dec 05 20:44:09 crc kubenswrapper[4747]: E1205 20:44:09.800265 4747 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Dec 05 20:44:09 crc kubenswrapper[4747]: I1205 20:44:09.839492 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:44:09 crc kubenswrapper[4747]: I1205 20:44:09.839485 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:44:09 crc kubenswrapper[4747]: E1205 20:44:09.840755 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:44:09 crc kubenswrapper[4747]: E1205 20:44:09.840865 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:44:09 crc kubenswrapper[4747]: E1205 20:44:09.948379 4747 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 20:44:10 crc kubenswrapper[4747]: I1205 20:44:10.839065 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:44:10 crc kubenswrapper[4747]: I1205 20:44:10.839123 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49"
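Both the per-pod "network is not ready" errors and the node-level kubelet.go:2916 condition reduce to one check: the runtime reports NetworkReady=false until a CNI config file appears in /etc/kubernetes/cni/net.d/, which happens once the multus container restarted above manages to write one. A minimal sketch of that readiness test, assuming the standard CNI config extensions (.conf, .conflist, .json):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig mirrors the condition behind "no CNI configuration file":
// the network plugin counts as ready once at least one CNI config file
// exists in confDir.
func hasCNIConfig(confDir string) (bool, error) {
	entries, err := os.ReadDir(confDir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ready, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	fmt.Println(ready, err)
}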
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:44:10 crc kubenswrapper[4747]: E1205 20:44:10.839364 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:44:11 crc kubenswrapper[4747]: I1205 20:44:11.839349 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:44:11 crc kubenswrapper[4747]: I1205 20:44:11.839392 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:44:11 crc kubenswrapper[4747]: E1205 20:44:11.839511 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:44:11 crc kubenswrapper[4747]: E1205 20:44:11.839701 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:44:12 crc kubenswrapper[4747]: I1205 20:44:12.839008 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:44:12 crc kubenswrapper[4747]: I1205 20:44:12.839063 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:44:12 crc kubenswrapper[4747]: E1205 20:44:12.839165 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:44:12 crc kubenswrapper[4747]: E1205 20:44:12.839351 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:44:13 crc kubenswrapper[4747]: I1205 20:44:13.839797 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:44:13 crc kubenswrapper[4747]: E1205 20:44:13.840019 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:44:13 crc kubenswrapper[4747]: I1205 20:44:13.840879 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:44:13 crc kubenswrapper[4747]: E1205 20:44:13.841100 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:44:14 crc kubenswrapper[4747]: I1205 20:44:14.839093 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:44:14 crc kubenswrapper[4747]: I1205 20:44:14.839099 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:44:14 crc kubenswrapper[4747]: E1205 20:44:14.839396 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:44:14 crc kubenswrapper[4747]: E1205 20:44:14.839729 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:44:14 crc kubenswrapper[4747]: E1205 20:44:14.949679 4747 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 20:44:15 crc kubenswrapper[4747]: I1205 20:44:15.462315 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:44:15 crc kubenswrapper[4747]: I1205 20:44:15.838740 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:44:15 crc kubenswrapper[4747]: I1205 20:44:15.838845 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:44:15 crc kubenswrapper[4747]: E1205 20:44:15.839496 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:44:15 crc kubenswrapper[4747]: E1205 20:44:15.840020 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:44:16 crc kubenswrapper[4747]: I1205 20:44:16.839534 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:44:16 crc kubenswrapper[4747]: I1205 20:44:16.839621 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:44:16 crc kubenswrapper[4747]: E1205 20:44:16.839773 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:44:16 crc kubenswrapper[4747]: E1205 20:44:16.839920 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:44:17 crc kubenswrapper[4747]: I1205 20:44:17.838984 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:44:17 crc kubenswrapper[4747]: I1205 20:44:17.839359 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:44:17 crc kubenswrapper[4747]: E1205 20:44:17.839624 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:44:17 crc kubenswrapper[4747]: E1205 20:44:17.839658 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:44:17 crc kubenswrapper[4747]: I1205 20:44:17.840903 4747 scope.go:117] "RemoveContainer" containerID="a87e5fe95941f4d6b97fdfa08ba6fcbfb2caca21d727ec4777f0e288c9797cd5" Dec 05 20:44:18 crc kubenswrapper[4747]: I1205 20:44:18.559289 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nm4fd_53f1e522-a732-4821-b7b0-6f1b6670c1d4/kube-multus/1.log" Dec 05 20:44:18 crc kubenswrapper[4747]: I1205 20:44:18.559609 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nm4fd" event={"ID":"53f1e522-a732-4821-b7b0-6f1b6670c1d4","Type":"ContainerStarted","Data":"68d5e1b3e9e29f79534bdc6c95a06fd1691c3e8318cb8a47d6898f2b2d4a9355"} Dec 05 20:44:18 crc kubenswrapper[4747]: I1205 20:44:18.838861 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:44:18 crc kubenswrapper[4747]: E1205 20:44:18.839098 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:44:18 crc kubenswrapper[4747]: I1205 20:44:18.839455 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:44:18 crc kubenswrapper[4747]: E1205 20:44:18.839631 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:44:19 crc kubenswrapper[4747]: I1205 20:44:19.839001 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:44:19 crc kubenswrapper[4747]: E1205 20:44:19.840670 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:44:19 crc kubenswrapper[4747]: I1205 20:44:19.840719 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:44:19 crc kubenswrapper[4747]: E1205 20:44:19.840894 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:44:19 crc kubenswrapper[4747]: E1205 20:44:19.951278 4747 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 20:44:20 crc kubenswrapper[4747]: I1205 20:44:20.838734 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:44:20 crc kubenswrapper[4747]: I1205 20:44:20.838765 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:44:20 crc kubenswrapper[4747]: E1205 20:44:20.838965 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:44:20 crc kubenswrapper[4747]: E1205 20:44:20.839005 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:44:21 crc kubenswrapper[4747]: I1205 20:44:21.839831 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:44:21 crc kubenswrapper[4747]: I1205 20:44:21.839922 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:44:21 crc kubenswrapper[4747]: E1205 20:44:21.840022 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:44:21 crc kubenswrapper[4747]: E1205 20:44:21.840110 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:44:22 crc kubenswrapper[4747]: I1205 20:44:22.839625 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:44:22 crc kubenswrapper[4747]: I1205 20:44:22.839632 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:44:22 crc kubenswrapper[4747]: E1205 20:44:22.839807 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:44:22 crc kubenswrapper[4747]: E1205 20:44:22.839934 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:44:23 crc kubenswrapper[4747]: I1205 20:44:23.838900 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:44:23 crc kubenswrapper[4747]: E1205 20:44:23.839060 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:44:23 crc kubenswrapper[4747]: I1205 20:44:23.839100 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:44:23 crc kubenswrapper[4747]: E1205 20:44:23.839323 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:44:24 crc kubenswrapper[4747]: I1205 20:44:24.839699 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:44:24 crc kubenswrapper[4747]: I1205 20:44:24.839801 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:44:24 crc kubenswrapper[4747]: E1205 20:44:24.839921 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:44:24 crc kubenswrapper[4747]: E1205 20:44:24.840007 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dcr49" podUID="1e860ee9-69f5-44a1-b414-deab4f78dd0d" Dec 05 20:44:25 crc kubenswrapper[4747]: I1205 20:44:25.839619 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:44:25 crc kubenswrapper[4747]: I1205 20:44:25.840105 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:44:25 crc kubenswrapper[4747]: I1205 20:44:25.843336 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 20:44:25 crc kubenswrapper[4747]: I1205 20:44:25.844417 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 20:44:25 crc kubenswrapper[4747]: I1205 20:44:25.844825 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 20:44:25 crc kubenswrapper[4747]: I1205 20:44:25.844990 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 20:44:26 crc kubenswrapper[4747]: I1205 20:44:26.839518 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49" Dec 05 20:44:26 crc kubenswrapper[4747]: I1205 20:44:26.839656 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:44:26 crc kubenswrapper[4747]: I1205 20:44:26.843004 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 20:44:26 crc kubenswrapper[4747]: I1205 20:44:26.843620 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.341144 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.395543 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.396436 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.396948 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.397747 4747 util.go:30] "No sandbox for pod can be found. 
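With the node now posting NodeReady, the scheduler can bind the pending control-plane pods, and each binding reaches the kubelet as a "SyncLoop ADD" on its API-server watch (source="api"). A minimal client-go illustration of such a node-scoped pod watch follows; the in-cluster config and the node name "crc" are assumptions for the sketch, not the kubelet's actual wiring:

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig() // assumed credentials for the sketch
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Watch only pods bound to this node, like the kubelet's api pod source.
	w, err := cs.CoreV1().Pods("").Watch(context.TODO(), metav1.ListOptions{
		FieldSelector: "spec.nodeName=crc",
	})
	if err != nil {
		panic(err)
	}
	for ev := range w.ResultChan() {
		// ADDED/MODIFIED events map onto the SyncLoop ADD/UPDATE records.
		fmt.Println(ev.Type)
	}
}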
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.397747 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.398237 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sntqs"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.398818 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.416652 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.417962 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.418107 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.418568 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-f8pt2"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.419317 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-f8pt2"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.421739 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.422739 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h25fx"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.423801 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h25fx"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.424245 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.424770 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cdhgg"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.425778 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.438297 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qt2mg"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.439446 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.441057 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-hrsvp"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.450340 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.450532 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.452541 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.452099 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.454730 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.454902 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-hrsvp"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.455326 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.455466 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.455764 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.455930 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.456116 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.456129 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.456268 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.456431 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.456294 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.456704 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.456781 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.456897 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.457007 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.457672 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.457800 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.457881 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.457883 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.457942 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.458006 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.458066 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.458531 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.458678 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.458775 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.458861 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.459041 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.459329 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.459708 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.460149 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.460380 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.460532 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.460653 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.460784 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.461180 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.461328 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.463147 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.463320 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.463507 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.463532 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fkw6v"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.463803 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.463936 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.464039 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.464149 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.464451 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.464623 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fkw6v"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.464711 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.464739 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.465476 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.466184 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.467482 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.467538 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.467502 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.467742 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.467854 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.468036 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.469382 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rb4b9"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.469916 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rb4b9"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.472404 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.472981 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.474196 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sntqs"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.474524 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.475961 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.476374 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cdhgg"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.480284 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fkw6v"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.483722 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hrsvp"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.484016 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.489658 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qt2mg"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.490316 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.490454 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.491534 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.491760 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.491857 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.491981 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.492123 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.492296 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.492368 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.491997 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.492308 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.492607 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-f8pt2"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.492756 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.492857 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.492886 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.492944 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.493239 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.495820 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.496482 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.498116 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.498564 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.500703 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.500925 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.501642 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.502972 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.509877 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.510394 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.513470 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.515378 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rb4b9"]
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.517871 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.518269 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/82d08f0b-8d21-4b31-b88f-44d5578c1f03-encryption-config\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.518304 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-audit-policies\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.518326 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0531af2e-7c38-4f8e-8ca3-0c4ac5148059-available-featuregates\") pod \"openshift-config-operator-7777fb866f-f8hsv\" (UID: \"0531af2e-7c38-4f8e-8ca3-0c4ac5148059\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.518347 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ee658b1-3098-41d3-89c1-eec71d92d82e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2"
I1205 20:44:31.518364 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.518385 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/727707b0-2824-4736-afed-38b1efdb98de-serving-cert\") pod \"console-operator-58897d9998-fkw6v\" (UID: \"727707b0-2824-4736-afed-38b1efdb98de\") " pod="openshift-console-operator/console-operator-58897d9998-fkw6v" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.518401 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82d08f0b-8d21-4b31-b88f-44d5578c1f03-audit-dir\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.518420 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dab4218f-b06c-473f-8882-5f207a79f403-console-serving-cert\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.519234 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66dbz\" (UniqueName: \"kubernetes.io/projected/dab4218f-b06c-473f-8882-5f207a79f403-kube-api-access-66dbz\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.519281 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.519307 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0531af2e-7c38-4f8e-8ca3-0c4ac5148059-serving-cert\") pod \"openshift-config-operator-7777fb866f-f8hsv\" (UID: \"0531af2e-7c38-4f8e-8ca3-0c4ac5148059\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.519330 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ee658b1-3098-41d3-89c1-eec71d92d82e-serving-cert\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.519371 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82d08f0b-8d21-4b31-b88f-44d5578c1f03-etcd-client\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.519447 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.519467 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptqpf\" (UniqueName: \"kubernetes.io/projected/0531af2e-7c38-4f8e-8ca3-0c4ac5148059-kube-api-access-ptqpf\") pod \"openshift-config-operator-7777fb866f-f8hsv\" (UID: \"0531af2e-7c38-4f8e-8ca3-0c4ac5148059\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.519485 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-config\") pod \"controller-manager-879f6c89f-sntqs\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520363 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb8mz\" (UniqueName: \"kubernetes.io/projected/727707b0-2824-4736-afed-38b1efdb98de-kube-api-access-hb8mz\") pod \"console-operator-58897d9998-fkw6v\" (UID: \"727707b0-2824-4736-afed-38b1efdb98de\") " pod="openshift-console-operator/console-operator-58897d9998-fkw6v" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520386 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dab4218f-b06c-473f-8882-5f207a79f403-console-oauth-config\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520403 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5616bfbc-527c-4c99-b78e-7568c26ca4bb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cxdgd\" (UID: \"5616bfbc-527c-4c99-b78e-7568c26ca4bb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520425 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a-config\") pod \"machine-approver-56656f9798-pldz6\" (UID: \"d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520442 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/3ee658b1-3098-41d3-89c1-eec71d92d82e-encryption-config\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520459 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1018779a-353b-4530-84e1-b52a044d69d5-client-ca\") pod \"route-controller-manager-6576b87f9c-l4dc5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520478 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5616bfbc-527c-4c99-b78e-7568c26ca4bb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cxdgd\" (UID: \"5616bfbc-527c-4c99-b78e-7568c26ca4bb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520494 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ee658b1-3098-41d3-89c1-eec71d92d82e-etcd-client\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520514 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a-machine-approver-tls\") pod \"machine-approver-56656f9798-pldz6\" (UID: \"d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520520 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520556 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h25fx"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520529 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3ee658b1-3098-41d3-89c1-eec71d92d82e-node-pullsecrets\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520616 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/82d08f0b-8d21-4b31-b88f-44d5578c1f03-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520642 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520668 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dssq2\" (UniqueName: \"kubernetes.io/projected/5616bfbc-527c-4c99-b78e-7568c26ca4bb-kube-api-access-dssq2\") pod \"cluster-image-registry-operator-dc59b4c8b-cxdgd\" (UID: \"5616bfbc-527c-4c99-b78e-7568c26ca4bb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520688 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1aa46678-7bb8-4017-b1c2-b61dd951ffd4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h25fx\" (UID: \"1aa46678-7bb8-4017-b1c2-b61dd951ffd4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h25fx" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520741 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1018779a-353b-4530-84e1-b52a044d69d5-serving-cert\") pod \"route-controller-manager-6576b87f9c-l4dc5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520759 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-oauth-serving-cert\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520792 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvpgm\" (UniqueName: \"kubernetes.io/projected/82d08f0b-8d21-4b31-b88f-44d5578c1f03-kube-api-access-zvpgm\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520825 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sntqs\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520854 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcvm6\" (UniqueName: \"kubernetes.io/projected/1aa46678-7bb8-4017-b1c2-b61dd951ffd4-kube-api-access-vcvm6\") pod \"cluster-samples-operator-665b6dd947-h25fx\" (UID: \"1aa46678-7bb8-4017-b1c2-b61dd951ffd4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h25fx" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520881 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1840e48c-ee91-48d2-8ddb-34f24dde58ff-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cdhgg\" (UID: \"1840e48c-ee91-48d2-8ddb-34f24dde58ff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520900 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520927 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a-auth-proxy-config\") pod \"machine-approver-56656f9798-pldz6\" (UID: \"d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520953 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8b9c\" (UniqueName: \"kubernetes.io/projected/d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a-kube-api-access-d8b9c\") pod \"machine-approver-56656f9798-pldz6\" (UID: \"d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520970 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.520988 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28rdj\" (UniqueName: \"kubernetes.io/projected/ae41eabe-e336-4ef9-9f65-022996a62860-kube-api-access-28rdj\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521009 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-console-config\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521027 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82d08f0b-8d21-4b31-b88f-44d5578c1f03-audit-policies\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521077 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521102 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3ee658b1-3098-41d3-89c1-eec71d92d82e-audit\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521135 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521156 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxplv\" (UniqueName: \"kubernetes.io/projected/3ee658b1-3098-41d3-89c1-eec71d92d82e-kube-api-access-hxplv\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521175 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521197 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/727707b0-2824-4736-afed-38b1efdb98de-config\") pod \"console-operator-58897d9998-fkw6v\" (UID: \"727707b0-2824-4736-afed-38b1efdb98de\") " pod="openshift-console-operator/console-operator-58897d9998-fkw6v" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521215 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82d08f0b-8d21-4b31-b88f-44d5578c1f03-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521239 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1840e48c-ee91-48d2-8ddb-34f24dde58ff-images\") pod \"machine-api-operator-5694c8668f-cdhgg\" (UID: \"1840e48c-ee91-48d2-8ddb-34f24dde58ff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521257 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-btwn9\" (UniqueName: \"kubernetes.io/projected/1840e48c-ee91-48d2-8ddb-34f24dde58ff-kube-api-access-btwn9\") pod \"machine-api-operator-5694c8668f-cdhgg\" (UID: \"1840e48c-ee91-48d2-8ddb-34f24dde58ff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521274 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521297 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-trusted-ca-bundle\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521316 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6698\" (UniqueName: \"kubernetes.io/projected/4d7ce274-8069-41a9-949f-a2e493c170a5-kube-api-access-w6698\") pod \"downloads-7954f5f757-hrsvp\" (UID: \"4d7ce274-8069-41a9-949f-a2e493c170a5\") " pod="openshift-console/downloads-7954f5f757-hrsvp" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521334 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/727707b0-2824-4736-afed-38b1efdb98de-trusted-ca\") pod \"console-operator-58897d9998-fkw6v\" (UID: \"727707b0-2824-4736-afed-38b1efdb98de\") " pod="openshift-console-operator/console-operator-58897d9998-fkw6v" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521350 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1018779a-353b-4530-84e1-b52a044d69d5-config\") pod \"route-controller-manager-6576b87f9c-l4dc5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521367 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ae41eabe-e336-4ef9-9f65-022996a62860-audit-dir\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521384 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-service-ca\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521400 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-client-ca\") pod 
\"controller-manager-879f6c89f-sntqs\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521428 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1840e48c-ee91-48d2-8ddb-34f24dde58ff-config\") pod \"machine-api-operator-5694c8668f-cdhgg\" (UID: \"1840e48c-ee91-48d2-8ddb-34f24dde58ff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521445 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ee658b1-3098-41d3-89c1-eec71d92d82e-config\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521463 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq2fm\" (UniqueName: \"kubernetes.io/projected/ec0e9905-637c-4a71-b816-12fefbb801d2-kube-api-access-xq2fm\") pod \"controller-manager-879f6c89f-sntqs\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521484 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5616bfbc-527c-4c99-b78e-7568c26ca4bb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cxdgd\" (UID: \"5616bfbc-527c-4c99-b78e-7568c26ca4bb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521501 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3ee658b1-3098-41d3-89c1-eec71d92d82e-image-import-ca\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521525 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82d08f0b-8d21-4b31-b88f-44d5578c1f03-serving-cert\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521553 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3ee658b1-3098-41d3-89c1-eec71d92d82e-etcd-serving-ca\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521595 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3ee658b1-3098-41d3-89c1-eec71d92d82e-audit-dir\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 
20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521617 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521643 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prmgd\" (UniqueName: \"kubernetes.io/projected/1018779a-353b-4530-84e1-b52a044d69d5-kube-api-access-prmgd\") pod \"route-controller-manager-6576b87f9c-l4dc5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.521663 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec0e9905-637c-4a71-b816-12fefbb801d2-serving-cert\") pod \"controller-manager-879f6c89f-sntqs\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.522993 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622161 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a-auth-proxy-config\") pod \"machine-approver-56656f9798-pldz6\" (UID: \"d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622202 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622248 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28rdj\" (UniqueName: \"kubernetes.io/projected/ae41eabe-e336-4ef9-9f65-022996a62860-kube-api-access-28rdj\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622268 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-console-config\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622283 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8b9c\" (UniqueName: \"kubernetes.io/projected/d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a-kube-api-access-d8b9c\") pod \"machine-approver-56656f9798-pldz6\" (UID: 
\"d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622298 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82d08f0b-8d21-4b31-b88f-44d5578c1f03-audit-policies\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622316 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622351 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3ee658b1-3098-41d3-89c1-eec71d92d82e-audit\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622384 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622400 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxplv\" (UniqueName: \"kubernetes.io/projected/3ee658b1-3098-41d3-89c1-eec71d92d82e-kube-api-access-hxplv\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622420 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622436 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/727707b0-2824-4736-afed-38b1efdb98de-config\") pod \"console-operator-58897d9998-fkw6v\" (UID: \"727707b0-2824-4736-afed-38b1efdb98de\") " pod="openshift-console-operator/console-operator-58897d9998-fkw6v" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622451 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82d08f0b-8d21-4b31-b88f-44d5578c1f03-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622467 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1840e48c-ee91-48d2-8ddb-34f24dde58ff-images\") pod \"machine-api-operator-5694c8668f-cdhgg\" (UID: \"1840e48c-ee91-48d2-8ddb-34f24dde58ff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622484 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btwn9\" (UniqueName: \"kubernetes.io/projected/1840e48c-ee91-48d2-8ddb-34f24dde58ff-kube-api-access-btwn9\") pod \"machine-api-operator-5694c8668f-cdhgg\" (UID: \"1840e48c-ee91-48d2-8ddb-34f24dde58ff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622501 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622516 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-trusted-ca-bundle\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622533 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6698\" (UniqueName: \"kubernetes.io/projected/4d7ce274-8069-41a9-949f-a2e493c170a5-kube-api-access-w6698\") pod \"downloads-7954f5f757-hrsvp\" (UID: \"4d7ce274-8069-41a9-949f-a2e493c170a5\") " pod="openshift-console/downloads-7954f5f757-hrsvp" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622550 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1018779a-353b-4530-84e1-b52a044d69d5-config\") pod \"route-controller-manager-6576b87f9c-l4dc5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622572 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ae41eabe-e336-4ef9-9f65-022996a62860-audit-dir\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622604 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-service-ca\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622622 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-client-ca\") pod \"controller-manager-879f6c89f-sntqs\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622638 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/727707b0-2824-4736-afed-38b1efdb98de-trusted-ca\") pod \"console-operator-58897d9998-fkw6v\" (UID: \"727707b0-2824-4736-afed-38b1efdb98de\") " pod="openshift-console-operator/console-operator-58897d9998-fkw6v" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622664 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1840e48c-ee91-48d2-8ddb-34f24dde58ff-config\") pod \"machine-api-operator-5694c8668f-cdhgg\" (UID: \"1840e48c-ee91-48d2-8ddb-34f24dde58ff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622684 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ee658b1-3098-41d3-89c1-eec71d92d82e-config\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622700 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq2fm\" (UniqueName: \"kubernetes.io/projected/ec0e9905-637c-4a71-b816-12fefbb801d2-kube-api-access-xq2fm\") pod \"controller-manager-879f6c89f-sntqs\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622718 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5616bfbc-527c-4c99-b78e-7568c26ca4bb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cxdgd\" (UID: \"5616bfbc-527c-4c99-b78e-7568c26ca4bb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622750 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3ee658b1-3098-41d3-89c1-eec71d92d82e-image-import-ca\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622778 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3ee658b1-3098-41d3-89c1-eec71d92d82e-etcd-serving-ca\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622798 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82d08f0b-8d21-4b31-b88f-44d5578c1f03-serving-cert\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622821 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/3ee658b1-3098-41d3-89c1-eec71d92d82e-audit-dir\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622836 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a-auth-proxy-config\") pod \"machine-approver-56656f9798-pldz6\" (UID: \"d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622843 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622905 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prmgd\" (UniqueName: \"kubernetes.io/projected/1018779a-353b-4530-84e1-b52a044d69d5-kube-api-access-prmgd\") pod \"route-controller-manager-6576b87f9c-l4dc5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622929 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec0e9905-637c-4a71-b816-12fefbb801d2-serving-cert\") pod \"controller-manager-879f6c89f-sntqs\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622955 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/82d08f0b-8d21-4b31-b88f-44d5578c1f03-encryption-config\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622972 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-audit-policies\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.622990 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0531af2e-7c38-4f8e-8ca3-0c4ac5148059-available-featuregates\") pod \"openshift-config-operator-7777fb866f-f8hsv\" (UID: \"0531af2e-7c38-4f8e-8ca3-0c4ac5148059\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623011 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ee658b1-3098-41d3-89c1-eec71d92d82e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " 
pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623026 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623043 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/727707b0-2824-4736-afed-38b1efdb98de-serving-cert\") pod \"console-operator-58897d9998-fkw6v\" (UID: \"727707b0-2824-4736-afed-38b1efdb98de\") " pod="openshift-console-operator/console-operator-58897d9998-fkw6v" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623058 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82d08f0b-8d21-4b31-b88f-44d5578c1f03-audit-dir\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623075 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dab4218f-b06c-473f-8882-5f207a79f403-console-serving-cert\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623091 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66dbz\" (UniqueName: \"kubernetes.io/projected/dab4218f-b06c-473f-8882-5f207a79f403-kube-api-access-66dbz\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623110 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ee658b1-3098-41d3-89c1-eec71d92d82e-serving-cert\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623127 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82d08f0b-8d21-4b31-b88f-44d5578c1f03-etcd-client\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623142 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623157 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623172 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0531af2e-7c38-4f8e-8ca3-0c4ac5148059-serving-cert\") pod \"openshift-config-operator-7777fb866f-f8hsv\" (UID: \"0531af2e-7c38-4f8e-8ca3-0c4ac5148059\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623187 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-config\") pod \"controller-manager-879f6c89f-sntqs\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623203 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptqpf\" (UniqueName: \"kubernetes.io/projected/0531af2e-7c38-4f8e-8ca3-0c4ac5148059-kube-api-access-ptqpf\") pod \"openshift-config-operator-7777fb866f-f8hsv\" (UID: \"0531af2e-7c38-4f8e-8ca3-0c4ac5148059\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623220 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb8mz\" (UniqueName: \"kubernetes.io/projected/727707b0-2824-4736-afed-38b1efdb98de-kube-api-access-hb8mz\") pod \"console-operator-58897d9998-fkw6v\" (UID: \"727707b0-2824-4736-afed-38b1efdb98de\") " pod="openshift-console-operator/console-operator-58897d9998-fkw6v" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623236 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a-config\") pod \"machine-approver-56656f9798-pldz6\" (UID: \"d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623251 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3ee658b1-3098-41d3-89c1-eec71d92d82e-encryption-config\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623267 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1018779a-353b-4530-84e1-b52a044d69d5-client-ca\") pod \"route-controller-manager-6576b87f9c-l4dc5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623286 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dab4218f-b06c-473f-8882-5f207a79f403-console-oauth-config\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " 
pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623304 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5616bfbc-527c-4c99-b78e-7568c26ca4bb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cxdgd\" (UID: \"5616bfbc-527c-4c99-b78e-7568c26ca4bb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623322 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5616bfbc-527c-4c99-b78e-7568c26ca4bb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cxdgd\" (UID: \"5616bfbc-527c-4c99-b78e-7568c26ca4bb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623362 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a-machine-approver-tls\") pod \"machine-approver-56656f9798-pldz6\" (UID: \"d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623380 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3ee658b1-3098-41d3-89c1-eec71d92d82e-node-pullsecrets\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623395 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ee658b1-3098-41d3-89c1-eec71d92d82e-etcd-client\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623414 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/82d08f0b-8d21-4b31-b88f-44d5578c1f03-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623431 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623453 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dssq2\" (UniqueName: \"kubernetes.io/projected/5616bfbc-527c-4c99-b78e-7568c26ca4bb-kube-api-access-dssq2\") pod \"cluster-image-registry-operator-dc59b4c8b-cxdgd\" (UID: \"5616bfbc-527c-4c99-b78e-7568c26ca4bb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623469 
4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1aa46678-7bb8-4017-b1c2-b61dd951ffd4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h25fx\" (UID: \"1aa46678-7bb8-4017-b1c2-b61dd951ffd4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h25fx" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623484 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1018779a-353b-4530-84e1-b52a044d69d5-serving-cert\") pod \"route-controller-manager-6576b87f9c-l4dc5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623510 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-oauth-serving-cert\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623528 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvpgm\" (UniqueName: \"kubernetes.io/projected/82d08f0b-8d21-4b31-b88f-44d5578c1f03-kube-api-access-zvpgm\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623560 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sntqs\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623592 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1840e48c-ee91-48d2-8ddb-34f24dde58ff-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cdhgg\" (UID: \"1840e48c-ee91-48d2-8ddb-34f24dde58ff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623610 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcvm6\" (UniqueName: \"kubernetes.io/projected/1aa46678-7bb8-4017-b1c2-b61dd951ffd4-kube-api-access-vcvm6\") pod \"cluster-samples-operator-665b6dd947-h25fx\" (UID: \"1aa46678-7bb8-4017-b1c2-b61dd951ffd4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h25fx" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.623628 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.625442 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.625899 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-trusted-ca-bundle\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.625961 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-oauth-serving-cert\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.626054 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ae41eabe-e336-4ef9-9f65-022996a62860-audit-dir\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.626104 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1840e48c-ee91-48d2-8ddb-34f24dde58ff-config\") pod \"machine-api-operator-5694c8668f-cdhgg\" (UID: \"1840e48c-ee91-48d2-8ddb-34f24dde58ff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.626545 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ee658b1-3098-41d3-89c1-eec71d92d82e-config\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.626715 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-service-ca\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.627521 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-client-ca\") pod \"controller-manager-879f6c89f-sntqs\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.627649 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sntqs\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.628562 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/3ee658b1-3098-41d3-89c1-eec71d92d82e-image-import-ca\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.628712 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/727707b0-2824-4736-afed-38b1efdb98de-trusted-ca\") pod \"console-operator-58897d9998-fkw6v\" (UID: \"727707b0-2824-4736-afed-38b1efdb98de\") " pod="openshift-console-operator/console-operator-58897d9998-fkw6v" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.629070 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3ee658b1-3098-41d3-89c1-eec71d92d82e-etcd-serving-ca\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.629118 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3ee658b1-3098-41d3-89c1-eec71d92d82e-audit-dir\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.630119 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.656514 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a-config\") pod \"machine-approver-56656f9798-pldz6\" (UID: \"d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.656600 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-console-config\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.657195 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/82d08f0b-8d21-4b31-b88f-44d5578c1f03-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.657389 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.657417 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3ee658b1-3098-41d3-89c1-eec71d92d82e-node-pullsecrets\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.657742 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.658214 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a-machine-approver-tls\") pod \"machine-approver-56656f9798-pldz6\" (UID: \"d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.658541 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.659017 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5616bfbc-527c-4c99-b78e-7568c26ca4bb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cxdgd\" (UID: \"5616bfbc-527c-4c99-b78e-7568c26ca4bb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.630380 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1840e48c-ee91-48d2-8ddb-34f24dde58ff-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cdhgg\" (UID: \"1840e48c-ee91-48d2-8ddb-34f24dde58ff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.665861 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec0e9905-637c-4a71-b816-12fefbb801d2-serving-cert\") pod \"controller-manager-879f6c89f-sntqs\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.666176 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0531af2e-7c38-4f8e-8ca3-0c4ac5148059-serving-cert\") pod \"openshift-config-operator-7777fb866f-f8hsv\" (UID: \"0531af2e-7c38-4f8e-8ca3-0c4ac5148059\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.666342 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5616bfbc-527c-4c99-b78e-7568c26ca4bb-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-cxdgd\" (UID: \"5616bfbc-527c-4c99-b78e-7568c26ca4bb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.666706 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1018779a-353b-4530-84e1-b52a044d69d5-serving-cert\") pod \"route-controller-manager-6576b87f9c-l4dc5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.667192 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.667758 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-config\") pod \"controller-manager-879f6c89f-sntqs\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.668017 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1aa46678-7bb8-4017-b1c2-b61dd951ffd4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h25fx\" (UID: \"1aa46678-7bb8-4017-b1c2-b61dd951ffd4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h25fx" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.668438 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.669241 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dab4218f-b06c-473f-8882-5f207a79f403-console-serving-cert\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.671067 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ee658b1-3098-41d3-89c1-eec71d92d82e-serving-cert\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.671205 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0531af2e-7c38-4f8e-8ca3-0c4ac5148059-available-featuregates\") pod \"openshift-config-operator-7777fb866f-f8hsv\" (UID: \"0531af2e-7c38-4f8e-8ca3-0c4ac5148059\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv" Dec 05 20:44:31 crc 
kubenswrapper[4747]: I1205 20:44:31.671556 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1018779a-353b-4530-84e1-b52a044d69d5-config\") pod \"route-controller-manager-6576b87f9c-l4dc5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.672135 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82d08f0b-8d21-4b31-b88f-44d5578c1f03-audit-policies\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.672200 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82d08f0b-8d21-4b31-b88f-44d5578c1f03-audit-dir\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.672358 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ee658b1-3098-41d3-89c1-eec71d92d82e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.677615 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/727707b0-2824-4736-afed-38b1efdb98de-config\") pod \"console-operator-58897d9998-fkw6v\" (UID: \"727707b0-2824-4736-afed-38b1efdb98de\") " pod="openshift-console-operator/console-operator-58897d9998-fkw6v" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.678049 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82d08f0b-8d21-4b31-b88f-44d5578c1f03-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.679202 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1840e48c-ee91-48d2-8ddb-34f24dde58ff-images\") pod \"machine-api-operator-5694c8668f-cdhgg\" (UID: \"1840e48c-ee91-48d2-8ddb-34f24dde58ff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.680015 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1018779a-353b-4530-84e1-b52a044d69d5-client-ca\") pod \"route-controller-manager-6576b87f9c-l4dc5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.681194 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.681686 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-audit-policies\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.683291 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3ee658b1-3098-41d3-89c1-eec71d92d82e-audit\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.702382 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3ee658b1-3098-41d3-89c1-eec71d92d82e-encryption-config\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.705480 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptqpf\" (UniqueName: \"kubernetes.io/projected/0531af2e-7c38-4f8e-8ca3-0c4ac5148059-kube-api-access-ptqpf\") pod \"openshift-config-operator-7777fb866f-f8hsv\" (UID: \"0531af2e-7c38-4f8e-8ca3-0c4ac5148059\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.706023 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb8mz\" (UniqueName: \"kubernetes.io/projected/727707b0-2824-4736-afed-38b1efdb98de-kube-api-access-hb8mz\") pod \"console-operator-58897d9998-fkw6v\" (UID: \"727707b0-2824-4736-afed-38b1efdb98de\") " pod="openshift-console-operator/console-operator-58897d9998-fkw6v" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.708016 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.709024 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28rdj\" (UniqueName: \"kubernetes.io/projected/ae41eabe-e336-4ef9-9f65-022996a62860-kube-api-access-28rdj\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.710034 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dssq2\" (UniqueName: \"kubernetes.io/projected/5616bfbc-527c-4c99-b78e-7568c26ca4bb-kube-api-access-dssq2\") pod \"cluster-image-registry-operator-dc59b4c8b-cxdgd\" (UID: \"5616bfbc-527c-4c99-b78e-7568c26ca4bb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.711961 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.714836 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82d08f0b-8d21-4b31-b88f-44d5578c1f03-etcd-client\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.715125 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dab4218f-b06c-473f-8882-5f207a79f403-console-oauth-config\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.715149 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66dbz\" (UniqueName: \"kubernetes.io/projected/dab4218f-b06c-473f-8882-5f207a79f403-kube-api-access-66dbz\") pod \"console-f9d7485db-rb4b9\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.715824 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prmgd\" (UniqueName: \"kubernetes.io/projected/1018779a-353b-4530-84e1-b52a044d69d5-kube-api-access-prmgd\") pod \"route-controller-manager-6576b87f9c-l4dc5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.715913 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvpgm\" (UniqueName: \"kubernetes.io/projected/82d08f0b-8d21-4b31-b88f-44d5578c1f03-kube-api-access-zvpgm\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.716215 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.716246 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5616bfbc-527c-4c99-b78e-7568c26ca4bb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cxdgd\" (UID: \"5616bfbc-527c-4c99-b78e-7568c26ca4bb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.718978 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mgkmr"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.721329 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.721501 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/727707b0-2824-4736-afed-38b1efdb98de-serving-cert\") pod \"console-operator-58897d9998-fkw6v\" (UID: \"727707b0-2824-4736-afed-38b1efdb98de\") " pod="openshift-console-operator/console-operator-58897d9998-fkw6v" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.722241 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ee658b1-3098-41d3-89c1-eec71d92d82e-etcd-client\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.722279 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.722768 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq2fm\" (UniqueName: \"kubernetes.io/projected/ec0e9905-637c-4a71-b816-12fefbb801d2-kube-api-access-xq2fm\") pod \"controller-manager-879f6c89f-sntqs\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.723619 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcvm6\" (UniqueName: \"kubernetes.io/projected/1aa46678-7bb8-4017-b1c2-b61dd951ffd4-kube-api-access-vcvm6\") pod \"cluster-samples-operator-665b6dd947-h25fx\" (UID: \"1aa46678-7bb8-4017-b1c2-b61dd951ffd4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h25fx" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.724148 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xx4lj"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.724209 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.725218 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82d08f0b-8d21-4b31-b88f-44d5578c1f03-serving-cert\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.739760 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xx4lj" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.740868 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6698\" (UniqueName: \"kubernetes.io/projected/4d7ce274-8069-41a9-949f-a2e493c170a5-kube-api-access-w6698\") pod \"downloads-7954f5f757-hrsvp\" (UID: \"4d7ce274-8069-41a9-949f-a2e493c170a5\") " pod="openshift-console/downloads-7954f5f757-hrsvp" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.741326 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/82d08f0b-8d21-4b31-b88f-44d5578c1f03-encryption-config\") pod \"apiserver-7bbb656c7d-hsm2r\" (UID: \"82d08f0b-8d21-4b31-b88f-44d5578c1f03\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.741404 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qt2mg\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.741472 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9jq4j"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.742056 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.742169 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btwn9\" (UniqueName: \"kubernetes.io/projected/1840e48c-ee91-48d2-8ddb-34f24dde58ff-kube-api-access-btwn9\") pod \"machine-api-operator-5694c8668f-cdhgg\" (UID: \"1840e48c-ee91-48d2-8ddb-34f24dde58ff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.743880 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.744594 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.747232 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.747866 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.748568 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.749006 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.751695 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d88cp"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.752300 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.753453 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xxxxf"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.760684 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.762079 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-d88cp" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.762336 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.763824 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8b9c\" (UniqueName: \"kubernetes.io/projected/d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a-kube-api-access-d8b9c\") pod \"machine-approver-56656f9798-pldz6\" (UID: \"d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.764230 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.764614 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.765056 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.772504 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-2p8p6"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.775681 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.776767 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-2p8p6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.780635 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.781233 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.781650 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzc"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.782122 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzc" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.782360 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.782490 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.785459 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.785660 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.789186 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.789463 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svs8d"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.789745 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.790070 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.790295 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.790430 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.790544 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.795335 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxplv\" (UniqueName: \"kubernetes.io/projected/3ee658b1-3098-41d3-89c1-eec71d92d82e-kube-api-access-hxplv\") pod \"apiserver-76f77b778f-f8pt2\" (UID: \"3ee658b1-3098-41d3-89c1-eec71d92d82e\") " pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.795952 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.798426 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-75r5s"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.799126 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-75r5s" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.800775 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mgkmr"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.802497 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.803111 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.803316 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h25fx" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.806205 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-v44jl"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.806979 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-v44jl" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.809153 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.809868 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.813138 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-glgf7"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.813566 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-glgf7" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.813976 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.817104 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.818140 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.820833 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.821471 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mmqsc"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.822172 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mmqsc" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.824175 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.833702 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ddtpz"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.834813 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.835429 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.836076 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-drfsj"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.836192 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.839259 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.841700 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-drfsj" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.842669 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.844838 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a225e30-e0a1-4eab-8172-890abdc72200-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mjxxq\" (UID: \"8a225e30-e0a1-4eab-8172-890abdc72200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.844873 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mgkmr\" (UID: \"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.844905 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-stats-auth\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.844981 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a225e30-e0a1-4eab-8172-890abdc72200-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mjxxq\" (UID: \"8a225e30-e0a1-4eab-8172-890abdc72200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845024 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a225e30-e0a1-4eab-8172-890abdc72200-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mjxxq\" (UID: \"8a225e30-e0a1-4eab-8172-890abdc72200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845051 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c77cec54-eef5-4a5b-acc7-395a55ec4be1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7cr96\" (UID: \"c77cec54-eef5-4a5b-acc7-395a55ec4be1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845119 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1caa95ad-0f17-4c3c-a336-2398afe5db0e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-cc2b6\" (UID: \"1caa95ad-0f17-4c3c-a336-2398afe5db0e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845145 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-metrics-certs\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845173 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-service-ca-bundle\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845206 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/699936b9-cebb-4705-8e8e-abfeae4705f6-proxy-tls\") pod \"machine-config-operator-74547568cd-s47qc\" (UID: \"699936b9-cebb-4705-8e8e-abfeae4705f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845235 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tbtb\" (UniqueName: \"kubernetes.io/projected/506d2ee8-fd84-4d9d-8d72-7253e237012a-kube-api-access-2tbtb\") pod \"openshift-apiserver-operator-796bbdcf4f-2q84t\" (UID: \"506d2ee8-fd84-4d9d-8d72-7253e237012a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845302 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx6mx\" (UniqueName: \"kubernetes.io/projected/d2003fdd-5c5a-4f60-9920-bd2e736a0f2e-kube-api-access-tx6mx\") pod \"ingress-operator-5b745b69d9-bd4xm\" (UID: \"d2003fdd-5c5a-4f60-9920-bd2e736a0f2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845329 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwkr4\" (UniqueName: \"kubernetes.io/projected/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-kube-api-access-kwkr4\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845354 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c77cec54-eef5-4a5b-acc7-395a55ec4be1-proxy-tls\") pod \"machine-config-controller-84d6567774-7cr96\" (UID: \"c77cec54-eef5-4a5b-acc7-395a55ec4be1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845382 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/342bdc3a-ef90-4837-b0d2-e5e9bc63b821-metrics-tls\") pod \"dns-default-xx4lj\" (UID: \"342bdc3a-ef90-4837-b0d2-e5e9bc63b821\") " pod="openshift-dns/dns-default-xx4lj" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845456 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrs9w\" (UniqueName: 
\"kubernetes.io/projected/b672eb13-d702-4613-90d2-5d2406923876-kube-api-access-mrs9w\") pod \"dns-operator-744455d44c-d88cp\" (UID: \"b672eb13-d702-4613-90d2-5d2406923876\") " pod="openshift-dns-operator/dns-operator-744455d44c-d88cp" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845493 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506d2ee8-fd84-4d9d-8d72-7253e237012a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2q84t\" (UID: \"506d2ee8-fd84-4d9d-8d72-7253e237012a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845523 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxcrc\" (UniqueName: \"kubernetes.io/projected/5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5-kube-api-access-bxcrc\") pod \"authentication-operator-69f744f599-mgkmr\" (UID: \"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845552 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2003fdd-5c5a-4f60-9920-bd2e736a0f2e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bd4xm\" (UID: \"d2003fdd-5c5a-4f60-9920-bd2e736a0f2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845659 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72d9939d-e77a-469d-9c50-32bc6bfdf498-serving-cert\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845794 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2003fdd-5c5a-4f60-9920-bd2e736a0f2e-trusted-ca\") pod \"ingress-operator-5b745b69d9-bd4xm\" (UID: \"d2003fdd-5c5a-4f60-9920-bd2e736a0f2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.845888 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1caa95ad-0f17-4c3c-a336-2398afe5db0e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-cc2b6\" (UID: \"1caa95ad-0f17-4c3c-a336-2398afe5db0e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846028 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2003fdd-5c5a-4f60-9920-bd2e736a0f2e-metrics-tls\") pod \"ingress-operator-5b745b69d9-bd4xm\" (UID: \"d2003fdd-5c5a-4f60-9920-bd2e736a0f2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846089 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/b672eb13-d702-4613-90d2-5d2406923876-metrics-tls\") pod \"dns-operator-744455d44c-d88cp\" (UID: \"b672eb13-d702-4613-90d2-5d2406923876\") " pod="openshift-dns-operator/dns-operator-744455d44c-d88cp" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846146 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rvdh\" (UniqueName: \"kubernetes.io/projected/1caa95ad-0f17-4c3c-a336-2398afe5db0e-kube-api-access-2rvdh\") pod \"openshift-controller-manager-operator-756b6f6bc6-cc2b6\" (UID: \"1caa95ad-0f17-4c3c-a336-2398afe5db0e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846205 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/72d9939d-e77a-469d-9c50-32bc6bfdf498-etcd-ca\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846441 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdf9f\" (UniqueName: \"kubernetes.io/projected/342bdc3a-ef90-4837-b0d2-e5e9bc63b821-kube-api-access-mdf9f\") pod \"dns-default-xx4lj\" (UID: \"342bdc3a-ef90-4837-b0d2-e5e9bc63b821\") " pod="openshift-dns/dns-default-xx4lj" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846472 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/699936b9-cebb-4705-8e8e-abfeae4705f6-images\") pod \"machine-config-operator-74547568cd-s47qc\" (UID: \"699936b9-cebb-4705-8e8e-abfeae4705f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846591 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506d2ee8-fd84-4d9d-8d72-7253e237012a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2q84t\" (UID: \"506d2ee8-fd84-4d9d-8d72-7253e237012a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846625 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/72d9939d-e77a-469d-9c50-32bc6bfdf498-etcd-service-ca\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846655 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/342bdc3a-ef90-4837-b0d2-e5e9bc63b821-config-volume\") pod \"dns-default-xx4lj\" (UID: \"342bdc3a-ef90-4837-b0d2-e5e9bc63b821\") " pod="openshift-dns/dns-default-xx4lj" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846705 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/72d9939d-e77a-469d-9c50-32bc6bfdf498-etcd-client\") pod \"etcd-operator-b45778765-9jq4j\" (UID: 
\"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846728 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72d9939d-e77a-469d-9c50-32bc6bfdf498-config\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846761 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5-config\") pod \"authentication-operator-69f744f599-mgkmr\" (UID: \"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846787 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5-serving-cert\") pod \"authentication-operator-69f744f599-mgkmr\" (UID: \"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846812 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q92pd\" (UniqueName: \"kubernetes.io/projected/699936b9-cebb-4705-8e8e-abfeae4705f6-kube-api-access-q92pd\") pod \"machine-config-operator-74547568cd-s47qc\" (UID: \"699936b9-cebb-4705-8e8e-abfeae4705f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846840 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlt5s\" (UniqueName: \"kubernetes.io/projected/72d9939d-e77a-469d-9c50-32bc6bfdf498-kube-api-access-mlt5s\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846936 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-default-certificate\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.846986 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lzhm\" (UniqueName: \"kubernetes.io/projected/c77cec54-eef5-4a5b-acc7-395a55ec4be1-kube-api-access-4lzhm\") pod \"machine-config-controller-84d6567774-7cr96\" (UID: \"c77cec54-eef5-4a5b-acc7-395a55ec4be1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.847024 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5-service-ca-bundle\") pod \"authentication-operator-69f744f599-mgkmr\" (UID: 
\"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.847116 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/699936b9-cebb-4705-8e8e-abfeae4705f6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-s47qc\" (UID: \"699936b9-cebb-4705-8e8e-abfeae4705f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.850662 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-hrsvp" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.859333 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fkw6v" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.863911 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.864343 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.871843 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.881991 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.886690 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xgqvl"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.887263 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xx4lj"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.887381 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.887430 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.887449 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9jq4j"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.887470 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d88cp"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.887486 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mmqsc"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.887504 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.887522 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.887618 4747 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xgqvl" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.887698 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.887725 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svs8d"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.887963 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-v44jl"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.887980 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xgqvl"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.889640 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.893245 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xxxxf"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.894841 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.896559 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.897795 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.898895 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.899131 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-glgf7"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.908961 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzc"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.916463 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.918341 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-75r5s"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.920770 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.924574 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.924766 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.925496 4747 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.926638 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ddtpz"] Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.940593 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.948749 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlt5s\" (UniqueName: \"kubernetes.io/projected/72d9939d-e77a-469d-9c50-32bc6bfdf498-kube-api-access-mlt5s\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.948779 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5-config\") pod \"authentication-operator-69f744f599-mgkmr\" (UID: \"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.948797 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5-serving-cert\") pod \"authentication-operator-69f744f599-mgkmr\" (UID: \"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.948813 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q92pd\" (UniqueName: \"kubernetes.io/projected/699936b9-cebb-4705-8e8e-abfeae4705f6-kube-api-access-q92pd\") pod \"machine-config-operator-74547568cd-s47qc\" (UID: \"699936b9-cebb-4705-8e8e-abfeae4705f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.948832 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-default-certificate\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.948858 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lzhm\" (UniqueName: \"kubernetes.io/projected/c77cec54-eef5-4a5b-acc7-395a55ec4be1-kube-api-access-4lzhm\") pod \"machine-config-controller-84d6567774-7cr96\" (UID: \"c77cec54-eef5-4a5b-acc7-395a55ec4be1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.948880 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5-service-ca-bundle\") pod \"authentication-operator-69f744f599-mgkmr\" (UID: \"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr" Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.948895 
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.948913 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-stats-auth\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.948930 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a225e30-e0a1-4eab-8172-890abdc72200-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mjxxq\" (UID: \"8a225e30-e0a1-4eab-8172-890abdc72200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.948944 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mgkmr\" (UID: \"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.948960 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a225e30-e0a1-4eab-8172-890abdc72200-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mjxxq\" (UID: \"8a225e30-e0a1-4eab-8172-890abdc72200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.948979 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a225e30-e0a1-4eab-8172-890abdc72200-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mjxxq\" (UID: \"8a225e30-e0a1-4eab-8172-890abdc72200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949000 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-metrics-certs\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949014 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c77cec54-eef5-4a5b-acc7-395a55ec4be1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7cr96\" (UID: \"c77cec54-eef5-4a5b-acc7-395a55ec4be1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949034 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1caa95ad-0f17-4c3c-a336-2398afe5db0e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-cc2b6\" (UID: \"1caa95ad-0f17-4c3c-a336-2398afe5db0e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949049 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-service-ca-bundle\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949067 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tbtb\" (UniqueName: \"kubernetes.io/projected/506d2ee8-fd84-4d9d-8d72-7253e237012a-kube-api-access-2tbtb\") pod \"openshift-apiserver-operator-796bbdcf4f-2q84t\" (UID: \"506d2ee8-fd84-4d9d-8d72-7253e237012a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949084 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/699936b9-cebb-4705-8e8e-abfeae4705f6-proxy-tls\") pod \"machine-config-operator-74547568cd-s47qc\" (UID: \"699936b9-cebb-4705-8e8e-abfeae4705f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949102 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx6mx\" (UniqueName: \"kubernetes.io/projected/d2003fdd-5c5a-4f60-9920-bd2e736a0f2e-kube-api-access-tx6mx\") pod \"ingress-operator-5b745b69d9-bd4xm\" (UID: \"d2003fdd-5c5a-4f60-9920-bd2e736a0f2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949119 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwkr4\" (UniqueName: \"kubernetes.io/projected/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-kube-api-access-kwkr4\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949135 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c77cec54-eef5-4a5b-acc7-395a55ec4be1-proxy-tls\") pod \"machine-config-controller-84d6567774-7cr96\" (UID: \"c77cec54-eef5-4a5b-acc7-395a55ec4be1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949152 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/342bdc3a-ef90-4837-b0d2-e5e9bc63b821-metrics-tls\") pod \"dns-default-xx4lj\" (UID: \"342bdc3a-ef90-4837-b0d2-e5e9bc63b821\") " pod="openshift-dns/dns-default-xx4lj"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949175 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxcrc\" (UniqueName: \"kubernetes.io/projected/5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5-kube-api-access-bxcrc\") pod \"authentication-operator-69f744f599-mgkmr\" (UID: \"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949192 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrs9w\" (UniqueName: \"kubernetes.io/projected/b672eb13-d702-4613-90d2-5d2406923876-kube-api-access-mrs9w\") pod \"dns-operator-744455d44c-d88cp\" (UID: \"b672eb13-d702-4613-90d2-5d2406923876\") " pod="openshift-dns-operator/dns-operator-744455d44c-d88cp"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949208 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506d2ee8-fd84-4d9d-8d72-7253e237012a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2q84t\" (UID: \"506d2ee8-fd84-4d9d-8d72-7253e237012a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949226 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2003fdd-5c5a-4f60-9920-bd2e736a0f2e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bd4xm\" (UID: \"d2003fdd-5c5a-4f60-9920-bd2e736a0f2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949241 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72d9939d-e77a-469d-9c50-32bc6bfdf498-serving-cert\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949259 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2003fdd-5c5a-4f60-9920-bd2e736a0f2e-trusted-ca\") pod \"ingress-operator-5b745b69d9-bd4xm\" (UID: \"d2003fdd-5c5a-4f60-9920-bd2e736a0f2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.950396 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mgkmr\" (UID: \"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.951690 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c77cec54-eef5-4a5b-acc7-395a55ec4be1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7cr96\" (UID: \"c77cec54-eef5-4a5b-acc7-395a55ec4be1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.949274 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2003fdd-5c5a-4f60-9920-bd2e736a0f2e-metrics-tls\") pod \"ingress-operator-5b745b69d9-bd4xm\" (UID: \"d2003fdd-5c5a-4f60-9920-bd2e736a0f2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.951910 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1caa95ad-0f17-4c3c-a336-2398afe5db0e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-cc2b6\" (UID: \"1caa95ad-0f17-4c3c-a336-2398afe5db0e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.951952 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b672eb13-d702-4613-90d2-5d2406923876-metrics-tls\") pod \"dns-operator-744455d44c-d88cp\" (UID: \"b672eb13-d702-4613-90d2-5d2406923876\") " pod="openshift-dns-operator/dns-operator-744455d44c-d88cp"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.951954 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/699936b9-cebb-4705-8e8e-abfeae4705f6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-s47qc\" (UID: \"699936b9-cebb-4705-8e8e-abfeae4705f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.951969 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rvdh\" (UniqueName: \"kubernetes.io/projected/1caa95ad-0f17-4c3c-a336-2398afe5db0e-kube-api-access-2rvdh\") pod \"openshift-controller-manager-operator-756b6f6bc6-cc2b6\" (UID: \"1caa95ad-0f17-4c3c-a336-2398afe5db0e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.952039 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/72d9939d-e77a-469d-9c50-32bc6bfdf498-etcd-ca\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.952068 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdf9f\" (UniqueName: \"kubernetes.io/projected/342bdc3a-ef90-4837-b0d2-e5e9bc63b821-kube-api-access-mdf9f\") pod \"dns-default-xx4lj\" (UID: \"342bdc3a-ef90-4837-b0d2-e5e9bc63b821\") " pod="openshift-dns/dns-default-xx4lj"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.952088 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/699936b9-cebb-4705-8e8e-abfeae4705f6-images\") pod \"machine-config-operator-74547568cd-s47qc\" (UID: \"699936b9-cebb-4705-8e8e-abfeae4705f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.952120 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506d2ee8-fd84-4d9d-8d72-7253e237012a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2q84t\" (UID: \"506d2ee8-fd84-4d9d-8d72-7253e237012a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.952136 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/72d9939d-e77a-469d-9c50-32bc6bfdf498-etcd-service-ca\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.952164 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/342bdc3a-ef90-4837-b0d2-e5e9bc63b821-config-volume\") pod \"dns-default-xx4lj\" (UID: \"342bdc3a-ef90-4837-b0d2-e5e9bc63b821\") " pod="openshift-dns/dns-default-xx4lj"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.952179 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/72d9939d-e77a-469d-9c50-32bc6bfdf498-etcd-client\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.952200 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72d9939d-e77a-469d-9c50-32bc6bfdf498-config\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.952843 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506d2ee8-fd84-4d9d-8d72-7253e237012a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2q84t\" (UID: \"506d2ee8-fd84-4d9d-8d72-7253e237012a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.953115 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5-config\") pod \"authentication-operator-69f744f599-mgkmr\" (UID: \"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.953118 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5-service-ca-bundle\") pod \"authentication-operator-69f744f599-mgkmr\" (UID: \"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.960667 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.964116 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506d2ee8-fd84-4d9d-8d72-7253e237012a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2q84t\" (UID: \"506d2ee8-fd84-4d9d-8d72-7253e237012a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t"
Dec 05 20:44:31 crc kubenswrapper[4747]: I1205 20:44:31.979405 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.000068 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.009735 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5-serving-cert\") pod \"authentication-operator-69f744f599-mgkmr\" (UID: \"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.019347 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.042205 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.043791 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/342bdc3a-ef90-4837-b0d2-e5e9bc63b821-config-volume\") pod \"dns-default-xx4lj\" (UID: \"342bdc3a-ef90-4837-b0d2-e5e9bc63b821\") " pod="openshift-dns/dns-default-xx4lj"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.060436 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.082884 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.099974 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.101185 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/342bdc3a-ef90-4837-b0d2-e5e9bc63b821-metrics-tls\") pod \"dns-default-xx4lj\" (UID: \"342bdc3a-ef90-4837-b0d2-e5e9bc63b821\") " pod="openshift-dns/dns-default-xx4lj"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.103415 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/72d9939d-e77a-469d-9c50-32bc6bfdf498-etcd-ca\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.115591 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv"]
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.118629 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.138503 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.147272 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72d9939d-e77a-469d-9c50-32bc6bfdf498-serving-cert\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.162224 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.164999 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72d9939d-e77a-469d-9c50-32bc6bfdf498-config\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.183432 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.198614 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.205191 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/72d9939d-e77a-469d-9c50-32bc6bfdf498-etcd-client\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.219389 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.225835 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/72d9939d-e77a-469d-9c50-32bc6bfdf498-etcd-service-ca\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.238482 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5"]
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.239759 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 05 20:44:32 crc kubenswrapper[4747]: W1205 20:44:32.248931 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1018779a_353b_4530_84e1_b52a044d69d5.slice/crio-1fec1e715976632c6c419ba3b2ddb58dbf9d0a3eb43230f2da12e4f196c82757 WatchSource:0}: Error finding container 1fec1e715976632c6c419ba3b2ddb58dbf9d0a3eb43230f2da12e4f196c82757: Status 404 returned error can't find the container with id 1fec1e715976632c6c419ba3b2ddb58dbf9d0a3eb43230f2da12e4f196c82757
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.260837 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.279877 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.298563 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.325290 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.337968 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cdhgg"]
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.342395 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.343319 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h25fx"]
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.360508 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.368974 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sntqs"]
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.373154 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r"]
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.381495 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hrsvp"]
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.381778 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.388175 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2003fdd-5c5a-4f60-9920-bd2e736a0f2e-metrics-tls\") pod \"ingress-operator-5b745b69d9-bd4xm\" (UID: \"d2003fdd-5c5a-4f60-9920-bd2e736a0f2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.409126 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 05 20:44:32 crc kubenswrapper[4747]: W1205 20:44:32.411357 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0e9905_637c_4a71_b816_12fefbb801d2.slice/crio-54d8f408ddc8b683cc555741bb53001f9eaf42b542ac78598e17aae65e55c3d3 WatchSource:0}: Error finding container 54d8f408ddc8b683cc555741bb53001f9eaf42b542ac78598e17aae65e55c3d3: Status 404 returned error can't find the container with id 54d8f408ddc8b683cc555741bb53001f9eaf42b542ac78598e17aae65e55c3d3
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.413431 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2003fdd-5c5a-4f60-9920-bd2e736a0f2e-trusted-ca\") pod \"ingress-operator-5b745b69d9-bd4xm\" (UID: \"d2003fdd-5c5a-4f60-9920-bd2e736a0f2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm"
Dec 05 20:44:32 crc kubenswrapper[4747]: W1205 20:44:32.417503 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d7ce274_8069_41a9_949f_a2e493c170a5.slice/crio-6453e640f3d61c20c6f58f85460ff8958f36eee87b01ab6c57818b1a644436ab WatchSource:0}: Error finding container 6453e640f3d61c20c6f58f85460ff8958f36eee87b01ab6c57818b1a644436ab: Status 404 returned error can't find the container with id 6453e640f3d61c20c6f58f85460ff8958f36eee87b01ab6c57818b1a644436ab
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.418313 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.425512 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fkw6v"]
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.440067 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.458762 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qt2mg"]
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.459981 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.464708 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd"]
Dec 05 20:44:32 crc kubenswrapper[4747]: W1205 20:44:32.465449 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod727707b0_2824_4736_afed_38b1efdb98de.slice/crio-be6ff32c0645914403c249280220a9a211158bf2e71884021d0658f8d941cecb WatchSource:0}: Error finding container be6ff32c0645914403c249280220a9a211158bf2e71884021d0658f8d941cecb: Status 404 returned error can't find the container with id be6ff32c0645914403c249280220a9a211158bf2e71884021d0658f8d941cecb
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.470625 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-f8pt2"]
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.478792 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 05 20:44:32 crc kubenswrapper[4747]: W1205 20:44:32.486739 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5616bfbc_527c_4c99_b78e_7568c26ca4bb.slice/crio-38078bea7257782a04860f1a5eb314a7e1036f8e8b0b6c231f49ad0bfe4fb67e WatchSource:0}: Error finding container 38078bea7257782a04860f1a5eb314a7e1036f8e8b0b6c231f49ad0bfe4fb67e: Status 404 returned error can't find the container with id 38078bea7257782a04860f1a5eb314a7e1036f8e8b0b6c231f49ad0bfe4fb67e
Dec 05 20:44:32 crc kubenswrapper[4747]: W1205 20:44:32.488670 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae41eabe_e336_4ef9_9f65_022996a62860.slice/crio-83760a4ac4ab3954ec9a2b30c57147ed599cbd9137833e14dddc5ae30470a320 WatchSource:0}: Error finding container 83760a4ac4ab3954ec9a2b30c57147ed599cbd9137833e14dddc5ae30470a320: Status 404 returned error can't find the container with id 83760a4ac4ab3954ec9a2b30c57147ed599cbd9137833e14dddc5ae30470a320
Dec 05 20:44:32 crc kubenswrapper[4747]: W1205 20:44:32.488855 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ee658b1_3098_41d3_89c1_eec71d92d82e.slice/crio-f1e22480f22284caca9e9862b8ba25fb0ad53cfb6b69c47008142e0e20d42b2e WatchSource:0}: Error finding container f1e22480f22284caca9e9862b8ba25fb0ad53cfb6b69c47008142e0e20d42b2e: Status 404 returned error can't find the container with id f1e22480f22284caca9e9862b8ba25fb0ad53cfb6b69c47008142e0e20d42b2e
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.489357 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1caa95ad-0f17-4c3c-a336-2398afe5db0e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-cc2b6\" (UID: \"1caa95ad-0f17-4c3c-a336-2398afe5db0e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.498818 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.518467 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.523274 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1caa95ad-0f17-4c3c-a336-2398afe5db0e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-cc2b6\" (UID: \"1caa95ad-0f17-4c3c-a336-2398afe5db0e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.538663 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.555266 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rb4b9"]
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.558237 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.582538 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.591789 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b672eb13-d702-4613-90d2-5d2406923876-metrics-tls\") pod \"dns-operator-744455d44c-d88cp\" (UID: \"b672eb13-d702-4613-90d2-5d2406923876\") " pod="openshift-dns-operator/dns-operator-744455d44c-d88cp"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.599299 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.618944 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" event={"ID":"ae41eabe-e336-4ef9-9f65-022996a62860","Type":"ContainerStarted","Data":"83760a4ac4ab3954ec9a2b30c57147ed599cbd9137833e14dddc5ae30470a320"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.619221 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.630272 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" event={"ID":"5616bfbc-527c-4c99-b78e-7568c26ca4bb","Type":"ContainerStarted","Data":"38078bea7257782a04860f1a5eb314a7e1036f8e8b0b6c231f49ad0bfe4fb67e"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.639854 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.659745 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.676058 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv" event={"ID":"0531af2e-7c38-4f8e-8ca3-0c4ac5148059","Type":"ContainerDied","Data":"214ebf93e02854c2fb7875acd4fb7566d62c7e2b0b743c0950de7a691c97e9b9"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.677100 4747 generic.go:334] "Generic (PLEG): container finished" podID="0531af2e-7c38-4f8e-8ca3-0c4ac5148059" containerID="214ebf93e02854c2fb7875acd4fb7566d62c7e2b0b743c0950de7a691c97e9b9" exitCode=0
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.677242 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv" event={"ID":"0531af2e-7c38-4f8e-8ca3-0c4ac5148059","Type":"ContainerStarted","Data":"44d325751ab85220ccd346df6a841b6717f573a23899aa44d224e1cc67471c2f"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.678802 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.680279 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" event={"ID":"82d08f0b-8d21-4b31-b88f-44d5578c1f03","Type":"ContainerStarted","Data":"0d7bf2cc0a1ce7f720a3c22a05187a9cd2ad8e7fa1322ec2ba50d99edf777672"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.685488 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h25fx" event={"ID":"1aa46678-7bb8-4017-b1c2-b61dd951ffd4","Type":"ContainerStarted","Data":"b44700c4999d84937d4588ab5947d48f888f4a57ceb91a310f84b73f1065774e"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.688833 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" event={"ID":"3ee658b1-3098-41d3-89c1-eec71d92d82e","Type":"ContainerStarted","Data":"f1e22480f22284caca9e9862b8ba25fb0ad53cfb6b69c47008142e0e20d42b2e"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.691458 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hrsvp" event={"ID":"4d7ce274-8069-41a9-949f-a2e493c170a5","Type":"ContainerStarted","Data":"cb32e4d02d3c55a4fca8b1b5e46d4a0ff9ff8e23e41c3caae8af72a3d751127b"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.691491 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hrsvp" event={"ID":"4d7ce274-8069-41a9-949f-a2e493c170a5","Type":"ContainerStarted","Data":"6453e640f3d61c20c6f58f85460ff8958f36eee87b01ab6c57818b1a644436ab"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.691703 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-hrsvp"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.693232 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rb4b9" event={"ID":"dab4218f-b06c-473f-8882-5f207a79f403","Type":"ContainerStarted","Data":"83f53625aca63a19341cca8751bf495cbc9a60d65f84cb3a6d1eeffbf8214a04"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.694204 4747 patch_prober.go:28] interesting pod/downloads-7954f5f757-hrsvp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.694263 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hrsvp" podUID="4d7ce274-8069-41a9-949f-a2e493c170a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.695054 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fkw6v" event={"ID":"727707b0-2824-4736-afed-38b1efdb98de","Type":"ContainerStarted","Data":"d5642abebb4540b0741cbdfd20c21e9cc9c324dc13ee37cf931b0fd82712661c"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.695099 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fkw6v" event={"ID":"727707b0-2824-4736-afed-38b1efdb98de","Type":"ContainerStarted","Data":"be6ff32c0645914403c249280220a9a211158bf2e71884021d0658f8d941cecb"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.695253 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-fkw6v"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.698537 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.700286 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" event={"ID":"d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a","Type":"ContainerStarted","Data":"e53770d59abc28b7253e070b8a83eb05e6a6b29dd0ba2987c3d1c14aa2b19bda"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.700334 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" event={"ID":"d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a","Type":"ContainerStarted","Data":"48506cb53c574821752b55228e50470f0f4ba99910c481118e05f0a76f5bbda8"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.700349 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" event={"ID":"1840e48c-ee91-48d2-8ddb-34f24dde58ff","Type":"ContainerStarted","Data":"cefb69fea1945d86a6f44be18c09aa34679b25a9316cb4f8d8a92740c839c6e8"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.705313 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" event={"ID":"ec0e9905-637c-4a71-b816-12fefbb801d2","Type":"ContainerStarted","Data":"54d8f408ddc8b683cc555741bb53001f9eaf42b542ac78598e17aae65e55c3d3"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.705639 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.705932 4747 patch_prober.go:28] interesting pod/console-operator-58897d9998-fkw6v container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.705990 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fkw6v" podUID="727707b0-2824-4736-afed-38b1efdb98de" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.707294 4747 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sntqs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.707334 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" podUID="ec0e9905-637c-4a71-b816-12fefbb801d2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.708232 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" event={"ID":"1018779a-353b-4530-84e1-b52a044d69d5","Type":"ContainerStarted","Data":"3eb0d0adfe8d278dcdcc069ddd2235463286944d2b75a358a7baeb8e5e6900f5"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.708256 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" event={"ID":"1018779a-353b-4530-84e1-b52a044d69d5","Type":"ContainerStarted","Data":"1fec1e715976632c6c419ba3b2ddb58dbf9d0a3eb43230f2da12e4f196c82757"}
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.708771 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.719687 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.740807 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.772467 4747 request.go:700] Waited for 1.007025247s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/secrets?fieldSelector=metadata.name%3Dregistry-dockercfg-kzzsd&limit=500&resourceVersion=0
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.778108 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.779952 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.798518 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.818762 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.832032 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a225e30-e0a1-4eab-8172-890abdc72200-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mjxxq\" (UID: \"8a225e30-e0a1-4eab-8172-890abdc72200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.833794 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.837721 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.842446 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a225e30-e0a1-4eab-8172-890abdc72200-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mjxxq\" (UID: \"8a225e30-e0a1-4eab-8172-890abdc72200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.858958 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.879443 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.887641 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-metrics-certs\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.898957 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.918857 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.930400 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-default-certificate\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.938726 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.945699 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-stats-auth\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6"
Dec 05 20:44:32 crc kubenswrapper[4747]: E1205 20:44:32.951757 4747 secret.go:188] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Dec 05 20:44:32 crc kubenswrapper[4747]: E1205 20:44:32.951863 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c77cec54-eef5-4a5b-acc7-395a55ec4be1-proxy-tls podName:c77cec54-eef5-4a5b-acc7-395a55ec4be1 nodeName:}" failed. No retries permitted until 2025-12-05 20:44:33.451838389 +0000 UTC m=+143.919145867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c77cec54-eef5-4a5b-acc7-395a55ec4be1-proxy-tls") pod "machine-config-controller-84d6567774-7cr96" (UID: "c77cec54-eef5-4a5b-acc7-395a55ec4be1") : failed to sync secret cache: timed out waiting for the condition
Dec 05 20:44:32 crc kubenswrapper[4747]: E1205 20:44:32.952601 4747 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Dec 05 20:44:32 crc kubenswrapper[4747]: E1205 20:44:32.952645 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-service-ca-bundle podName:facd7fa0-d8a3-4fd4-a8c9-64578d2779aa nodeName:}" failed. No retries permitted until 2025-12-05 20:44:33.452636821 +0000 UTC m=+143.919944309 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-service-ca-bundle") pod "router-default-5444994796-2p8p6" (UID: "facd7fa0-d8a3-4fd4-a8c9-64578d2779aa") : failed to sync configmap cache: timed out waiting for the condition
Dec 05 20:44:32 crc kubenswrapper[4747]: E1205 20:44:32.952669 4747 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Dec 05 20:44:32 crc kubenswrapper[4747]: E1205 20:44:32.952698 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/699936b9-cebb-4705-8e8e-abfeae4705f6-proxy-tls podName:699936b9-cebb-4705-8e8e-abfeae4705f6 nodeName:}" failed. No retries permitted until 2025-12-05 20:44:33.452692213 +0000 UTC m=+143.919999701 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/699936b9-cebb-4705-8e8e-abfeae4705f6-proxy-tls") pod "machine-config-operator-74547568cd-s47qc" (UID: "699936b9-cebb-4705-8e8e-abfeae4705f6") : failed to sync secret cache: timed out waiting for the condition
Dec 05 20:44:32 crc kubenswrapper[4747]: E1205 20:44:32.954430 4747 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Dec 05 20:44:32 crc kubenswrapper[4747]: E1205 20:44:32.954532 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/699936b9-cebb-4705-8e8e-abfeae4705f6-images podName:699936b9-cebb-4705-8e8e-abfeae4705f6 nodeName:}" failed. No retries permitted until 2025-12-05 20:44:33.454511633 +0000 UTC m=+143.921819121 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/699936b9-cebb-4705-8e8e-abfeae4705f6-images") pod "machine-config-operator-74547568cd-s47qc" (UID: "699936b9-cebb-4705-8e8e-abfeae4705f6") : failed to sync configmap cache: timed out waiting for the condition
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.961570 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 05 20:44:32 crc kubenswrapper[4747]: I1205 20:44:32.982379 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.000100 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.020289 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.040453 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.059043 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.078875 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.101288 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.122074 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.159652 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.180109 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.198854 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
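
The E-level entries above trace one startup race: MountVolume.SetUp needs a secret or configmap whose informer cache has not synced yet, so the operation fails and is re-queued with "No retries permitted until ... (durationBeforeRetry 500ms)". Below is a sketch of that per-operation exponential backoff; the constants (500 ms start, doubling, a cap in the minutes) are assumptions for illustration, not values copied from the kubelet's nestedpendingoperations.

```go
// backoff_sketch.go: a sketch of the retry backoff behind
// "durationBeforeRetry 500ms" above (assumed constants).
package main

import (
	"fmt"
	"time"
)

type expBackoff struct {
	next time.Duration
	max  time.Duration
}

// wait returns the delay before the next retry and doubles it for the
// one after, so repeated cache-sync failures back off 500ms, 1s, 2s, ...
func (b *expBackoff) wait() time.Duration {
	d := b.next
	if b.next *= 2; b.next > b.max {
		b.next = b.max
	}
	return d
}

func main() {
	b := &expBackoff{next: 500 * time.Millisecond, max: 2 * time.Minute}
	for i := 1; i <= 5; i++ {
		fmt.Printf("retry %d after %v\n", i, b.wait())
	}
}
```

By the time the first 500 ms window expires, the reflector lines that follow show the caches populated, and the retried mounts for these same volumes succeed further below.

Dec 05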
20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.218605 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.239607 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.258898 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.279158 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.298629 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.322977 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.341320 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.358429 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.379599 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.398770 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.422710 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.438326 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.458971 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.478838 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.491729 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/699936b9-cebb-4705-8e8e-abfeae4705f6-images\") pod \"machine-config-operator-74547568cd-s47qc\" (UID: \"699936b9-cebb-4705-8e8e-abfeae4705f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.491877 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-service-ca-bundle\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6" Dec 05 
20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.491916 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/699936b9-cebb-4705-8e8e-abfeae4705f6-proxy-tls\") pod \"machine-config-operator-74547568cd-s47qc\" (UID: \"699936b9-cebb-4705-8e8e-abfeae4705f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.491937 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c77cec54-eef5-4a5b-acc7-395a55ec4be1-proxy-tls\") pod \"machine-config-controller-84d6567774-7cr96\" (UID: \"c77cec54-eef5-4a5b-acc7-395a55ec4be1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.492714 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/699936b9-cebb-4705-8e8e-abfeae4705f6-images\") pod \"machine-config-operator-74547568cd-s47qc\" (UID: \"699936b9-cebb-4705-8e8e-abfeae4705f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.493144 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-service-ca-bundle\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.497762 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/699936b9-cebb-4705-8e8e-abfeae4705f6-proxy-tls\") pod \"machine-config-operator-74547568cd-s47qc\" (UID: \"699936b9-cebb-4705-8e8e-abfeae4705f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.497966 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c77cec54-eef5-4a5b-acc7-395a55ec4be1-proxy-tls\") pod \"machine-config-controller-84d6567774-7cr96\" (UID: \"c77cec54-eef5-4a5b-acc7-395a55ec4be1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.499238 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.522525 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.539233 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.567777 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.587690 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.600710 4747 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.619077 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.642382 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.662294 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.681308 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.698822 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.713881 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" event={"ID":"1840e48c-ee91-48d2-8ddb-34f24dde58ff","Type":"ContainerStarted","Data":"d5ceedb61b3fe147a8a7b5afc2d8784cf8998339eff765bf3913d771d45743cb"} Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.713924 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" event={"ID":"1840e48c-ee91-48d2-8ddb-34f24dde58ff","Type":"ContainerStarted","Data":"90a08b6c8616163e8a1b57e2c61fd6736062296aef77362b3535c3b22c637646"} Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.715692 4747 generic.go:334] "Generic (PLEG): container finished" podID="3ee658b1-3098-41d3-89c1-eec71d92d82e" containerID="a18a932445c82dcd40b0c760aeb512b37bdc4b0e2c90206929d1fb00e7a3a808" exitCode=0 Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.715744 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" event={"ID":"3ee658b1-3098-41d3-89c1-eec71d92d82e","Type":"ContainerDied","Data":"a18a932445c82dcd40b0c760aeb512b37bdc4b0e2c90206929d1fb00e7a3a808"} Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.718820 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h25fx" event={"ID":"1aa46678-7bb8-4017-b1c2-b61dd951ffd4","Type":"ContainerStarted","Data":"fd3933ac1f1c984fb64d4f40748172c0d6d8880f76ccfb0a4842a52f959a1766"} Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.718849 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h25fx" event={"ID":"1aa46678-7bb8-4017-b1c2-b61dd951ffd4","Type":"ContainerStarted","Data":"d941aa25f87763c61271ef9cd879a6f76416c38d3b06e63724381662fb4bc1a2"} Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.720411 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" event={"ID":"5616bfbc-527c-4c99-b78e-7568c26ca4bb","Type":"ContainerStarted","Data":"6f15818d06ecfa29140aacb808b62712eab47ed844e284c917fa21e0c415fc83"} Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.721081 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.721852 4747 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-console/console-f9d7485db-rb4b9" event={"ID":"dab4218f-b06c-473f-8882-5f207a79f403","Type":"ContainerStarted","Data":"59ec24865736b6d8fd6e2e2c4a3afcaef18bf1b988a1a839b10fe4dde9c8c776"} Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.724772 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" event={"ID":"d9e3473c-c3ec-4dd9-8c52-0f4dcaf29a4a","Type":"ContainerStarted","Data":"f3f05601ff9bb2dfe7555bb158be934f5a076948fbfa7ca520677f300d77ac77"} Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.726653 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" event={"ID":"ec0e9905-637c-4a71-b816-12fefbb801d2","Type":"ContainerStarted","Data":"56995b51bd9d2635e9aef2720b9f0e9863c017af87f034d9042e1a8eff18de6d"} Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.730844 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv" event={"ID":"0531af2e-7c38-4f8e-8ca3-0c4ac5148059","Type":"ContainerStarted","Data":"eea582a82fb8f819fca2d9b40af88061138aa65aa9ea06d1483a4e37c654e4a3"} Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.731565 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.731891 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.733193 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" event={"ID":"ae41eabe-e336-4ef9-9f65-022996a62860","Type":"ContainerStarted","Data":"570ccc07b8765b8c7c1d6a09c7cc3018a067e1e96a57cc4ab3572f68fa3ce784"} Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.733480 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.734464 4747 generic.go:334] "Generic (PLEG): container finished" podID="82d08f0b-8d21-4b31-b88f-44d5578c1f03" containerID="33f32d9b224063a677e9048d0a480cc0f9876c6de131d632f2cfd40f4ac9b9f3" exitCode=0 Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.735771 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" event={"ID":"82d08f0b-8d21-4b31-b88f-44d5578c1f03","Type":"ContainerDied","Data":"33f32d9b224063a677e9048d0a480cc0f9876c6de131d632f2cfd40f4ac9b9f3"} Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.736246 4747 patch_prober.go:28] interesting pod/downloads-7954f5f757-hrsvp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.736290 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hrsvp" podUID="4d7ce274-8069-41a9-949f-a2e493c170a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.743266 4747 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.747960 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.748016 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-fkw6v"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.758208 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.777000 4747 request.go:700] Waited for 1.954613232s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.780163 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.799660 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.818889 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.843089 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.858992 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.888128 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.898365 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.918540 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.939986 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.959508 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 05 20:44:33 crc kubenswrapper[4747]: I1205 20:44:33.982646 4747 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.019844 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.038864 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
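
The request.go:700 waits above come from the REST client's own token-bucket limiter, which is why the message says "client-side throttling, not priority and fairness" (the latter being the server-side APF mechanism). With the flood of informer LIST/GET calls at kubelet startup, the bucket drains and each further request queues behind the refill rate. Below is a sketch of that behavior with golang.org/x/time/rate; the QPS and burst values are illustrative defaults, not this kubelet's actual settings.

```go
// throttle_sketch.go: a sketch of the client-side token bucket behind
// the "Waited for ... due to client-side throttling" lines above.
package main

import (
	"context"
	"fmt"
	"time"

	"golang.org/x/time/rate"
)

func main() {
	// 5 requests/s steady state with a burst of 10 (assumed defaults).
	limiter := rate.NewLimiter(rate.Limit(5), 10)
	start := time.Now()
	// A startup burst spends the bucket immediately; every request after
	// that waits for refill, producing the one- to two-second delays
	// logged above.
	for i := 0; i < 15; i++ {
		if err := limiter.Wait(context.Background()); err != nil {
			fmt.Println(err)
			return
		}
	}
	fmt.Printf("15 requests took %v\n", time.Since(start))
}
```

Dec 05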
20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.082184 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlt5s\" (UniqueName: \"kubernetes.io/projected/72d9939d-e77a-469d-9c50-32bc6bfdf498-kube-api-access-mlt5s\") pod \"etcd-operator-b45778765-9jq4j\" (UID: \"72d9939d-e77a-469d-9c50-32bc6bfdf498\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.095412 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lzhm\" (UniqueName: \"kubernetes.io/projected/c77cec54-eef5-4a5b-acc7-395a55ec4be1-kube-api-access-4lzhm\") pod \"machine-config-controller-84d6567774-7cr96\" (UID: \"c77cec54-eef5-4a5b-acc7-395a55ec4be1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.119673 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx6mx\" (UniqueName: \"kubernetes.io/projected/d2003fdd-5c5a-4f60-9920-bd2e736a0f2e-kube-api-access-tx6mx\") pod \"ingress-operator-5b745b69d9-bd4xm\" (UID: \"d2003fdd-5c5a-4f60-9920-bd2e736a0f2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.132988 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tbtb\" (UniqueName: \"kubernetes.io/projected/506d2ee8-fd84-4d9d-8d72-7253e237012a-kube-api-access-2tbtb\") pod \"openshift-apiserver-operator-796bbdcf4f-2q84t\" (UID: \"506d2ee8-fd84-4d9d-8d72-7253e237012a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.143821 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.158725 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a225e30-e0a1-4eab-8172-890abdc72200-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mjxxq\" (UID: \"8a225e30-e0a1-4eab-8172-890abdc72200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.180960 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrs9w\" (UniqueName: \"kubernetes.io/projected/b672eb13-d702-4613-90d2-5d2406923876-kube-api-access-mrs9w\") pod \"dns-operator-744455d44c-d88cp\" (UID: \"b672eb13-d702-4613-90d2-5d2406923876\") " pod="openshift-dns-operator/dns-operator-744455d44c-d88cp" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.191642 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.194741 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwkr4\" (UniqueName: \"kubernetes.io/projected/facd7fa0-d8a3-4fd4-a8c9-64578d2779aa-kube-api-access-kwkr4\") pod \"router-default-5444994796-2p8p6\" (UID: \"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa\") " pod="openshift-ingress/router-default-5444994796-2p8p6" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.216286 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxcrc\" (UniqueName: \"kubernetes.io/projected/5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5-kube-api-access-bxcrc\") pod \"authentication-operator-69f744f599-mgkmr\" (UID: \"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.238375 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2003fdd-5c5a-4f60-9920-bd2e736a0f2e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bd4xm\" (UID: \"d2003fdd-5c5a-4f60-9920-bd2e736a0f2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.258614 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-d88cp" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.266276 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q92pd\" (UniqueName: \"kubernetes.io/projected/699936b9-cebb-4705-8e8e-abfeae4705f6-kube-api-access-q92pd\") pod \"machine-config-operator-74547568cd-s47qc\" (UID: \"699936b9-cebb-4705-8e8e-abfeae4705f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.280909 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.281746 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rvdh\" (UniqueName: \"kubernetes.io/projected/1caa95ad-0f17-4c3c-a336-2398afe5db0e-kube-api-access-2rvdh\") pod \"openshift-controller-manager-operator-756b6f6bc6-cc2b6\" (UID: \"1caa95ad-0f17-4c3c-a336-2398afe5db0e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.293515 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-2p8p6"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.302904 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdf9f\" (UniqueName: \"kubernetes.io/projected/342bdc3a-ef90-4837-b0d2-e5e9bc63b821-kube-api-access-mdf9f\") pod \"dns-default-xx4lj\" (UID: \"342bdc3a-ef90-4837-b0d2-e5e9bc63b821\") " pod="openshift-dns/dns-default-xx4lj"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.318155 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.318226 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-registry-tls\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:34 crc kubenswrapper[4747]: E1205 20:44:34.318909 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:34.818897838 +0000 UTC m=+145.286205326 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
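
The MountDevice failure above is a registration race rather than a missing volume: the PVC names the kubevirt.io.hostpath-provisioner CSI driver, but a driver is only visible to the kubelet after its plugin pod registers a socket, and csi-hostpathplugin-ddtpz is itself still being set up at this point in the log. Below is a sketch of the lookup-then-register sequence; the driverRegistry type and socket path are illustrative, not the kubelet's csi_plugin internals.

```go
// csidrivers_sketch.go: a sketch of why the MountDevice calls above
// fail until the hostpath plugin registers itself (assumed structures).
package main

import (
	"fmt"
	"sync"
)

type driverRegistry struct {
	mu      sync.RWMutex
	drivers map[string]string // driver name -> plugin socket path
}

// client resolves a CSI driver by name, failing exactly the way the
// log does when the driver has not registered yet.
func (r *driverRegistry) client(name string) (string, error) {
	r.mu.RLock()
	defer r.mu.RUnlock()
	sock, ok := r.drivers[name]
	if !ok {
		return "", fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
	}
	return sock, nil
}

func main() {
	reg := &driverRegistry{drivers: map[string]string{}}
	// Before registration: the failure mode logged above.
	if _, err := reg.client("kubevirt.io.hostpath-provisioner"); err != nil {
		fmt.Println(err)
	}
	// Once the plugin pod registers its socket, the retried mount
	// (re-queued with the same 500ms backoff) can proceed.
	reg.mu.Lock()
	reg.drivers["kubevirt.io.hostpath-provisioner"] = "/var/lib/kubelet/plugins/csi-hostpath/csi.sock"
	reg.mu.Unlock()
	if sock, err := reg.client("kubevirt.io.hostpath-provisioner"); err == nil {
		fmt.Println("driver registered at", sock)
	}
}
```

Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.354672 4747 util.go:30] "No sandbox for pod can be found.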
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96" Dec 05 20:44:34 crc kubenswrapper[4747]: W1205 20:44:34.380468 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfacd7fa0_d8a3_4fd4_a8c9_64578d2779aa.slice/crio-1db513342f8f413aa94cacaf5605065cf7ee17b000cf23d07d77a0c89550b7f0 WatchSource:0}: Error finding container 1db513342f8f413aa94cacaf5605065cf7ee17b000cf23d07d77a0c89550b7f0: Status 404 returned error can't find the container with id 1db513342f8f413aa94cacaf5605065cf7ee17b000cf23d07d77a0c89550b7f0 Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.420632 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:44:34 crc kubenswrapper[4747]: E1205 20:44:34.421197 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:34.921173707 +0000 UTC m=+145.388481195 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.421769 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh2qh\" (UniqueName: \"kubernetes.io/projected/f8e654b5-d104-4c65-9653-d1f1c55c3c7e-kube-api-access-qh2qh\") pod \"kube-storage-version-migrator-operator-b67b599dd-xbg5h\" (UID: \"f8e654b5-d104-4c65-9653-d1f1c55c3c7e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.421816 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvjns\" (UniqueName: \"kubernetes.io/projected/2d999cbd-6286-44a8-ab6b-8358cf8ac970-kube-api-access-nvjns\") pod \"multus-admission-controller-857f4d67dd-v44jl\" (UID: \"2d999cbd-6286-44a8-ab6b-8358cf8ac970\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v44jl" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.421836 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3a3fdfa4-f822-4da3-af90-ee5134c82485-tmpfs\") pod \"packageserver-d55dfcdfc-gxkvw\" (UID: \"3a3fdfa4-f822-4da3-af90-ee5134c82485\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.421852 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjf79\" (UniqueName: 
\"kubernetes.io/projected/9fab8078-6c33-4eac-8aa5-cbbfe197920b-kube-api-access-rjf79\") pod \"machine-config-server-drfsj\" (UID: \"9fab8078-6c33-4eac-8aa5-cbbfe197920b\") " pod="openshift-machine-config-operator/machine-config-server-drfsj" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.421892 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-bound-sa-token\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.421908 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw528\" (UniqueName: \"kubernetes.io/projected/b713204a-efc1-4683-b9fc-6e026532376b-kube-api-access-zw528\") pod \"ingress-canary-mmqsc\" (UID: \"b713204a-efc1-4683-b9fc-6e026532376b\") " pod="openshift-ingress-canary/ingress-canary-mmqsc" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.423042 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/280e109e-854a-430c-9960-6af6bba9dfdc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xgqvl\" (UID: \"280e109e-854a-430c-9960-6af6bba9dfdc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xgqvl" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.429918 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2577\" (UniqueName: \"kubernetes.io/projected/e6f561cd-d447-44b3-9680-d2a6938e0a7b-kube-api-access-w2577\") pod \"catalog-operator-68c6474976-5zzxf\" (UID: \"e6f561cd-d447-44b3-9680-d2a6938e0a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.429989 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqgwg\" (UniqueName: \"kubernetes.io/projected/280e109e-854a-430c-9960-6af6bba9dfdc-kube-api-access-qqgwg\") pod \"control-plane-machine-set-operator-78cbb6b69f-xgqvl\" (UID: \"280e109e-854a-430c-9960-6af6bba9dfdc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xgqvl" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.430028 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9fab8078-6c33-4eac-8aa5-cbbfe197920b-node-bootstrap-token\") pod \"machine-config-server-drfsj\" (UID: \"9fab8078-6c33-4eac-8aa5-cbbfe197920b\") " pod="openshift-machine-config-operator/machine-config-server-drfsj" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.430049 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a82b87a4-6069-4c33-9e0f-c2ef09d47781-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9pv9z\" (UID: \"a82b87a4-6069-4c33-9e0f-c2ef09d47781\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.433422 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2d999cbd-6286-44a8-ab6b-8358cf8ac970-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-v44jl\" (UID: \"2d999cbd-6286-44a8-ab6b-8358cf8ac970\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v44jl" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.433466 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk56w\" (UniqueName: \"kubernetes.io/projected/001b570f-e334-4e40-9014-857430c089be-kube-api-access-bk56w\") pod \"olm-operator-6b444d44fb-vr9tk\" (UID: \"001b570f-e334-4e40-9014-857430c089be\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.433491 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82b87a4-6069-4c33-9e0f-c2ef09d47781-config\") pod \"kube-apiserver-operator-766d6c64bb-9pv9z\" (UID: \"a82b87a4-6069-4c33-9e0f-c2ef09d47781\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.433520 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/34d2d560-7ce6-4a88-8637-b99f3c3c367a-signing-cabundle\") pod \"service-ca-9c57cc56f-glgf7\" (UID: \"34d2d560-7ce6-4a88-8637-b99f3c3c367a\") " pod="openshift-service-ca/service-ca-9c57cc56f-glgf7" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.433556 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da814908-e69b-476a-a5e8-7f128bb627b2-trusted-ca\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.433726 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/742f316e-5676-4459-bfd5-411abe809f23-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ssb48\" (UID: \"742f316e-5676-4459-bfd5-411abe809f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.433759 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/742f316e-5676-4459-bfd5-411abe809f23-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ssb48\" (UID: \"742f316e-5676-4459-bfd5-411abe809f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.433785 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a82b87a4-6069-4c33-9e0f-c2ef09d47781-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9pv9z\" (UID: \"a82b87a4-6069-4c33-9e0f-c2ef09d47781\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.433828 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-registry-tls\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.433857 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b75cx\" (UniqueName: \"kubernetes.io/projected/448cf9b6-9081-40c4-8984-a75fb61cd5dd-kube-api-access-b75cx\") pod \"migrator-59844c95c7-65jzc\" (UID: \"448cf9b6-9081-40c4-8984-a75fb61cd5dd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzc" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.433878 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/001b570f-e334-4e40-9014-857430c089be-srv-cert\") pod \"olm-operator-6b444d44fb-vr9tk\" (UID: \"001b570f-e334-4e40-9014-857430c089be\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.434015 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e6f561cd-d447-44b3-9680-d2a6938e0a7b-srv-cert\") pod \"catalog-operator-68c6474976-5zzxf\" (UID: \"e6f561cd-d447-44b3-9680-d2a6938e0a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.434095 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fccb227e-229a-4054-a0ad-43dc8aa9f0a3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ncr5s\" (UID: \"fccb227e-229a-4054-a0ad-43dc8aa9f0a3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.447747 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.448339 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-registry-tls\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.462341 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.462701 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n659\" (UniqueName: \"kubernetes.io/projected/3a3fdfa4-f822-4da3-af90-ee5134c82485-kube-api-access-2n659\") pod \"packageserver-d55dfcdfc-gxkvw\" (UID: \"3a3fdfa4-f822-4da3-af90-ee5134c82485\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.462823 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e6f561cd-d447-44b3-9680-d2a6938e0a7b-profile-collector-cert\") pod \"catalog-operator-68c6474976-5zzxf\" (UID: \"e6f561cd-d447-44b3-9680-d2a6938e0a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.462937 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a3fdfa4-f822-4da3-af90-ee5134c82485-apiservice-cert\") pod \"packageserver-d55dfcdfc-gxkvw\" (UID: \"3a3fdfa4-f822-4da3-af90-ee5134c82485\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.463012 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.463041 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2bqk\" (UniqueName: \"kubernetes.io/projected/34d2d560-7ce6-4a88-8637-b99f3c3c367a-kube-api-access-t2bqk\") pod \"service-ca-9c57cc56f-glgf7\" (UID: \"34d2d560-7ce6-4a88-8637-b99f3c3c367a\") " pod="openshift-service-ca/service-ca-9c57cc56f-glgf7" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.463086 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da814908-e69b-476a-a5e8-7f128bb627b2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.463109 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/34d2d560-7ce6-4a88-8637-b99f3c3c367a-signing-key\") pod \"service-ca-9c57cc56f-glgf7\" (UID: \"34d2d560-7ce6-4a88-8637-b99f3c3c367a\") " pod="openshift-service-ca/service-ca-9c57cc56f-glgf7" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.463132 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da814908-e69b-476a-a5e8-7f128bb627b2-registry-certificates\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.463151 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da814908-e69b-476a-a5e8-7f128bb627b2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.463223 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a3fdfa4-f822-4da3-af90-ee5134c82485-webhook-cert\") pod \"packageserver-d55dfcdfc-gxkvw\" (UID: \"3a3fdfa4-f822-4da3-af90-ee5134c82485\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.463326 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8e654b5-d104-4c65-9653-d1f1c55c3c7e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xbg5h\" (UID: \"f8e654b5-d104-4c65-9653-d1f1c55c3c7e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.463465 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8e654b5-d104-4c65-9653-d1f1c55c3c7e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xbg5h\" (UID: \"f8e654b5-d104-4c65-9653-d1f1c55c3c7e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.463501 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9fab8078-6c33-4eac-8aa5-cbbfe197920b-certs\") pod \"machine-config-server-drfsj\" (UID: \"9fab8078-6c33-4eac-8aa5-cbbfe197920b\") " pod="openshift-machine-config-operator/machine-config-server-drfsj" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.463637 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tbt7\" (UniqueName: \"kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-kube-api-access-8tbt7\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.463761 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnjz8\" (UniqueName: \"kubernetes.io/projected/fccb227e-229a-4054-a0ad-43dc8aa9f0a3-kube-api-access-mnjz8\") pod \"package-server-manager-789f6589d5-ncr5s\" (UID: \"fccb227e-229a-4054-a0ad-43dc8aa9f0a3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.463878 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b713204a-efc1-4683-b9fc-6e026532376b-cert\") pod \"ingress-canary-mmqsc\" (UID: 
\"b713204a-efc1-4683-b9fc-6e026532376b\") " pod="openshift-ingress-canary/ingress-canary-mmqsc" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.464084 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/001b570f-e334-4e40-9014-857430c089be-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vr9tk\" (UID: \"001b570f-e334-4e40-9014-857430c089be\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.464166 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/742f316e-5676-4459-bfd5-411abe809f23-config\") pod \"kube-controller-manager-operator-78b949d7b-ssb48\" (UID: \"742f316e-5676-4459-bfd5-411abe809f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48" Dec 05 20:44:34 crc kubenswrapper[4747]: E1205 20:44:34.466508 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:34.966488464 +0000 UTC m=+145.433795952 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.475664 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xx4lj" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.516144 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.541846 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.550399 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" podStartSLOduration=125.550363718 podStartE2EDuration="2m5.550363718s" podCreationTimestamp="2025-12-05 20:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:34.51536893 +0000 UTC m=+144.982676438" watchObservedRunningTime="2025-12-05 20:44:34.550363718 +0000 UTC m=+145.017671206" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.550655 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t"] Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566062 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566310 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c64af899-04ea-4380-b287-4f3a2dc1bc5b-plugins-dir\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566348 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fccb227e-229a-4054-a0ad-43dc8aa9f0a3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ncr5s\" (UID: \"fccb227e-229a-4054-a0ad-43dc8aa9f0a3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566374 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2ts6\" (UniqueName: \"kubernetes.io/projected/2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b-kube-api-access-d2ts6\") pod \"service-ca-operator-777779d784-75r5s\" (UID: \"2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-75r5s" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566396 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n659\" (UniqueName: \"kubernetes.io/projected/3a3fdfa4-f822-4da3-af90-ee5134c82485-kube-api-access-2n659\") pod \"packageserver-d55dfcdfc-gxkvw\" (UID: \"3a3fdfa4-f822-4da3-af90-ee5134c82485\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566414 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e6f561cd-d447-44b3-9680-d2a6938e0a7b-profile-collector-cert\") pod \"catalog-operator-68c6474976-5zzxf\" (UID: \"e6f561cd-d447-44b3-9680-d2a6938e0a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf" Dec 05 
20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566439 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a3fdfa4-f822-4da3-af90-ee5134c82485-apiservice-cert\") pod \"packageserver-d55dfcdfc-gxkvw\" (UID: \"3a3fdfa4-f822-4da3-af90-ee5134c82485\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566462 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2bqk\" (UniqueName: \"kubernetes.io/projected/34d2d560-7ce6-4a88-8637-b99f3c3c367a-kube-api-access-t2bqk\") pod \"service-ca-9c57cc56f-glgf7\" (UID: \"34d2d560-7ce6-4a88-8637-b99f3c3c367a\") " pod="openshift-service-ca/service-ca-9c57cc56f-glgf7" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566479 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/34d2d560-7ce6-4a88-8637-b99f3c3c367a-signing-key\") pod \"service-ca-9c57cc56f-glgf7\" (UID: \"34d2d560-7ce6-4a88-8637-b99f3c3c367a\") " pod="openshift-service-ca/service-ca-9c57cc56f-glgf7" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566495 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da814908-e69b-476a-a5e8-7f128bb627b2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566510 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da814908-e69b-476a-a5e8-7f128bb627b2-registry-certificates\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566526 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da814908-e69b-476a-a5e8-7f128bb627b2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566546 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a3fdfa4-f822-4da3-af90-ee5134c82485-webhook-cert\") pod \"packageserver-d55dfcdfc-gxkvw\" (UID: \"3a3fdfa4-f822-4da3-af90-ee5134c82485\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566564 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8e654b5-d104-4c65-9653-d1f1c55c3c7e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xbg5h\" (UID: \"f8e654b5-d104-4c65-9653-d1f1c55c3c7e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566760 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f8e654b5-d104-4c65-9653-d1f1c55c3c7e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xbg5h\" (UID: \"f8e654b5-d104-4c65-9653-d1f1c55c3c7e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566819 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c64af899-04ea-4380-b287-4f3a2dc1bc5b-csi-data-dir\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566841 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/283b35ea-91c4-4185-a639-9a8a1a80aaa7-config-volume\") pod \"collect-profiles-29416110-l4svr\" (UID: \"283b35ea-91c4-4185-a639-9a8a1a80aaa7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566862 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9fab8078-6c33-4eac-8aa5-cbbfe197920b-certs\") pod \"machine-config-server-drfsj\" (UID: \"9fab8078-6c33-4eac-8aa5-cbbfe197920b\") " pod="openshift-machine-config-operator/machine-config-server-drfsj" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.566885 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tbt7\" (UniqueName: \"kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-kube-api-access-8tbt7\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.570905 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnjz8\" (UniqueName: \"kubernetes.io/projected/fccb227e-229a-4054-a0ad-43dc8aa9f0a3-kube-api-access-mnjz8\") pod \"package-server-manager-789f6589d5-ncr5s\" (UID: \"fccb227e-229a-4054-a0ad-43dc8aa9f0a3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.570997 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b713204a-efc1-4683-b9fc-6e026532376b-cert\") pod \"ingress-canary-mmqsc\" (UID: \"b713204a-efc1-4683-b9fc-6e026532376b\") " pod="openshift-ingress-canary/ingress-canary-mmqsc" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571054 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbdkj\" (UniqueName: \"kubernetes.io/projected/c7a9c2fc-a6db-4def-9938-f1da651566c8-kube-api-access-tbdkj\") pod \"marketplace-operator-79b997595-svs8d\" (UID: \"c7a9c2fc-a6db-4def-9938-f1da651566c8\") " pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571167 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/001b570f-e334-4e40-9014-857430c089be-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vr9tk\" (UID: 
\"001b570f-e334-4e40-9014-857430c089be\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571414 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/742f316e-5676-4459-bfd5-411abe809f23-config\") pod \"kube-controller-manager-operator-78b949d7b-ssb48\" (UID: \"742f316e-5676-4459-bfd5-411abe809f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571467 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jqzn\" (UniqueName: \"kubernetes.io/projected/c64af899-04ea-4380-b287-4f3a2dc1bc5b-kube-api-access-6jqzn\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571506 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b-config\") pod \"service-ca-operator-777779d784-75r5s\" (UID: \"2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-75r5s" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571541 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7a9c2fc-a6db-4def-9938-f1da651566c8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-svs8d\" (UID: \"c7a9c2fc-a6db-4def-9938-f1da651566c8\") " pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571575 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh2qh\" (UniqueName: \"kubernetes.io/projected/f8e654b5-d104-4c65-9653-d1f1c55c3c7e-kube-api-access-qh2qh\") pod \"kube-storage-version-migrator-operator-b67b599dd-xbg5h\" (UID: \"f8e654b5-d104-4c65-9653-d1f1c55c3c7e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571633 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvjns\" (UniqueName: \"kubernetes.io/projected/2d999cbd-6286-44a8-ab6b-8358cf8ac970-kube-api-access-nvjns\") pod \"multus-admission-controller-857f4d67dd-v44jl\" (UID: \"2d999cbd-6286-44a8-ab6b-8358cf8ac970\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v44jl" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571661 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3a3fdfa4-f822-4da3-af90-ee5134c82485-tmpfs\") pod \"packageserver-d55dfcdfc-gxkvw\" (UID: \"3a3fdfa4-f822-4da3-af90-ee5134c82485\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571693 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjf79\" (UniqueName: \"kubernetes.io/projected/9fab8078-6c33-4eac-8aa5-cbbfe197920b-kube-api-access-rjf79\") pod \"machine-config-server-drfsj\" (UID: 
\"9fab8078-6c33-4eac-8aa5-cbbfe197920b\") " pod="openshift-machine-config-operator/machine-config-server-drfsj" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571722 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c64af899-04ea-4380-b287-4f3a2dc1bc5b-mountpoint-dir\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571761 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-bound-sa-token\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571786 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw528\" (UniqueName: \"kubernetes.io/projected/b713204a-efc1-4683-b9fc-6e026532376b-kube-api-access-zw528\") pod \"ingress-canary-mmqsc\" (UID: \"b713204a-efc1-4683-b9fc-6e026532376b\") " pod="openshift-ingress-canary/ingress-canary-mmqsc" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571830 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/280e109e-854a-430c-9960-6af6bba9dfdc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xgqvl\" (UID: \"280e109e-854a-430c-9960-6af6bba9dfdc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xgqvl" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571869 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2577\" (UniqueName: \"kubernetes.io/projected/e6f561cd-d447-44b3-9680-d2a6938e0a7b-kube-api-access-w2577\") pod \"catalog-operator-68c6474976-5zzxf\" (UID: \"e6f561cd-d447-44b3-9680-d2a6938e0a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571899 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqgwg\" (UniqueName: \"kubernetes.io/projected/280e109e-854a-430c-9960-6af6bba9dfdc-kube-api-access-qqgwg\") pod \"control-plane-machine-set-operator-78cbb6b69f-xgqvl\" (UID: \"280e109e-854a-430c-9960-6af6bba9dfdc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xgqvl" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571925 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/283b35ea-91c4-4185-a639-9a8a1a80aaa7-secret-volume\") pod \"collect-profiles-29416110-l4svr\" (UID: \"283b35ea-91c4-4185-a639-9a8a1a80aaa7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571955 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9fab8078-6c33-4eac-8aa5-cbbfe197920b-node-bootstrap-token\") pod \"machine-config-server-drfsj\" (UID: \"9fab8078-6c33-4eac-8aa5-cbbfe197920b\") " 
pod="openshift-machine-config-operator/machine-config-server-drfsj" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.571988 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a82b87a4-6069-4c33-9e0f-c2ef09d47781-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9pv9z\" (UID: \"a82b87a4-6069-4c33-9e0f-c2ef09d47781\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572017 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk56w\" (UniqueName: \"kubernetes.io/projected/001b570f-e334-4e40-9014-857430c089be-kube-api-access-bk56w\") pod \"olm-operator-6b444d44fb-vr9tk\" (UID: \"001b570f-e334-4e40-9014-857430c089be\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572041 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82b87a4-6069-4c33-9e0f-c2ef09d47781-config\") pod \"kube-apiserver-operator-766d6c64bb-9pv9z\" (UID: \"a82b87a4-6069-4c33-9e0f-c2ef09d47781\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572067 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/34d2d560-7ce6-4a88-8637-b99f3c3c367a-signing-cabundle\") pod \"service-ca-9c57cc56f-glgf7\" (UID: \"34d2d560-7ce6-4a88-8637-b99f3c3c367a\") " pod="openshift-service-ca/service-ca-9c57cc56f-glgf7" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572093 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7a9c2fc-a6db-4def-9938-f1da651566c8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-svs8d\" (UID: \"c7a9c2fc-a6db-4def-9938-f1da651566c8\") " pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572121 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2d999cbd-6286-44a8-ab6b-8358cf8ac970-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-v44jl\" (UID: \"2d999cbd-6286-44a8-ab6b-8358cf8ac970\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v44jl" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572152 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c64af899-04ea-4380-b287-4f3a2dc1bc5b-socket-dir\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572178 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b-serving-cert\") pod \"service-ca-operator-777779d784-75r5s\" (UID: \"2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-75r5s" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572204 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da814908-e69b-476a-a5e8-7f128bb627b2-trusted-ca\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572265 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/742f316e-5676-4459-bfd5-411abe809f23-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ssb48\" (UID: \"742f316e-5676-4459-bfd5-411abe809f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572301 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/742f316e-5676-4459-bfd5-411abe809f23-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ssb48\" (UID: \"742f316e-5676-4459-bfd5-411abe809f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572329 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a82b87a4-6069-4c33-9e0f-c2ef09d47781-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9pv9z\" (UID: \"a82b87a4-6069-4c33-9e0f-c2ef09d47781\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572366 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b75cx\" (UniqueName: \"kubernetes.io/projected/448cf9b6-9081-40c4-8984-a75fb61cd5dd-kube-api-access-b75cx\") pod \"migrator-59844c95c7-65jzc\" (UID: \"448cf9b6-9081-40c4-8984-a75fb61cd5dd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzc" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572390 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/001b570f-e334-4e40-9014-857430c089be-srv-cert\") pod \"olm-operator-6b444d44fb-vr9tk\" (UID: \"001b570f-e334-4e40-9014-857430c089be\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572416 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjlzb\" (UniqueName: \"kubernetes.io/projected/283b35ea-91c4-4185-a639-9a8a1a80aaa7-kube-api-access-pjlzb\") pod \"collect-profiles-29416110-l4svr\" (UID: \"283b35ea-91c4-4185-a639-9a8a1a80aaa7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572517 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c64af899-04ea-4380-b287-4f3a2dc1bc5b-registration-dir\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.572557 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/e6f561cd-d447-44b3-9680-d2a6938e0a7b-srv-cert\") pod \"catalog-operator-68c6474976-5zzxf\" (UID: \"e6f561cd-d447-44b3-9680-d2a6938e0a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.574021 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da814908-e69b-476a-a5e8-7f128bb627b2-registry-certificates\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: E1205 20:44:34.574143 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:35.074121722 +0000 UTC m=+145.541429210 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.575411 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3a3fdfa4-f822-4da3-af90-ee5134c82485-tmpfs\") pod \"packageserver-d55dfcdfc-gxkvw\" (UID: \"3a3fdfa4-f822-4da3-af90-ee5134c82485\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.576235 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da814908-e69b-476a-a5e8-7f128bb627b2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.576898 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82b87a4-6069-4c33-9e0f-c2ef09d47781-config\") pod \"kube-apiserver-operator-766d6c64bb-9pv9z\" (UID: \"a82b87a4-6069-4c33-9e0f-c2ef09d47781\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.577555 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fccb227e-229a-4054-a0ad-43dc8aa9f0a3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ncr5s\" (UID: \"fccb227e-229a-4054-a0ad-43dc8aa9f0a3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.577605 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/34d2d560-7ce6-4a88-8637-b99f3c3c367a-signing-cabundle\") pod \"service-ca-9c57cc56f-glgf7\" (UID: \"34d2d560-7ce6-4a88-8637-b99f3c3c367a\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-glgf7" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.581368 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/742f316e-5676-4459-bfd5-411abe809f23-config\") pod \"kube-controller-manager-operator-78b949d7b-ssb48\" (UID: \"742f316e-5676-4459-bfd5-411abe809f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.582989 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da814908-e69b-476a-a5e8-7f128bb627b2-trusted-ca\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.596022 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/001b570f-e334-4e40-9014-857430c089be-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vr9tk\" (UID: \"001b570f-e334-4e40-9014-857430c089be\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.598895 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8e654b5-d104-4c65-9653-d1f1c55c3c7e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xbg5h\" (UID: \"f8e654b5-d104-4c65-9653-d1f1c55c3c7e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.605558 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tbt7\" (UniqueName: \"kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-kube-api-access-8tbt7\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.608490 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da814908-e69b-476a-a5e8-7f128bb627b2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.609190 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9fab8078-6c33-4eac-8aa5-cbbfe197920b-node-bootstrap-token\") pod \"machine-config-server-drfsj\" (UID: \"9fab8078-6c33-4eac-8aa5-cbbfe197920b\") " pod="openshift-machine-config-operator/machine-config-server-drfsj" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.610498 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b713204a-efc1-4683-b9fc-6e026532376b-cert\") pod \"ingress-canary-mmqsc\" (UID: \"b713204a-efc1-4683-b9fc-6e026532376b\") " pod="openshift-ingress-canary/ingress-canary-mmqsc" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.610721 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/34d2d560-7ce6-4a88-8637-b99f3c3c367a-signing-key\") pod \"service-ca-9c57cc56f-glgf7\" (UID: \"34d2d560-7ce6-4a88-8637-b99f3c3c367a\") " pod="openshift-service-ca/service-ca-9c57cc56f-glgf7" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.611164 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e6f561cd-d447-44b3-9680-d2a6938e0a7b-srv-cert\") pod \"catalog-operator-68c6474976-5zzxf\" (UID: \"e6f561cd-d447-44b3-9680-d2a6938e0a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.611214 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/742f316e-5676-4459-bfd5-411abe809f23-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ssb48\" (UID: \"742f316e-5676-4459-bfd5-411abe809f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.611170 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2d999cbd-6286-44a8-ab6b-8358cf8ac970-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-v44jl\" (UID: \"2d999cbd-6286-44a8-ab6b-8358cf8ac970\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v44jl" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.611627 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/001b570f-e334-4e40-9014-857430c089be-srv-cert\") pod \"olm-operator-6b444d44fb-vr9tk\" (UID: \"001b570f-e334-4e40-9014-857430c089be\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.612120 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a3fdfa4-f822-4da3-af90-ee5134c82485-webhook-cert\") pod \"packageserver-d55dfcdfc-gxkvw\" (UID: \"3a3fdfa4-f822-4da3-af90-ee5134c82485\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.612617 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/280e109e-854a-430c-9960-6af6bba9dfdc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xgqvl\" (UID: \"280e109e-854a-430c-9960-6af6bba9dfdc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xgqvl" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.613726 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e6f561cd-d447-44b3-9680-d2a6938e0a7b-profile-collector-cert\") pod \"catalog-operator-68c6474976-5zzxf\" (UID: \"e6f561cd-d447-44b3-9680-d2a6938e0a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.622331 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw528\" (UniqueName: \"kubernetes.io/projected/b713204a-efc1-4683-b9fc-6e026532376b-kube-api-access-zw528\") pod \"ingress-canary-mmqsc\" (UID: \"b713204a-efc1-4683-b9fc-6e026532376b\") " 
pod="openshift-ingress-canary/ingress-canary-mmqsc" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.622852 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a3fdfa4-f822-4da3-af90-ee5134c82485-apiservice-cert\") pod \"packageserver-d55dfcdfc-gxkvw\" (UID: \"3a3fdfa4-f822-4da3-af90-ee5134c82485\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.623858 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8e654b5-d104-4c65-9653-d1f1c55c3c7e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xbg5h\" (UID: \"f8e654b5-d104-4c65-9653-d1f1c55c3c7e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.625644 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9fab8078-6c33-4eac-8aa5-cbbfe197920b-certs\") pod \"machine-config-server-drfsj\" (UID: \"9fab8078-6c33-4eac-8aa5-cbbfe197920b\") " pod="openshift-machine-config-operator/machine-config-server-drfsj" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.636330 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a82b87a4-6069-4c33-9e0f-c2ef09d47781-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9pv9z\" (UID: \"a82b87a4-6069-4c33-9e0f-c2ef09d47781\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.651517 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n659\" (UniqueName: \"kubernetes.io/projected/3a3fdfa4-f822-4da3-af90-ee5134c82485-kube-api-access-2n659\") pod \"packageserver-d55dfcdfc-gxkvw\" (UID: \"3a3fdfa4-f822-4da3-af90-ee5134c82485\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.667244 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvjns\" (UniqueName: \"kubernetes.io/projected/2d999cbd-6286-44a8-ab6b-8358cf8ac970-kube-api-access-nvjns\") pod \"multus-admission-controller-857f4d67dd-v44jl\" (UID: \"2d999cbd-6286-44a8-ab6b-8358cf8ac970\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v44jl" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.667480 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh2qh\" (UniqueName: \"kubernetes.io/projected/f8e654b5-d104-4c65-9653-d1f1c55c3c7e-kube-api-access-qh2qh\") pod \"kube-storage-version-migrator-operator-b67b599dd-xbg5h\" (UID: \"f8e654b5-d104-4c65-9653-d1f1c55c3c7e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.687880 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqgwg\" (UniqueName: \"kubernetes.io/projected/280e109e-854a-430c-9960-6af6bba9dfdc-kube-api-access-qqgwg\") pod \"control-plane-machine-set-operator-78cbb6b69f-xgqvl\" (UID: \"280e109e-854a-430c-9960-6af6bba9dfdc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xgqvl" Dec 05 20:44:34 crc 
kubenswrapper[4747]: I1205 20:44:34.699070 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c64af899-04ea-4380-b287-4f3a2dc1bc5b-mountpoint-dir\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.699128 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/283b35ea-91c4-4185-a639-9a8a1a80aaa7-secret-volume\") pod \"collect-profiles-29416110-l4svr\" (UID: \"283b35ea-91c4-4185-a639-9a8a1a80aaa7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.699152 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7a9c2fc-a6db-4def-9938-f1da651566c8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-svs8d\" (UID: \"c7a9c2fc-a6db-4def-9938-f1da651566c8\") " pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.699178 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c64af899-04ea-4380-b287-4f3a2dc1bc5b-socket-dir\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.699196 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b-serving-cert\") pod \"service-ca-operator-777779d784-75r5s\" (UID: \"2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-75r5s" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.699267 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjlzb\" (UniqueName: \"kubernetes.io/projected/283b35ea-91c4-4185-a639-9a8a1a80aaa7-kube-api-access-pjlzb\") pod \"collect-profiles-29416110-l4svr\" (UID: \"283b35ea-91c4-4185-a639-9a8a1a80aaa7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.699295 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c64af899-04ea-4380-b287-4f3a2dc1bc5b-registration-dir\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.699320 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c64af899-04ea-4380-b287-4f3a2dc1bc5b-plugins-dir\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.699351 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2ts6\" (UniqueName: \"kubernetes.io/projected/2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b-kube-api-access-d2ts6\") pod 
\"service-ca-operator-777779d784-75r5s\" (UID: \"2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-75r5s" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.699388 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.699449 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c64af899-04ea-4380-b287-4f3a2dc1bc5b-csi-data-dir\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.699477 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/283b35ea-91c4-4185-a639-9a8a1a80aaa7-config-volume\") pod \"collect-profiles-29416110-l4svr\" (UID: \"283b35ea-91c4-4185-a639-9a8a1a80aaa7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.699526 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbdkj\" (UniqueName: \"kubernetes.io/projected/c7a9c2fc-a6db-4def-9938-f1da651566c8-kube-api-access-tbdkj\") pod \"marketplace-operator-79b997595-svs8d\" (UID: \"c7a9c2fc-a6db-4def-9938-f1da651566c8\") " pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.699558 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jqzn\" (UniqueName: \"kubernetes.io/projected/c64af899-04ea-4380-b287-4f3a2dc1bc5b-kube-api-access-6jqzn\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.699618 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b-config\") pod \"service-ca-operator-777779d784-75r5s\" (UID: \"2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-75r5s" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.699642 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7a9c2fc-a6db-4def-9938-f1da651566c8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-svs8d\" (UID: \"c7a9c2fc-a6db-4def-9938-f1da651566c8\") " pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.700957 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7a9c2fc-a6db-4def-9938-f1da651566c8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-svs8d\" (UID: \"c7a9c2fc-a6db-4def-9938-f1da651566c8\") " pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" Dec 05 20:44:34 crc 
kubenswrapper[4747]: I1205 20:44:34.701060 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mmqsc" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.701393 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c64af899-04ea-4380-b287-4f3a2dc1bc5b-mountpoint-dir\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.715773 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/283b35ea-91c4-4185-a639-9a8a1a80aaa7-config-volume\") pod \"collect-profiles-29416110-l4svr\" (UID: \"283b35ea-91c4-4185-a639-9a8a1a80aaa7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr" Dec 05 20:44:34 crc kubenswrapper[4747]: E1205 20:44:34.716030 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:35.216018928 +0000 UTC m=+145.683326416 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.716103 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c64af899-04ea-4380-b287-4f3a2dc1bc5b-csi-data-dir\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.716453 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c64af899-04ea-4380-b287-4f3a2dc1bc5b-socket-dir\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.716504 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c64af899-04ea-4380-b287-4f3a2dc1bc5b-registration-dir\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.716535 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c64af899-04ea-4380-b287-4f3a2dc1bc5b-plugins-dir\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.719550 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b-serving-cert\") pod 
\"service-ca-operator-777779d784-75r5s\" (UID: \"2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-75r5s" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.727966 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7a9c2fc-a6db-4def-9938-f1da651566c8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-svs8d\" (UID: \"c7a9c2fc-a6db-4def-9938-f1da651566c8\") " pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.732097 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b-config\") pod \"service-ca-operator-777779d784-75r5s\" (UID: \"2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-75r5s" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.742726 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9jq4j"] Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.744125 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk56w\" (UniqueName: \"kubernetes.io/projected/001b570f-e334-4e40-9014-857430c089be-kube-api-access-bk56w\") pod \"olm-operator-6b444d44fb-vr9tk\" (UID: \"001b570f-e334-4e40-9014-857430c089be\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.744211 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/283b35ea-91c4-4185-a639-9a8a1a80aaa7-secret-volume\") pod \"collect-profiles-29416110-l4svr\" (UID: \"283b35ea-91c4-4185-a639-9a8a1a80aaa7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.744788 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnjz8\" (UniqueName: \"kubernetes.io/projected/fccb227e-229a-4054-a0ad-43dc8aa9f0a3-kube-api-access-mnjz8\") pod \"package-server-manager-789f6589d5-ncr5s\" (UID: \"fccb227e-229a-4054-a0ad-43dc8aa9f0a3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.758018 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a82b87a4-6069-4c33-9e0f-c2ef09d47781-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9pv9z\" (UID: \"a82b87a4-6069-4c33-9e0f-c2ef09d47781\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z" Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.760463 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-v44jl"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.764136 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-bound-sa-token\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.766549 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjf79\" (UniqueName: \"kubernetes.io/projected/9fab8078-6c33-4eac-8aa5-cbbfe197920b-kube-api-access-rjf79\") pod \"machine-config-server-drfsj\" (UID: \"9fab8078-6c33-4eac-8aa5-cbbfe197920b\") " pod="openshift-machine-config-operator/machine-config-server-drfsj"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.776922 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-drfsj"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.790659 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" event={"ID":"3ee658b1-3098-41d3-89c1-eec71d92d82e","Type":"ContainerStarted","Data":"9fe48947d35cac162b98900e0d01b8fa6a6298b3de83f5a6208e9b62e9a68ffb"}
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.790717 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" event={"ID":"3ee658b1-3098-41d3-89c1-eec71d92d82e","Type":"ContainerStarted","Data":"ba0f3a2211a0c3431bf5c9c4930b3a06c404a35ab52c8851c8e402398e883396"}
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.795541 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2577\" (UniqueName: \"kubernetes.io/projected/e6f561cd-d447-44b3-9680-d2a6938e0a7b-kube-api-access-w2577\") pod \"catalog-operator-68c6474976-5zzxf\" (UID: \"e6f561cd-d447-44b3-9680-d2a6938e0a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf"
Dec 05 20:44:34 crc kubenswrapper[4747]: W1205 20:44:34.796505 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72d9939d_e77a_469d_9c50_32bc6bfdf498.slice/crio-d739a3f106ab4886a0e7e3888d3bfe4cbf0671332a10644981f6f53f9f27b321 WatchSource:0}: Error finding container d739a3f106ab4886a0e7e3888d3bfe4cbf0671332a10644981f6f53f9f27b321: Status 404 returned error can't find the container with id d739a3f106ab4886a0e7e3888d3bfe4cbf0671332a10644981f6f53f9f27b321
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.806046 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:34 crc kubenswrapper[4747]: E1205 20:44:34.806535 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:35.306516558 +0000 UTC m=+145.773824046 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.806641 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.818479 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2bqk\" (UniqueName: \"kubernetes.io/projected/34d2d560-7ce6-4a88-8637-b99f3c3c367a-kube-api-access-t2bqk\") pod \"service-ca-9c57cc56f-glgf7\" (UID: \"34d2d560-7ce6-4a88-8637-b99f3c3c367a\") " pod="openshift-service-ca/service-ca-9c57cc56f-glgf7"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.824595 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq"]
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.836619 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" event={"ID":"82d08f0b-8d21-4b31-b88f-44d5578c1f03","Type":"ContainerStarted","Data":"263376ac2bca005894c6882116c3457ffcad0f7d0c82ec992b24571e79433387"}
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.844889 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.845152 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-glgf7"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.854415 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/742f316e-5676-4459-bfd5-411abe809f23-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ssb48\" (UID: \"742f316e-5676-4459-bfd5-411abe809f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.854940 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b75cx\" (UniqueName: \"kubernetes.io/projected/448cf9b6-9081-40c4-8984-a75fb61cd5dd-kube-api-access-b75cx\") pod \"migrator-59844c95c7-65jzc\" (UID: \"448cf9b6-9081-40c4-8984-a75fb61cd5dd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzc"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.866508 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xgqvl"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.867100 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t" event={"ID":"506d2ee8-fd84-4d9d-8d72-7253e237012a","Type":"ContainerStarted","Data":"b0c3e593fb81409012214381aed37f1fccc37b470c3367683c78ea67caac05e4"}
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.876612 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2p8p6" event={"ID":"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa","Type":"ContainerStarted","Data":"583d9fa1bb04e6e7ab32e233627259233bcfc5d961caa4de2c5d4b81aa42a2c5"}
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.876653 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2p8p6" event={"ID":"facd7fa0-d8a3-4fd4-a8c9-64578d2779aa","Type":"ContainerStarted","Data":"1db513342f8f413aa94cacaf5605065cf7ee17b000cf23d07d77a0c89550b7f0"}
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.899724 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2ts6\" (UniqueName: \"kubernetes.io/projected/2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b-kube-api-access-d2ts6\") pod \"service-ca-operator-777779d784-75r5s\" (UID: \"2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-75r5s"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.903136 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jqzn\" (UniqueName: \"kubernetes.io/projected/c64af899-04ea-4380-b287-4f3a2dc1bc5b-kube-api-access-6jqzn\") pod \"csi-hostpathplugin-ddtpz\" (UID: \"c64af899-04ea-4380-b287-4f3a2dc1bc5b\") " pod="hostpath-provisioner/csi-hostpathplugin-ddtpz"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.906285 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzc"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.908203 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:34 crc kubenswrapper[4747]: E1205 20:44:34.911386 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:35.411367269 +0000 UTC m=+145.878674757 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.956473 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbdkj\" (UniqueName: \"kubernetes.io/projected/c7a9c2fc-a6db-4def-9938-f1da651566c8-kube-api-access-tbdkj\") pod \"marketplace-operator-79b997595-svs8d\" (UID: \"c7a9c2fc-a6db-4def-9938-f1da651566c8\") " pod="openshift-marketplace/marketplace-operator-79b997595-svs8d"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.964607 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.968437 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjlzb\" (UniqueName: \"kubernetes.io/projected/283b35ea-91c4-4185-a639-9a8a1a80aaa7-kube-api-access-pjlzb\") pod \"collect-profiles-29416110-l4svr\" (UID: \"283b35ea-91c4-4185-a639-9a8a1a80aaa7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.983874 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf"
Dec 05 20:44:34 crc kubenswrapper[4747]: I1205 20:44:34.997113 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.005090 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-svs8d"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.009511 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:35 crc kubenswrapper[4747]: E1205 20:44:35.011623 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:35.51159694 +0000 UTC m=+145.978904428 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.022848 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.038874 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-75r5s"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.040333 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96"]
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.053960 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ddtpz"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.109409 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.114567 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.133095 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-cdhgg" podStartSLOduration=127.133071875 podStartE2EDuration="2m7.133071875s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:35.128885398 +0000 UTC m=+145.596192886" watchObservedRunningTime="2025-12-05 20:44:35.133071875 +0000 UTC m=+145.600379363"
Dec 05 20:44:35 crc kubenswrapper[4747]: E1205 20:44:35.134716 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:35.63466608 +0000 UTC m=+146.101973568 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.155424 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.227931 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:35 crc kubenswrapper[4747]: E1205 20:44:35.228274 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:35.728260506 +0000 UTC m=+146.195567994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.284132 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h25fx" podStartSLOduration=127.284112717 podStartE2EDuration="2m7.284112717s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:35.277611925 +0000 UTC m=+145.744919403" watchObservedRunningTime="2025-12-05 20:44:35.284112717 +0000 UTC m=+145.751420205"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.302455 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-2p8p6"
Dec 05 20:44:35 crc kubenswrapper[4747]: W1205 20:44:35.302993 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc77cec54_eef5_4a5b_acc7_395a55ec4be1.slice/crio-dfb431375714664c2bfa91e6ec2e84a9483317e53b5f3e90a10bec4a542c17a4 WatchSource:0}: Error finding container dfb431375714664c2bfa91e6ec2e84a9483317e53b5f3e90a10bec4a542c17a4: Status 404 returned error can't find the container with id dfb431375714664c2bfa91e6ec2e84a9483317e53b5f3e90a10bec4a542c17a4
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.324333 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" podStartSLOduration=127.324317191 podStartE2EDuration="2m7.324317191s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:35.323661153 +0000 UTC m=+145.790968661" watchObservedRunningTime="2025-12-05 20:44:35.324317191 +0000 UTC m=+145.791624679"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.326782 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc"]
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.329683 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:35 crc kubenswrapper[4747]: E1205 20:44:35.330058 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:35.830038671 +0000 UTC m=+146.297346159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.355474 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d88cp"]
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.401125 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-fkw6v" podStartSLOduration=127.401104287 podStartE2EDuration="2m7.401104287s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:35.357168689 +0000 UTC m=+145.824476167" watchObservedRunningTime="2025-12-05 20:44:35.401104287 +0000 UTC m=+145.868411775"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.430930 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:35 crc kubenswrapper[4747]: E1205 20:44:35.431294 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:35.931279871 +0000 UTC m=+146.398587359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.514291 4747 patch_prober.go:28] interesting pod/router-default-5444994796-2p8p6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 20:44:35 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld
Dec 05 20:44:35 crc kubenswrapper[4747]: [+]process-running ok
Dec 05 20:44:35 crc kubenswrapper[4747]: healthz check failed
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.514344 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2p8p6" podUID="facd7fa0-d8a3-4fd4-a8c9-64578d2779aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.537396 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:35 crc kubenswrapper[4747]: E1205 20:44:35.537785 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:36.037773957 +0000 UTC m=+146.505081445 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.573294 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-hrsvp" podStartSLOduration=127.573272079 podStartE2EDuration="2m7.573272079s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:35.568385413 +0000 UTC m=+146.035692901" watchObservedRunningTime="2025-12-05 20:44:35.573272079 +0000 UTC m=+146.040579567"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.633516 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rb4b9" podStartSLOduration=127.633498193 podStartE2EDuration="2m7.633498193s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:35.619032308 +0000 UTC m=+146.086339796" watchObservedRunningTime="2025-12-05 20:44:35.633498193 +0000 UTC m=+146.100805681"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.639044 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:35 crc kubenswrapper[4747]: E1205 20:44:35.640108 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:36.140081367 +0000 UTC m=+146.607388855 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.741047 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:35 crc kubenswrapper[4747]: E1205 20:44:35.741496 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:36.241480391 +0000 UTC m=+146.708787879 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.842975 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:35 crc kubenswrapper[4747]: E1205 20:44:35.843150 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:36.343121412 +0000 UTC m=+146.810428900 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.843220 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.843297 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.843357 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.843403 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.843447 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:44:35 crc kubenswrapper[4747]: E1205 20:44:35.844366 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:36.344356676 +0000 UTC m=+146.811664164 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.846511 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.863602 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.877252 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.880956 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.886204 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.945213 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:35 crc kubenswrapper[4747]: E1205 20:44:35.949964 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:36.449930537 +0000 UTC m=+146.917238025 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.950892 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:35 crc kubenswrapper[4747]: E1205 20:44:35.951334 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:36.451323016 +0000 UTC m=+146.918630504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.956735 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6"]
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.956801 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" event={"ID":"699936b9-cebb-4705-8e8e-abfeae4705f6","Type":"ContainerStarted","Data":"f0df0806603c5366b0e06bee12ae057c3fc1fda5aea1aa610c42709925290a37"}
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.956827 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t" event={"ID":"506d2ee8-fd84-4d9d-8d72-7253e237012a","Type":"ContainerStarted","Data":"b13248ef74440061b1941b5f6c81156dc56d15b0da6e83528648fd00d5f2f056"}
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.978331 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mgkmr"]
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.989365 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96" event={"ID":"c77cec54-eef5-4a5b-acc7-395a55ec4be1","Type":"ContainerStarted","Data":"5b8e6cb6f43a932ccfb399a8d751389651eb3313113f30c6088aea03285066f7"}
Dec 05 20:44:35 crc kubenswrapper[4747]: I1205 20:44:35.989410 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96" event={"ID":"c77cec54-eef5-4a5b-acc7-395a55ec4be1","Type":"ContainerStarted","Data":"dfb431375714664c2bfa91e6ec2e84a9483317e53b5f3e90a10bec4a542c17a4"}
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.016514 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d88cp" event={"ID":"b672eb13-d702-4613-90d2-5d2406923876","Type":"ContainerStarted","Data":"0ef2b590ba3be750ff8faee9a2a4adc4c6578b8a7bdd8315fd1b270a724691ad"}
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.051892 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:36 crc kubenswrapper[4747]: E1205 20:44:36.052337 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:36.552311689 +0000 UTC m=+147.019619177 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.052377 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:36 crc kubenswrapper[4747]: E1205 20:44:36.052717 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:36.55269703 +0000 UTC m=+147.020004518 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.067409 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.068061 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-drfsj" event={"ID":"9fab8078-6c33-4eac-8aa5-cbbfe197920b","Type":"ContainerStarted","Data":"1fadbaac9a385be7bce9f221190b68a7fb746e1905caa694640fef8564679b69"}
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.068099 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-drfsj" event={"ID":"9fab8078-6c33-4eac-8aa5-cbbfe197920b","Type":"ContainerStarted","Data":"1e9ab613c8d52d329178603384006c6cd585e86d74f8441475ad0dd5c0084cf9"}
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.084407 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.110285 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv" podStartSLOduration=128.110256578 podStartE2EDuration="2m8.110256578s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:36.072778081 +0000 UTC m=+146.540085929" watchObservedRunningTime="2025-12-05 20:44:36.110256578 +0000 UTC m=+146.577564066"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.111283 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j" event={"ID":"72d9939d-e77a-469d-9c50-32bc6bfdf498","Type":"ContainerStarted","Data":"c2d6704e141a6818a66d98cd7ca12d28c3d42c61b9d35bf712bd8aaa9fc25468"}
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.111331 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j" event={"ID":"72d9939d-e77a-469d-9c50-32bc6bfdf498","Type":"ContainerStarted","Data":"d739a3f106ab4886a0e7e3888d3bfe4cbf0671332a10644981f6f53f9f27b321"}
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.114526 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq" event={"ID":"8a225e30-e0a1-4eab-8172-890abdc72200","Type":"ContainerStarted","Data":"c14fe0caa5d52e3eb835ab9587d5df9ab82cd27fee2e9aa718edbfe99bb9ebd1"}
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.135852 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f8hsv"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.153397 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:36 crc kubenswrapper[4747]: E1205 20:44:36.155331 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:36.655314328 +0000 UTC m=+147.122621816 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.178941 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xx4lj"]
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.223795 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.224241 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.255647 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:36 crc kubenswrapper[4747]: E1205 20:44:36.261101 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:36.761063864 +0000 UTC m=+147.228371352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.328150 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pldz6" podStartSLOduration=129.328124718 podStartE2EDuration="2m9.328124718s" podCreationTimestamp="2025-12-05 20:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:36.327527751 +0000 UTC m=+146.794835239" watchObservedRunningTime="2025-12-05 20:44:36.328124718 +0000 UTC m=+146.795432216"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.360272 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:36 crc kubenswrapper[4747]: E1205 20:44:36.360869 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:36.860841273 +0000 UTC m=+147.328148771 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.369622 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cxdgd" podStartSLOduration=128.369594137 podStartE2EDuration="2m8.369594137s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:36.36647556 +0000 UTC m=+146.833783048" watchObservedRunningTime="2025-12-05 20:44:36.369594137 +0000 UTC m=+146.836901625"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.379745 4747 patch_prober.go:28] interesting pod/router-default-5444994796-2p8p6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 20:44:36 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld
Dec 05 20:44:36 crc kubenswrapper[4747]: [+]process-running ok
Dec 05 20:44:36 crc kubenswrapper[4747]: healthz check failed
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.379813 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2p8p6" podUID="facd7fa0-d8a3-4fd4-a8c9-64578d2779aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.424857 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" podStartSLOduration=128.424820501 podStartE2EDuration="2m8.424820501s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:36.422749283 +0000 UTC m=+146.890056771" watchObservedRunningTime="2025-12-05 20:44:36.424820501 +0000 UTC m=+146.892127989"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.487994 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:36 crc kubenswrapper[4747]: E1205 20:44:36.489382 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:36.989364555 +0000 UTC m=+147.456672043 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.594689 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:36 crc kubenswrapper[4747]: E1205 20:44:36.598066 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:37.098046113 +0000 UTC m=+147.565353601 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.611198 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2q84t" podStartSLOduration=128.61117491 podStartE2EDuration="2m8.61117491s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:36.607696922 +0000 UTC m=+147.075004410" watchObservedRunningTime="2025-12-05 20:44:36.61117491 +0000 UTC m=+147.078482398"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.648076 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-9jq4j" podStartSLOduration=128.64805495 podStartE2EDuration="2m8.64805495s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:36.646883208 +0000 UTC m=+147.114190696" watchObservedRunningTime="2025-12-05 20:44:36.64805495 +0000 UTC m=+147.115362438"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.679593 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-drfsj" podStartSLOduration=5.679553101 podStartE2EDuration="5.679553101s" podCreationTimestamp="2025-12-05 20:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:36.678227404 +0000 UTC m=+147.145534892" watchObservedRunningTime="2025-12-05 20:44:36.679553101 +0000 UTC m=+147.146860589"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.700958 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" podStartSLOduration=128.700932058 podStartE2EDuration="2m8.700932058s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:36.699363424 +0000 UTC m=+147.166670912" watchObservedRunningTime="2025-12-05 20:44:36.700932058 +0000 UTC m=+147.168239546"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.702554 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:36 crc kubenswrapper[4747]: E1205 20:44:36.703079 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:37.203058778 +0000 UTC m=+147.670366266 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.761509 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-2p8p6" podStartSLOduration=128.761481811 podStartE2EDuration="2m8.761481811s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:36.744924478 +0000 UTC m=+147.212231966" watchObservedRunningTime="2025-12-05 20:44:36.761481811 +0000 UTC m=+147.228789289"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.764795 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm"]
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.796778 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-f8pt2"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.797220 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-f8pt2"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.802442 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" podStartSLOduration=127.802429145 podStartE2EDuration="2m7.802429145s" podCreationTimestamp="2025-12-05 20:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:36.802030324 +0000 UTC m=+147.269337812" watchObservedRunningTime="2025-12-05 20:44:36.802429145 +0000 UTC m=+147.269736633"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.803382 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:36 crc kubenswrapper[4747]: E1205 20:44:36.803789 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:37.303771573 +0000 UTC m=+147.771079061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.840036 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h"]
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.850825 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mmqsc"]
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.872409 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.872850 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r"
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.875552 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzc"]
Dec 05 20:44:36 crc kubenswrapper[4747]: I1205 20:44:36.907533 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:36 crc kubenswrapper[4747]: E1205 20:44:36.908507 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:37.408475419 +0000 UTC m=+147.875783077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.008948 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:37 crc kubenswrapper[4747]: E1205 20:44:37.009131 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:37.509104712 +0000 UTC m=+147.976412200 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.009556 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:37 crc kubenswrapper[4747]: E1205 20:44:37.009886 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:37.509877604 +0000 UTC m=+147.977185092 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:37 crc kubenswrapper[4747]: W1205 20:44:37.072126 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2003fdd_5c5a_4f60_9920_bd2e736a0f2e.slice/crio-4107e04495d261349e59227b12b5526933402ac294c3607df5154635967250cc WatchSource:0}: Error finding container 4107e04495d261349e59227b12b5526933402ac294c3607df5154635967250cc: Status 404 returned error can't find the container with id 4107e04495d261349e59227b12b5526933402ac294c3607df5154635967250cc
Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.077344 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r"
Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.078825 4747 patch_prober.go:28] interesting pod/apiserver-76f77b778f-f8pt2 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Dec 05 20:44:37 crc kubenswrapper[4747]: [+]log ok
Dec 05 20:44:37 crc kubenswrapper[4747]: [+]etcd ok
Dec 05 20:44:37 crc kubenswrapper[4747]: [+]poststarthook/start-apiserver-admission-initializer ok
Dec 05 20:44:37 crc kubenswrapper[4747]: [+]poststarthook/generic-apiserver-start-informers ok
Dec 05 20:44:37 crc kubenswrapper[4747]: [+]poststarthook/max-in-flight-filter ok
Dec 05 20:44:37 crc kubenswrapper[4747]: [+]poststarthook/storage-object-count-tracker-hook ok
Dec 05 20:44:37 crc kubenswrapper[4747]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Dec 05 20:44:37 crc kubenswrapper[4747]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Dec 05 20:44:37 crc kubenswrapper[4747]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Dec 05 20:44:37 crc kubenswrapper[4747]: [+]poststarthook/project.openshift.io-projectcache ok
Dec 05 20:44:37 crc kubenswrapper[4747]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Dec 05 20:44:37 crc kubenswrapper[4747]: [+]poststarthook/openshift.io-startinformers ok
Dec 05 20:44:37 crc kubenswrapper[4747]: [+]poststarthook/openshift.io-restmapperupdater ok
Dec 05 20:44:37 crc kubenswrapper[4747]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Dec 05 20:44:37 crc kubenswrapper[4747]: livez check failed
Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.078876 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" podUID="3ee658b1-3098-41d3-89c1-eec71d92d82e" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.110760 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:37 crc kubenswrapper[4747]: E1205 20:44:37.111241 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:37.611225716 +0000 UTC m=+148.078533204 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.213022 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xx4lj" event={"ID":"342bdc3a-ef90-4837-b0d2-e5e9bc63b821","Type":"ContainerStarted","Data":"71afa26c34b0261a67550a5f9866fb2b89527f106b473caedf3fc450ca0031e0"}
Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.215545 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:37 crc kubenswrapper[4747]: E1205 20:44:37.216026 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:37.716011075 +0000 UTC m=+148.183318563 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.226727 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzc" event={"ID":"448cf9b6-9081-40c4-8984-a75fb61cd5dd","Type":"ContainerStarted","Data":"608124d6cabe548ee66043f251151451e7cfe3b3505bfd692341f87c1f22f706"} Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.252023 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6" event={"ID":"1caa95ad-0f17-4c3c-a336-2398afe5db0e","Type":"ContainerStarted","Data":"fe690950da374d5b58a0e3965a34710a1aa3147499e55ab397a8c1c5b1b85efd"} Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.252065 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6" event={"ID":"1caa95ad-0f17-4c3c-a336-2398afe5db0e","Type":"ContainerStarted","Data":"f9d4404f03d8fe26ea9a8432e18a1008b70113b172e4bd2cf1e28b75da47df2f"} Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.266390 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr" event={"ID":"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5","Type":"ContainerStarted","Data":"129eeb1a4a3d36c7b4b37fac2b01af74183334029193dde8160e30f7089a3c3c"} Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.266459 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr" event={"ID":"5f2e3c68-b51d-4b66-8d73-ff1c4a61cef5","Type":"ContainerStarted","Data":"6317a58f1965516f582d3cfadff54082d0841a8200564be715aeaca0c7c6df4e"} Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.292379 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96" event={"ID":"c77cec54-eef5-4a5b-acc7-395a55ec4be1","Type":"ContainerStarted","Data":"fe4cd18e87073116d606e3cd9fea20bcc2d980f3beaeaf73a7e0a6b54c7559ab"} Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.320913 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cc2b6" podStartSLOduration=129.320883226 podStartE2EDuration="2m9.320883226s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:37.278909742 +0000 UTC m=+147.746217230" watchObservedRunningTime="2025-12-05 20:44:37.320883226 +0000 UTC m=+147.788190714" Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.321654 4747 patch_prober.go:28] interesting pod/router-default-5444994796-2p8p6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:44:37 crc 
kubenswrapper[4747]: [-]has-synced failed: reason withheld Dec 05 20:44:37 crc kubenswrapper[4747]: [+]process-running ok Dec 05 20:44:37 crc kubenswrapper[4747]: healthz check failed Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.321723 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2p8p6" podUID="facd7fa0-d8a3-4fd4-a8c9-64578d2779aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.322511 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.324138 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-mgkmr" podStartSLOduration=129.324127876 podStartE2EDuration="2m9.324127876s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:37.322202852 +0000 UTC m=+147.789510360" watchObservedRunningTime="2025-12-05 20:44:37.324127876 +0000 UTC m=+147.791435364" Dec 05 20:44:37 crc kubenswrapper[4747]: E1205 20:44:37.325442 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:37.825421802 +0000 UTC m=+148.292729290 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.331495 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xgqvl"] Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.337337 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d88cp" event={"ID":"b672eb13-d702-4613-90d2-5d2406923876","Type":"ContainerStarted","Data":"f40f055cfc9de5ed02a791ff59f297a1d96a4b765ac2d95109c2d00f0f4ebe56"} Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.361267 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7cr96" podStartSLOduration=129.361237613 podStartE2EDuration="2m9.361237613s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:37.359407472 +0000 UTC m=+147.826714960" watchObservedRunningTime="2025-12-05 20:44:37.361237613 +0000 UTC m=+147.828545261" Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.361371 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf"] Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.378403 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h" event={"ID":"f8e654b5-d104-4c65-9653-d1f1c55c3c7e","Type":"ContainerStarted","Data":"b7ebf6e27bc73fb39b3c3c59f4678b907a2ed6c333d58c44bc18690e895a2b17"} Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.394310 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq" event={"ID":"8a225e30-e0a1-4eab-8172-890abdc72200","Type":"ContainerStarted","Data":"6718df08d591235c9f860fe07bfbb1e6d1f8ca98024f7a56bbce732dd9e9f5e5"} Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.402076 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" event={"ID":"699936b9-cebb-4705-8e8e-abfeae4705f6","Type":"ContainerStarted","Data":"2ef0efd34de647a838736ff6e757a3edb9666114046cde0b761c50d44cb104c7"} Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.402167 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" event={"ID":"699936b9-cebb-4705-8e8e-abfeae4705f6","Type":"ContainerStarted","Data":"bdd0b5922f48afeff9c56f6d88b9eada56a2599d7726ad78a92dafac12533142"} Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.404645 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-d88cp" podStartSLOduration=129.404622146 podStartE2EDuration="2m9.404622146s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:37.404137283 +0000 UTC m=+147.871444771" watchObservedRunningTime="2025-12-05 20:44:37.404622146 +0000 UTC m=+147.871929634" Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.423720 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm" event={"ID":"d2003fdd-5c5a-4f60-9920-bd2e736a0f2e","Type":"ContainerStarted","Data":"4107e04495d261349e59227b12b5526933402ac294c3607df5154635967250cc"} Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.427732 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:37 crc kubenswrapper[4747]: E1205 20:44:37.438804 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:37.938777561 +0000 UTC m=+148.406085049 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.441756 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw"] Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.451065 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mmqsc" event={"ID":"b713204a-efc1-4683-b9fc-6e026532376b","Type":"ContainerStarted","Data":"6bd99e410abd78423e8c838695abbf28e88055fc25ddee5d3e4b0e2ee2c82a0d"} Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.473836 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mjxxq" podStartSLOduration=129.47381903 podStartE2EDuration="2m9.47381903s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:37.452738691 +0000 UTC m=+147.920046209" watchObservedRunningTime="2025-12-05 20:44:37.47381903 +0000 UTC m=+147.941126518" Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.474363 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-glgf7"] Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.474466 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsm2r" Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.494375 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk"] Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.519648 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-v44jl"] Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.529423 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:44:37 crc kubenswrapper[4747]: E1205 20:44:37.531161 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:38.031138982 +0000 UTC m=+148.498446470 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.541099 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s47qc" podStartSLOduration=129.54107947 podStartE2EDuration="2m9.54107947s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:37.516797952 +0000 UTC m=+147.984105450" watchObservedRunningTime="2025-12-05 20:44:37.54107947 +0000 UTC m=+148.008386958" Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.562491 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr"] Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.571351 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z"] Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.621618 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ddtpz"] Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.631400 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:37 crc kubenswrapper[4747]: E1205 20:44:37.631672 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:38.131661462 +0000 UTC m=+148.598968940 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:37 crc kubenswrapper[4747]: W1205 20:44:37.650642 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod283b35ea_91c4_4185_a639_9a8a1a80aaa7.slice/crio-f04ac640de5704c439d9e732ec6b862c19153c2f9116213d8f7842c70e575825 WatchSource:0}: Error finding container f04ac640de5704c439d9e732ec6b862c19153c2f9116213d8f7842c70e575825: Status 404 returned error can't find the container with id f04ac640de5704c439d9e732ec6b862c19153c2f9116213d8f7842c70e575825 Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.653526 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-75r5s"] Dec 05 20:44:37 crc kubenswrapper[4747]: W1205 20:44:37.716036 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dab3d7d_2c4d_41e8_81ef_2fb18e258b4b.slice/crio-75d4b7d0efba91c5e14d54cd294970e44d20891015a1fa1b1a6ee97b47e0402c WatchSource:0}: Error finding container 75d4b7d0efba91c5e14d54cd294970e44d20891015a1fa1b1a6ee97b47e0402c: Status 404 returned error can't find the container with id 75d4b7d0efba91c5e14d54cd294970e44d20891015a1fa1b1a6ee97b47e0402c Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.732475 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:44:37 crc kubenswrapper[4747]: E1205 20:44:37.732902 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:38.232881851 +0000 UTC m=+148.700189339 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.747434 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svs8d"] Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.763375 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48"] Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.834314 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s"] Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.835909 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:37 crc kubenswrapper[4747]: E1205 20:44:37.836322 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:38.336299302 +0000 UTC m=+148.803606790 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:37 crc kubenswrapper[4747]: W1205 20:44:37.851733 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-222f032618af5ca35dc4c160932c3a11c335e16a8097a23a88be729cab2e9cec WatchSource:0}: Error finding container 222f032618af5ca35dc4c160932c3a11c335e16a8097a23a88be729cab2e9cec: Status 404 returned error can't find the container with id 222f032618af5ca35dc4c160932c3a11c335e16a8097a23a88be729cab2e9cec Dec 05 20:44:37 crc kubenswrapper[4747]: I1205 20:44:37.937194 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:44:37 crc kubenswrapper[4747]: E1205 20:44:37.937491 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 20:44:38.43747449 +0000 UTC m=+148.904781978 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.038811 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:38 crc kubenswrapper[4747]: E1205 20:44:38.039643 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:38.539631225 +0000 UTC m=+149.006938713 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.140636 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:44:38 crc kubenswrapper[4747]: E1205 20:44:38.140823 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:38.640792583 +0000 UTC m=+149.108100071 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.152083 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:38 crc kubenswrapper[4747]: E1205 20:44:38.152542 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:38.652525651 +0000 UTC m=+149.119833149 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.253220 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:44:38 crc kubenswrapper[4747]: E1205 20:44:38.253542 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:38.753495003 +0000 UTC m=+149.220802501 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.253958 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:38 crc kubenswrapper[4747]: E1205 20:44:38.254412 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:38.754393998 +0000 UTC m=+149.221701486 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.299371 4747 patch_prober.go:28] interesting pod/router-default-5444994796-2p8p6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:44:38 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Dec 05 20:44:38 crc kubenswrapper[4747]: [+]process-running ok Dec 05 20:44:38 crc kubenswrapper[4747]: healthz check failed Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.299435 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2p8p6" podUID="facd7fa0-d8a3-4fd4-a8c9-64578d2779aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.356982 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:44:38 crc kubenswrapper[4747]: E1205 20:44:38.357159 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:38.85713399 +0000 UTC m=+149.324441478 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.358588 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:38 crc kubenswrapper[4747]: E1205 20:44:38.359092 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:38.859076204 +0000 UTC m=+149.326383692 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.463554 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:44:38 crc kubenswrapper[4747]: E1205 20:44:38.463770 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:38.963730789 +0000 UTC m=+149.431038277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.464281 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:38 crc kubenswrapper[4747]: E1205 20:44:38.464677 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:38.964658985 +0000 UTC m=+149.431966483 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.496162 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk" event={"ID":"001b570f-e334-4e40-9014-857430c089be","Type":"ContainerStarted","Data":"f41cf2faa5525338ef5b50eec8a696f113f8255247b14cd1bd07935011ee4137"} Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.496221 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk" event={"ID":"001b570f-e334-4e40-9014-857430c089be","Type":"ContainerStarted","Data":"54878088405d0e23951bad7b8d4f5a68d85e5db46277e0b7737add842774e20d"} Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.498047 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk" Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.512312 4747 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-vr9tk container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.512421 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk" podUID="001b570f-e334-4e40-9014-857430c089be" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.529164 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk" podStartSLOduration=129.529145758 podStartE2EDuration="2m9.529145758s" podCreationTimestamp="2025-12-05 20:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:38.528030126 +0000 UTC m=+148.995337614" watchObservedRunningTime="2025-12-05 20:44:38.529145758 +0000 UTC m=+148.996453246" Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.532206 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzc" event={"ID":"448cf9b6-9081-40c4-8984-a75fb61cd5dd","Type":"ContainerStarted","Data":"937175092d5d96caf265bb082aaf901e39db57c69b0bd845f1ae13ac2bdc1d17"} Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.532256 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzc" event={"ID":"448cf9b6-9081-40c4-8984-a75fb61cd5dd","Type":"ContainerStarted","Data":"f406fad95896aedc94b5f3702224c419197bb576c8500789392ff7a6670415cb"} Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.543126 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xx4lj" event={"ID":"342bdc3a-ef90-4837-b0d2-e5e9bc63b821","Type":"ContainerStarted","Data":"5aec3c3b0236d0781c96283046c91d9c0ad6e2ff7ed6b4e8b3c8fb89caf387a5"} Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.543374 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xx4lj" event={"ID":"342bdc3a-ef90-4837-b0d2-e5e9bc63b821","Type":"ContainerStarted","Data":"c370ce2beb06c00ed0055e18a698369b79591d032738f275aa1ecd08104e07aa"} Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.544433 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-xx4lj" Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.557046 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" event={"ID":"c7a9c2fc-a6db-4def-9938-f1da651566c8","Type":"ContainerStarted","Data":"5be40beabbf1a9fec4a9cf0e2a7e3daf3bdea4800a7ade444c8e2db3997135a9"} Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.557113 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" event={"ID":"c7a9c2fc-a6db-4def-9938-f1da651566c8","Type":"ContainerStarted","Data":"ca29204fd4b307545b2042c0e04a088767b3c22226a8f6a820896be352dc550d"} Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.558549 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.565788 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:44:38 crc kubenswrapper[4747]: E1205 20:44:38.567099 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 20:44:39.067081708 +0000 UTC m=+149.534389196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.568333 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xgqvl" event={"ID":"280e109e-854a-430c-9960-6af6bba9dfdc","Type":"ContainerStarted","Data":"1f4d947ebc8db195a09aeb2041cae9301a2f0efd5b1e4bdbc73ef61a7e7cf1da"} Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.568383 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xgqvl" event={"ID":"280e109e-854a-430c-9960-6af6bba9dfdc","Type":"ContainerStarted","Data":"4b6d0fc48f7840773e5929e9e28a90369d6f4dbbdd550409b43efea8ce1a8827"} Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.572053 4747 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-svs8d container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.572100 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" podUID="c7a9c2fc-a6db-4def-9938-f1da651566c8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.572835 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzc" podStartSLOduration=130.572824198 podStartE2EDuration="2m10.572824198s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:38.554318331 +0000 UTC m=+149.021625829" watchObservedRunningTime="2025-12-05 20:44:38.572824198 +0000 UTC m=+149.040131686" Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.573252 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xx4lj" podStartSLOduration=7.5732489 podStartE2EDuration="7.5732489s" podCreationTimestamp="2025-12-05 20:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:38.571851901 +0000 UTC m=+149.039159389" watchObservedRunningTime="2025-12-05 20:44:38.5732489 +0000 UTC m=+149.040556388" Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.592242 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z" event={"ID":"a82b87a4-6069-4c33-9e0f-c2ef09d47781","Type":"ContainerStarted","Data":"aa52bcfe63d040f2440a0594c01521910057d52bbb555c4e4f5aed75e8df5fb5"} Dec 05 20:44:38 crc 
kubenswrapper[4747]: I1205 20:44:38.592424 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z" event={"ID":"a82b87a4-6069-4c33-9e0f-c2ef09d47781","Type":"ContainerStarted","Data":"5e71f1c065091a02a40ac4a77b953e86a03ea651e189e498f9ef7106149ef7df"} Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.600512 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" podStartSLOduration=129.600479081 podStartE2EDuration="2m9.600479081s" podCreationTimestamp="2025-12-05 20:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:38.599915576 +0000 UTC m=+149.067223064" watchObservedRunningTime="2025-12-05 20:44:38.600479081 +0000 UTC m=+149.067786569" Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.669817 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-glgf7" event={"ID":"34d2d560-7ce6-4a88-8637-b99f3c3c367a","Type":"ContainerStarted","Data":"bbb21113776f37ed10c36312e2d17cf8df653de2f298e88fa4f73d09c2782b88"} Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.669873 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-glgf7" event={"ID":"34d2d560-7ce6-4a88-8637-b99f3c3c367a","Type":"ContainerStarted","Data":"ffbfeaecde71cf55a98c99d531780ad72361e5c5d3c9ba3cfc487d258c792a90"} Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.674687 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:44:38 crc kubenswrapper[4747]: E1205 20:44:38.675012 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:39.174998654 +0000 UTC m=+149.642306142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.683919 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mmqsc" event={"ID":"b713204a-efc1-4683-b9fc-6e026532376b","Type":"ContainerStarted","Data":"b43a5adaa15ca129a2e3717dc362571b3c050de62c0d94b75a5e3df18c744199"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.689741 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr" event={"ID":"283b35ea-91c4-4185-a639-9a8a1a80aaa7","Type":"ContainerStarted","Data":"1856838e2e010e8e7a549fc5049197714e46ca623a2063c93f8fb7229536ea4e"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.689776 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr" event={"ID":"283b35ea-91c4-4185-a639-9a8a1a80aaa7","Type":"ContainerStarted","Data":"f04ac640de5704c439d9e732ec6b862c19153c2f9116213d8f7842c70e575825"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.697147 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xgqvl" podStartSLOduration=130.697129263 podStartE2EDuration="2m10.697129263s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:38.687893065 +0000 UTC m=+149.155200553" watchObservedRunningTime="2025-12-05 20:44:38.697129263 +0000 UTC m=+149.164436751"
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.711196 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"cecaf599f673ac791c07d517a675346f5cc2b9d28531327bbf6273ce911846b0"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.711243 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"55ddf7d8cb0f8f32abc12661d031e8348130ea1077190f388bac37d7ef09cfca"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.713009 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" event={"ID":"c64af899-04ea-4380-b287-4f3a2dc1bc5b","Type":"ContainerStarted","Data":"d5dcf32a05b625c3a03d6dd7558bd0374a0832a056aba65f6e55ddc9b829b80f"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.713951 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" event={"ID":"3a3fdfa4-f822-4da3-af90-ee5134c82485","Type":"ContainerStarted","Data":"200f83458dbac89f6846de91d0f2777ae7107b6e03d48880c1a286d4df0790ab"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.713977 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" event={"ID":"3a3fdfa4-f822-4da3-af90-ee5134c82485","Type":"ContainerStarted","Data":"32200871943adabb8710cefe96682dcd6434adf5ba4896d7b9242384f0961f90"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.714747 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw"
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.715664 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-v44jl" event={"ID":"2d999cbd-6286-44a8-ab6b-8358cf8ac970","Type":"ContainerStarted","Data":"9007369bacc9e91242f242e8fb3881cdfa1a8bc74d0a5554524924328edce612"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.715690 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-v44jl" event={"ID":"2d999cbd-6286-44a8-ab6b-8358cf8ac970","Type":"ContainerStarted","Data":"82f98e68270e8f48e87ac9a61e39bdb2c81df233b0458ecb66cc219cc21b8402"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.725741 4747 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-gxkvw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" start-of-body=
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.725789 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" podUID="3a3fdfa4-f822-4da3-af90-ee5134c82485" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused"
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.759460 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d88cp" event={"ID":"b672eb13-d702-4613-90d2-5d2406923876","Type":"ContainerStarted","Data":"5255926292e35bc730a88b24fa64469839553fbea3ad8aa8b5bae0f2efd137fd"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.771920 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d1cbaad72a1b2ba45052e8da743f7614be8eaf21fe585e1d2aaf307a07219b22"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.771980 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"222f032618af5ca35dc4c160932c3a11c335e16a8097a23a88be729cab2e9cec"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.772208 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.773276 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-glgf7" podStartSLOduration=129.773253981 podStartE2EDuration="2m9.773253981s" podCreationTimestamp="2025-12-05 20:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:38.723655334 +0000 UTC m=+149.190962822" watchObservedRunningTime="2025-12-05 20:44:38.773253981 +0000 UTC m=+149.240561469"
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.775569 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:38 crc kubenswrapper[4747]: E1205 20:44:38.777981 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:39.277964432 +0000 UTC m=+149.745271920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.791921 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s" event={"ID":"fccb227e-229a-4054-a0ad-43dc8aa9f0a3","Type":"ContainerStarted","Data":"73b97fb580575dee0dc5d0bb13c8d5bb50fc030c55b584358deb244ae075a561"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.791970 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s" event={"ID":"fccb227e-229a-4054-a0ad-43dc8aa9f0a3","Type":"ContainerStarted","Data":"5f9ed986bfa10169883006cbd7e7a3773f0a6d271b6138ab74b7de61699fc882"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.806076 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pv9z" podStartSLOduration=130.806058348 podStartE2EDuration="2m10.806058348s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:38.769929978 +0000 UTC m=+149.237237466" watchObservedRunningTime="2025-12-05 20:44:38.806058348 +0000 UTC m=+149.273365836"
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.806478 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mmqsc" podStartSLOduration=7.806474329 podStartE2EDuration="7.806474329s" podCreationTimestamp="2025-12-05 20:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:38.803932538 +0000 UTC m=+149.271240016" watchObservedRunningTime="2025-12-05 20:44:38.806474329 +0000 UTC m=+149.273781817"
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.818847 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fddea942785e2da57cdb22dbc58d450dad71124bd9bf109096157556fdc57ddf"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.818896 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d1c960d1eec2cc189da161b294bb35285e44a658e991abb802473d26c333b1bb"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.835662 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf" event={"ID":"e6f561cd-d447-44b3-9680-d2a6938e0a7b","Type":"ContainerStarted","Data":"75ca8f9da4b0b93c0a38091086a01eadce539fd8e95f379d3be232f8d20a1efa"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.835706 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf" event={"ID":"e6f561cd-d447-44b3-9680-d2a6938e0a7b","Type":"ContainerStarted","Data":"784a1bf637dcd2300fbaf7df38d7542b9d4d93189476e97ac396ff9134f9d08f"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.836174 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr" podStartSLOduration=130.836163879 podStartE2EDuration="2m10.836163879s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:38.834737079 +0000 UTC m=+149.302044567" watchObservedRunningTime="2025-12-05 20:44:38.836163879 +0000 UTC m=+149.303471367"
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.836455 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf"
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.845259 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-75r5s" event={"ID":"2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b","Type":"ContainerStarted","Data":"cc77306cd9e85f1a7b4b2e214805927bf8cf06f77888f32aee55757d99c835e7"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.845306 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-75r5s" event={"ID":"2dab3d7d-2c4d-41e8-81ef-2fb18e258b4b","Type":"ContainerStarted","Data":"75d4b7d0efba91c5e14d54cd294970e44d20891015a1fa1b1a6ee97b47e0402c"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.855178 4747 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-5zzxf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body=
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.855247 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf" podUID="e6f561cd-d447-44b3-9680-d2a6938e0a7b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused"
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.867019 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48" event={"ID":"742f316e-5676-4459-bfd5-411abe809f23","Type":"ContainerStarted","Data":"c8e59b5ec7f052beb8c0f3d45b512e0a58e867dd67475ae682bdeceb4fa699d5"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.878304 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:38 crc kubenswrapper[4747]: E1205 20:44:38.879500 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:39.37948252 +0000 UTC m=+149.846790008 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.910856 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm" event={"ID":"d2003fdd-5c5a-4f60-9920-bd2e736a0f2e","Type":"ContainerStarted","Data":"8a72ec266a24fc3afe66181b778d2c662b1af49347a74177bed89b4dd10a5406"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.910903 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm" event={"ID":"d2003fdd-5c5a-4f60-9920-bd2e736a0f2e","Type":"ContainerStarted","Data":"8e90f1d8a56fac1d4d4bb478c77c9577f205af3bb04a143da53922e890f601e8"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.913887 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h" event={"ID":"f8e654b5-d104-4c65-9653-d1f1c55c3c7e","Type":"ContainerStarted","Data":"6988413a8d7572043e4768bbbcb1f4b5b716c48492609635ff552d750209dc33"}
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.974853 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw" podStartSLOduration=129.974834395 podStartE2EDuration="2m9.974834395s" podCreationTimestamp="2025-12-05 20:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:38.949232609 +0000 UTC m=+149.416540097" watchObservedRunningTime="2025-12-05 20:44:38.974834395 +0000 UTC m=+149.442141883"
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.976017 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf" podStartSLOduration=129.976010598 podStartE2EDuration="2m9.976010598s" podCreationTimestamp="2025-12-05 20:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:38.972431668 +0000 UTC m=+149.439739156" watchObservedRunningTime="2025-12-05 20:44:38.976010598 +0000 UTC m=+149.443318086"
Dec 05 20:44:38 crc kubenswrapper[4747]: I1205 20:44:38.979764 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:38 crc kubenswrapper[4747]: E1205 20:44:38.987407 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:39.487380506 +0000 UTC m=+149.954687984 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.045840 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xbg5h" podStartSLOduration=131.045827149 podStartE2EDuration="2m11.045827149s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:39.045074408 +0000 UTC m=+149.512381886" watchObservedRunningTime="2025-12-05 20:44:39.045827149 +0000 UTC m=+149.513134637"
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.086439 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:39 crc kubenswrapper[4747]: E1205 20:44:39.089883 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:39.58986715 +0000 UTC m=+150.057174638 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.092837 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bd4xm" podStartSLOduration=131.092822123 podStartE2EDuration="2m11.092822123s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:39.092172885 +0000 UTC m=+149.559480373" watchObservedRunningTime="2025-12-05 20:44:39.092822123 +0000 UTC m=+149.560129611"
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.093673 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-75r5s" podStartSLOduration=130.093667327 podStartE2EDuration="2m10.093667327s" podCreationTimestamp="2025-12-05 20:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:39.067099864 +0000 UTC m=+149.534407352" watchObservedRunningTime="2025-12-05 20:44:39.093667327 +0000 UTC m=+149.560974815"
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.117499 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48" podStartSLOduration=131.117479952 podStartE2EDuration="2m11.117479952s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:39.111750062 +0000 UTC m=+149.579057550" watchObservedRunningTime="2025-12-05 20:44:39.117479952 +0000 UTC m=+149.584787440"
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.190199 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:39 crc kubenswrapper[4747]: E1205 20:44:39.190509 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:39.690494093 +0000 UTC m=+150.157801581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.291516 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:39 crc kubenswrapper[4747]: E1205 20:44:39.291913 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:39.791896177 +0000 UTC m=+150.259203665 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.296983 4747 patch_prober.go:28] interesting pod/router-default-5444994796-2p8p6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 20:44:39 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld
Dec 05 20:44:39 crc kubenswrapper[4747]: [+]process-running ok
Dec 05 20:44:39 crc kubenswrapper[4747]: healthz check failed
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.297036 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2p8p6" podUID="facd7fa0-d8a3-4fd4-a8c9-64578d2779aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.393025 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:39 crc kubenswrapper[4747]: E1205 20:44:39.393265 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:39.89324624 +0000 UTC m=+150.360553728 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.393719 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:39 crc kubenswrapper[4747]: E1205 20:44:39.394150 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:39.894129815 +0000 UTC m=+150.361437303 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.495349 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:39 crc kubenswrapper[4747]: E1205 20:44:39.495564 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:39.995529399 +0000 UTC m=+150.462836897 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.495643 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:39 crc kubenswrapper[4747]: E1205 20:44:39.495962 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:39.995949321 +0000 UTC m=+150.463256809 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.597224 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:39 crc kubenswrapper[4747]: E1205 20:44:39.597403 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:40.097374056 +0000 UTC m=+150.564681544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.597963 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:39 crc kubenswrapper[4747]: E1205 20:44:39.598294 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:40.098279521 +0000 UTC m=+150.565587019 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.699509 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:39 crc kubenswrapper[4747]: E1205 20:44:39.699708 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:40.199679885 +0000 UTC m=+150.666987373 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.699760 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:39 crc kubenswrapper[4747]: E1205 20:44:39.700053 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:40.200041715 +0000 UTC m=+150.667349203 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.799289 4747 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.801266 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:39 crc kubenswrapper[4747]: E1205 20:44:39.801483 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:40.30145463 +0000 UTC m=+150.768762118 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.801601 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:39 crc kubenswrapper[4747]: E1205 20:44:39.802015 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:40.301996645 +0000 UTC m=+150.769304133 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.902730 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:39 crc kubenswrapper[4747]: E1205 20:44:39.902908 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:40.402881445 +0000 UTC m=+150.870188933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.903028 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:39 crc kubenswrapper[4747]: E1205 20:44:39.903373 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:44:40.403360168 +0000 UTC m=+150.870667656 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xxxxf" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.931084 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-v44jl" event={"ID":"2d999cbd-6286-44a8-ab6b-8358cf8ac970","Type":"ContainerStarted","Data":"1b4f5e475f722810edfc3ff0e403600e7697122c12fb3a88bf52d565971a66ef"}
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.940431 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" event={"ID":"c64af899-04ea-4380-b287-4f3a2dc1bc5b","Type":"ContainerStarted","Data":"5eb042a7c503f8e72b00ee9e1b1c1b38044fe36e81fb4fbc1d1b240ace8a15df"}
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.940489 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" event={"ID":"c64af899-04ea-4380-b287-4f3a2dc1bc5b","Type":"ContainerStarted","Data":"dafc7831aed345408bb6338b7d057306caf596226a9e7b5247c830c816d9a938"}
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.954855 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ssb48" event={"ID":"742f316e-5676-4459-bfd5-411abe809f23","Type":"ContainerStarted","Data":"9480ca78f125b100dbe765cb7b6c95959d091b6c2f47a6c2bbff1623201767fe"}
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.971727 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s" event={"ID":"fccb227e-229a-4054-a0ad-43dc8aa9f0a3","Type":"ContainerStarted","Data":"e79c759b2c11776e14fac59cd609470ee589e02c2994eb3b8874377053deaddd"}
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.971781 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s"
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.980843 4747 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-svs8d container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body=
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.980889 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" podUID="c7a9c2fc-a6db-4def-9938-f1da651566c8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused"
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.982231 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5zzxf"
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.992367 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vr9tk"
Dec 05 20:44:39 crc kubenswrapper[4747]: I1205 20:44:39.993872 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gxkvw"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.016596 4747 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-05T20:44:39.799330141Z","Handler":null,"Name":""}
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.016941 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:40 crc kubenswrapper[4747]: E1205 20:44:40.017316 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:44:40.517298933 +0000 UTC m=+150.984606421 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.033612 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-v44jl" podStartSLOduration=132.033594838 podStartE2EDuration="2m12.033594838s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:39.981301987 +0000 UTC m=+150.448609475" watchObservedRunningTime="2025-12-05 20:44:40.033594838 +0000 UTC m=+150.500902326"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.046881 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s" podStartSLOduration=131.046861949 podStartE2EDuration="2m11.046861949s" podCreationTimestamp="2025-12-05 20:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:40.033331811 +0000 UTC m=+150.500639299" watchObservedRunningTime="2025-12-05 20:44:40.046861949 +0000 UTC m=+150.514169437"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.047261 4747 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.047394 4747 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.118166 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.149223 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.149689 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.259362 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xxxxf\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") " pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.281549 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.299536 4747 patch_prober.go:28] interesting pod/router-default-5444994796-2p8p6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 20:44:40 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld
Dec 05 20:44:40 crc kubenswrapper[4747]: [+]process-running ok
Dec 05 20:44:40 crc kubenswrapper[4747]: healthz check failed
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.299626 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2p8p6" podUID="facd7fa0-d8a3-4fd4-a8c9-64578d2779aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.325147 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.338061 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.557136 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xxxxf"]
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.694154 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8hg9x"]
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.695383 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8hg9x"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.706526 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.722797 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8hg9x"]
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.834125 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5efe7146-f4a8-42c3-84d9-b974a2618203-utilities\") pod \"certified-operators-8hg9x\" (UID: \"5efe7146-f4a8-42c3-84d9-b974a2618203\") " pod="openshift-marketplace/certified-operators-8hg9x"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.834553 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbg5\" (UniqueName: \"kubernetes.io/projected/5efe7146-f4a8-42c3-84d9-b974a2618203-kube-api-access-fzbg5\") pod \"certified-operators-8hg9x\" (UID: \"5efe7146-f4a8-42c3-84d9-b974a2618203\") " pod="openshift-marketplace/certified-operators-8hg9x"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.834630 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5efe7146-f4a8-42c3-84d9-b974a2618203-catalog-content\") pod \"certified-operators-8hg9x\" (UID: \"5efe7146-f4a8-42c3-84d9-b974a2618203\") " pod="openshift-marketplace/certified-operators-8hg9x"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.885311 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g6p67"]
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.886259 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6p67"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.891559 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6p67"]
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.892099 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.936131 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5efe7146-f4a8-42c3-84d9-b974a2618203-utilities\") pod \"certified-operators-8hg9x\" (UID: \"5efe7146-f4a8-42c3-84d9-b974a2618203\") " pod="openshift-marketplace/certified-operators-8hg9x"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.936216 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbg5\" (UniqueName: \"kubernetes.io/projected/5efe7146-f4a8-42c3-84d9-b974a2618203-kube-api-access-fzbg5\") pod \"certified-operators-8hg9x\" (UID: \"5efe7146-f4a8-42c3-84d9-b974a2618203\") " pod="openshift-marketplace/certified-operators-8hg9x"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.936252 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5efe7146-f4a8-42c3-84d9-b974a2618203-catalog-content\") pod \"certified-operators-8hg9x\" (UID: \"5efe7146-f4a8-42c3-84d9-b974a2618203\") " pod="openshift-marketplace/certified-operators-8hg9x"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.937121 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5efe7146-f4a8-42c3-84d9-b974a2618203-catalog-content\") pod \"certified-operators-8hg9x\" (UID: \"5efe7146-f4a8-42c3-84d9-b974a2618203\") " pod="openshift-marketplace/certified-operators-8hg9x"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.937354 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5efe7146-f4a8-42c3-84d9-b974a2618203-utilities\") pod \"certified-operators-8hg9x\" (UID: \"5efe7146-f4a8-42c3-84d9-b974a2618203\") " pod="openshift-marketplace/certified-operators-8hg9x"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.969508 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbg5\" (UniqueName: \"kubernetes.io/projected/5efe7146-f4a8-42c3-84d9-b974a2618203-kube-api-access-fzbg5\") pod \"certified-operators-8hg9x\" (UID: \"5efe7146-f4a8-42c3-84d9-b974a2618203\") " pod="openshift-marketplace/certified-operators-8hg9x"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.976908 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" event={"ID":"da814908-e69b-476a-a5e8-7f128bb627b2","Type":"ContainerStarted","Data":"59b51efd897a34121c8b300b761364472982eff6e19f3fc7781c79b9c725953b"}
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.976966 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" event={"ID":"da814908-e69b-476a-a5e8-7f128bb627b2","Type":"ContainerStarted","Data":"17d4c0557801b85f1d501daf24e7735a83ece84e7e8ee2134b62fe0bb7cf02da"}
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.977822 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.981351 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" event={"ID":"c64af899-04ea-4380-b287-4f3a2dc1bc5b","Type":"ContainerStarted","Data":"dabdd5c85c9b3a95e3224f825f647c6e035713c01335d46e271c9440938e0369"}
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.981472 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" event={"ID":"c64af899-04ea-4380-b287-4f3a2dc1bc5b","Type":"ContainerStarted","Data":"404f2d88cacebf6335d85b1d62d6194f2f215b05536bb51abf7f491e1dc6072b"}
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.986043 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-svs8d"
Dec 05 20:44:40 crc kubenswrapper[4747]: I1205 20:44:40.999523 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" podStartSLOduration=132.999499505 podStartE2EDuration="2m12.999499505s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:40.998477347 +0000 UTC m=+151.465784835" watchObservedRunningTime="2025-12-05 20:44:40.999499505 +0000 UTC m=+151.466806993"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.008219 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8hg9x"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.035467 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-ddtpz" podStartSLOduration=10.03543327 podStartE2EDuration="10.03543327s" podCreationTimestamp="2025-12-05 20:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:41.035370018 +0000 UTC m=+151.502677506" watchObservedRunningTime="2025-12-05 20:44:41.03543327 +0000 UTC m=+151.502740768"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.037701 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7blmr\" (UniqueName: \"kubernetes.io/projected/6dbcd101-ff16-4970-8433-37b2576e551b-kube-api-access-7blmr\") pod \"community-operators-g6p67\" (UID: \"6dbcd101-ff16-4970-8433-37b2576e551b\") " pod="openshift-marketplace/community-operators-g6p67"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.037800 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbcd101-ff16-4970-8433-37b2576e551b-catalog-content\") pod \"community-operators-g6p67\" (UID: \"6dbcd101-ff16-4970-8433-37b2576e551b\") " pod="openshift-marketplace/community-operators-g6p67"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.037831 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbcd101-ff16-4970-8433-37b2576e551b-utilities\") pod \"community-operators-g6p67\" (UID: \"6dbcd101-ff16-4970-8433-37b2576e551b\") " pod="openshift-marketplace/community-operators-g6p67"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.084524 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-szprt"]
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.085742 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-szprt"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.093973 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-szprt"]
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.139619 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7blmr\" (UniqueName: \"kubernetes.io/projected/6dbcd101-ff16-4970-8433-37b2576e551b-kube-api-access-7blmr\") pod \"community-operators-g6p67\" (UID: \"6dbcd101-ff16-4970-8433-37b2576e551b\") " pod="openshift-marketplace/community-operators-g6p67"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.139764 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbcd101-ff16-4970-8433-37b2576e551b-catalog-content\") pod \"community-operators-g6p67\" (UID: \"6dbcd101-ff16-4970-8433-37b2576e551b\") " pod="openshift-marketplace/community-operators-g6p67"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.139784 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbcd101-ff16-4970-8433-37b2576e551b-utilities\") pod \"community-operators-g6p67\" (UID: \"6dbcd101-ff16-4970-8433-37b2576e551b\") " pod="openshift-marketplace/community-operators-g6p67"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.141297 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbcd101-ff16-4970-8433-37b2576e551b-catalog-content\") pod \"community-operators-g6p67\" (UID: \"6dbcd101-ff16-4970-8433-37b2576e551b\") " pod="openshift-marketplace/community-operators-g6p67"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.141515 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbcd101-ff16-4970-8433-37b2576e551b-utilities\") pod \"community-operators-g6p67\" (UID: \"6dbcd101-ff16-4970-8433-37b2576e551b\") " pod="openshift-marketplace/community-operators-g6p67"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.176633 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7blmr\" (UniqueName: \"kubernetes.io/projected/6dbcd101-ff16-4970-8433-37b2576e551b-kube-api-access-7blmr\") pod \"community-operators-g6p67\" (UID: \"6dbcd101-ff16-4970-8433-37b2576e551b\") " pod="openshift-marketplace/community-operators-g6p67"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.208030 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6p67"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.245475 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3df409-857b-4fcb-afcc-fd27e65990f8-catalog-content\") pod \"certified-operators-szprt\" (UID: \"2c3df409-857b-4fcb-afcc-fd27e65990f8\") " pod="openshift-marketplace/certified-operators-szprt"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.245574 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3df409-857b-4fcb-afcc-fd27e65990f8-utilities\") pod \"certified-operators-szprt\" (UID: \"2c3df409-857b-4fcb-afcc-fd27e65990f8\") " pod="openshift-marketplace/certified-operators-szprt"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.245653 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s45qz\" (UniqueName: \"kubernetes.io/projected/2c3df409-857b-4fcb-afcc-fd27e65990f8-kube-api-access-s45qz\") pod \"certified-operators-szprt\" (UID: \"2c3df409-857b-4fcb-afcc-fd27e65990f8\") " pod="openshift-marketplace/certified-operators-szprt"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.279890 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zfqc7"]
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.280840 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zfqc7"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.286335 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8hg9x"]
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.288714 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zfqc7"]
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.302149 4747 patch_prober.go:28] interesting pod/router-default-5444994796-2p8p6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 20:44:41 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld
Dec 05 20:44:41 crc kubenswrapper[4747]: [+]process-running ok
Dec 05 20:44:41 crc kubenswrapper[4747]: healthz check failed
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.302327 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2p8p6" podUID="facd7fa0-d8a3-4fd4-a8c9-64578d2779aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.346839 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8441fe24-a43e-4034-816d-f78a91f89025-catalog-content\") pod \"community-operators-zfqc7\" (UID: \"8441fe24-a43e-4034-816d-f78a91f89025\") " pod="openshift-marketplace/community-operators-zfqc7"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.346935 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnk5x\" (UniqueName: \"kubernetes.io/projected/8441fe24-a43e-4034-816d-f78a91f89025-kube-api-access-mnk5x\") pod \"community-operators-zfqc7\" (UID: \"8441fe24-a43e-4034-816d-f78a91f89025\") " pod="openshift-marketplace/community-operators-zfqc7"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.347162 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s45qz\" (UniqueName: \"kubernetes.io/projected/2c3df409-857b-4fcb-afcc-fd27e65990f8-kube-api-access-s45qz\") pod \"certified-operators-szprt\" (UID: \"2c3df409-857b-4fcb-afcc-fd27e65990f8\") " pod="openshift-marketplace/certified-operators-szprt"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.347362 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3df409-857b-4fcb-afcc-fd27e65990f8-catalog-content\") pod \"certified-operators-szprt\" (UID: \"2c3df409-857b-4fcb-afcc-fd27e65990f8\") " pod="openshift-marketplace/certified-operators-szprt"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.347521 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3df409-857b-4fcb-afcc-fd27e65990f8-utilities\") pod \"certified-operators-szprt\" (UID: \"2c3df409-857b-4fcb-afcc-fd27e65990f8\") " pod="openshift-marketplace/certified-operators-szprt"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.347550 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8441fe24-a43e-4034-816d-f78a91f89025-utilities\") pod \"community-operators-zfqc7\" (UID: \"8441fe24-a43e-4034-816d-f78a91f89025\") " pod="openshift-marketplace/community-operators-zfqc7"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.348708 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3df409-857b-4fcb-afcc-fd27e65990f8-catalog-content\") pod \"certified-operators-szprt\" (UID: \"2c3df409-857b-4fcb-afcc-fd27e65990f8\") " pod="openshift-marketplace/certified-operators-szprt"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.348788 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3df409-857b-4fcb-afcc-fd27e65990f8-utilities\") pod \"certified-operators-szprt\" (UID: \"2c3df409-857b-4fcb-afcc-fd27e65990f8\") " pod="openshift-marketplace/certified-operators-szprt"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.365465 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s45qz\" (UniqueName: \"kubernetes.io/projected/2c3df409-857b-4fcb-afcc-fd27e65990f8-kube-api-access-s45qz\") pod \"certified-operators-szprt\" (UID: \"2c3df409-857b-4fcb-afcc-fd27e65990f8\") " pod="openshift-marketplace/certified-operators-szprt"
Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.399405 4747 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-szprt" Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.406218 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6p67"] Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.448371 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8441fe24-a43e-4034-816d-f78a91f89025-utilities\") pod \"community-operators-zfqc7\" (UID: \"8441fe24-a43e-4034-816d-f78a91f89025\") " pod="openshift-marketplace/community-operators-zfqc7" Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.448933 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8441fe24-a43e-4034-816d-f78a91f89025-catalog-content\") pod \"community-operators-zfqc7\" (UID: \"8441fe24-a43e-4034-816d-f78a91f89025\") " pod="openshift-marketplace/community-operators-zfqc7" Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.448979 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnk5x\" (UniqueName: \"kubernetes.io/projected/8441fe24-a43e-4034-816d-f78a91f89025-kube-api-access-mnk5x\") pod \"community-operators-zfqc7\" (UID: \"8441fe24-a43e-4034-816d-f78a91f89025\") " pod="openshift-marketplace/community-operators-zfqc7" Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.449091 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8441fe24-a43e-4034-816d-f78a91f89025-utilities\") pod \"community-operators-zfqc7\" (UID: \"8441fe24-a43e-4034-816d-f78a91f89025\") " pod="openshift-marketplace/community-operators-zfqc7" Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.449358 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8441fe24-a43e-4034-816d-f78a91f89025-catalog-content\") pod \"community-operators-zfqc7\" (UID: \"8441fe24-a43e-4034-816d-f78a91f89025\") " pod="openshift-marketplace/community-operators-zfqc7" Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.468859 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnk5x\" (UniqueName: \"kubernetes.io/projected/8441fe24-a43e-4034-816d-f78a91f89025-kube-api-access-mnk5x\") pod \"community-operators-zfqc7\" (UID: \"8441fe24-a43e-4034-816d-f78a91f89025\") " pod="openshift-marketplace/community-operators-zfqc7" Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.583106 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-szprt"] Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.608853 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zfqc7" Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.785233 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zfqc7"] Dec 05 20:44:41 crc kubenswrapper[4747]: W1205 20:44:41.798136 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8441fe24_a43e_4034_816d_f78a91f89025.slice/crio-a0db2ab60287c8eb37c2e4d08f55e1ff620c53cfd6a64cf9e82909f2cae5f406 WatchSource:0}: Error finding container a0db2ab60287c8eb37c2e4d08f55e1ff620c53cfd6a64cf9e82909f2cae5f406: Status 404 returned error can't find the container with id a0db2ab60287c8eb37c2e4d08f55e1ff620c53cfd6a64cf9e82909f2cae5f406 Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.801678 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.806248 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-f8pt2" Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.862674 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.881922 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-hrsvp" Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.882241 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.882421 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.886313 4747 patch_prober.go:28] interesting pod/console-f9d7485db-rb4b9 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.886370 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rb4b9" podUID="dab4218f-b06c-473f-8882-5f207a79f403" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.988937 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfqc7" event={"ID":"8441fe24-a43e-4034-816d-f78a91f89025","Type":"ContainerStarted","Data":"a0db2ab60287c8eb37c2e4d08f55e1ff620c53cfd6a64cf9e82909f2cae5f406"} Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.991387 4747 generic.go:334] "Generic (PLEG): container finished" podID="2c3df409-857b-4fcb-afcc-fd27e65990f8" containerID="5d5af78e50cca733543bb264d72af56a61d3b100a52adcc8da8f4f1ef1515a6b" exitCode=0 Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.991698 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szprt" 
event={"ID":"2c3df409-857b-4fcb-afcc-fd27e65990f8","Type":"ContainerDied","Data":"5d5af78e50cca733543bb264d72af56a61d3b100a52adcc8da8f4f1ef1515a6b"} Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.991747 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szprt" event={"ID":"2c3df409-857b-4fcb-afcc-fd27e65990f8","Type":"ContainerStarted","Data":"5ed0d4b93bb9e47f01e3b9c0be692832a8c8464ab92e0bd44d3e9a27d5e2fc50"} Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.993663 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.995169 4747 generic.go:334] "Generic (PLEG): container finished" podID="5efe7146-f4a8-42c3-84d9-b974a2618203" containerID="807d68d1ef3b54b0fe740ca5a53cc1d51eb388d9219085eb4655a1eb65ae426b" exitCode=0 Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.995223 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hg9x" event={"ID":"5efe7146-f4a8-42c3-84d9-b974a2618203","Type":"ContainerDied","Data":"807d68d1ef3b54b0fe740ca5a53cc1d51eb388d9219085eb4655a1eb65ae426b"} Dec 05 20:44:41 crc kubenswrapper[4747]: I1205 20:44:41.995244 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hg9x" event={"ID":"5efe7146-f4a8-42c3-84d9-b974a2618203","Type":"ContainerStarted","Data":"587a55990fca71b41a4e94eae5fb85c13acfeafa524c0999f5a090c01586a8da"} Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:41.999720 4747 generic.go:334] "Generic (PLEG): container finished" podID="283b35ea-91c4-4185-a639-9a8a1a80aaa7" containerID="1856838e2e010e8e7a549fc5049197714e46ca623a2063c93f8fb7229536ea4e" exitCode=0 Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:41.999800 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr" event={"ID":"283b35ea-91c4-4185-a639-9a8a1a80aaa7","Type":"ContainerDied","Data":"1856838e2e010e8e7a549fc5049197714e46ca623a2063c93f8fb7229536ea4e"} Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.003324 4747 generic.go:334] "Generic (PLEG): container finished" podID="6dbcd101-ff16-4970-8433-37b2576e551b" containerID="ff6850cb37a533c06e08cdd4d6b1d3aa5e2da73393d4a6c7d778a3a682fbc989" exitCode=0 Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.005283 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6p67" event={"ID":"6dbcd101-ff16-4970-8433-37b2576e551b","Type":"ContainerDied","Data":"ff6850cb37a533c06e08cdd4d6b1d3aa5e2da73393d4a6c7d778a3a682fbc989"} Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.005336 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6p67" event={"ID":"6dbcd101-ff16-4970-8433-37b2576e551b","Type":"ContainerStarted","Data":"2d51a3edaea1e4d19cd0208b85d37176ef750c7c0421ef151cc4a1c1d83dc1f7"} Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.296908 4747 patch_prober.go:28] interesting pod/router-default-5444994796-2p8p6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:44:42 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Dec 05 20:44:42 crc kubenswrapper[4747]: [+]process-running ok Dec 05 20:44:42 crc kubenswrapper[4747]: healthz check failed 
Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.296966 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2p8p6" podUID="facd7fa0-d8a3-4fd4-a8c9-64578d2779aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.703854 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q278l"]
Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.709203 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q278l"
Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.713059 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.721290 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q278l"]
Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.804788 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fca7ed3-fb68-43bd-8186-7d73d673098b-catalog-content\") pod \"redhat-marketplace-q278l\" (UID: \"2fca7ed3-fb68-43bd-8186-7d73d673098b\") " pod="openshift-marketplace/redhat-marketplace-q278l"
Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.805288 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fca7ed3-fb68-43bd-8186-7d73d673098b-utilities\") pod \"redhat-marketplace-q278l\" (UID: \"2fca7ed3-fb68-43bd-8186-7d73d673098b\") " pod="openshift-marketplace/redhat-marketplace-q278l"
Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.805330 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rssn\" (UniqueName: \"kubernetes.io/projected/2fca7ed3-fb68-43bd-8186-7d73d673098b-kube-api-access-7rssn\") pod \"redhat-marketplace-q278l\" (UID: \"2fca7ed3-fb68-43bd-8186-7d73d673098b\") " pod="openshift-marketplace/redhat-marketplace-q278l"
Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.907213 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rssn\" (UniqueName: \"kubernetes.io/projected/2fca7ed3-fb68-43bd-8186-7d73d673098b-kube-api-access-7rssn\") pod \"redhat-marketplace-q278l\" (UID: \"2fca7ed3-fb68-43bd-8186-7d73d673098b\") " pod="openshift-marketplace/redhat-marketplace-q278l"
Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.907379 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fca7ed3-fb68-43bd-8186-7d73d673098b-catalog-content\") pod \"redhat-marketplace-q278l\" (UID: \"2fca7ed3-fb68-43bd-8186-7d73d673098b\") " pod="openshift-marketplace/redhat-marketplace-q278l"
Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.907419 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fca7ed3-fb68-43bd-8186-7d73d673098b-utilities\") pod \"redhat-marketplace-q278l\" (UID: \"2fca7ed3-fb68-43bd-8186-7d73d673098b\") " pod="openshift-marketplace/redhat-marketplace-q278l"
Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.908451 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fca7ed3-fb68-43bd-8186-7d73d673098b-catalog-content\") pod \"redhat-marketplace-q278l\" (UID: \"2fca7ed3-fb68-43bd-8186-7d73d673098b\") " pod="openshift-marketplace/redhat-marketplace-q278l"
Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.916263 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fca7ed3-fb68-43bd-8186-7d73d673098b-utilities\") pod \"redhat-marketplace-q278l\" (UID: \"2fca7ed3-fb68-43bd-8186-7d73d673098b\") " pod="openshift-marketplace/redhat-marketplace-q278l"
Dec 05 20:44:42 crc kubenswrapper[4747]: I1205 20:44:42.929513 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rssn\" (UniqueName: \"kubernetes.io/projected/2fca7ed3-fb68-43bd-8186-7d73d673098b-kube-api-access-7rssn\") pod \"redhat-marketplace-q278l\" (UID: \"2fca7ed3-fb68-43bd-8186-7d73d673098b\") " pod="openshift-marketplace/redhat-marketplace-q278l"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.035685 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q278l"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.046530 4747 generic.go:334] "Generic (PLEG): container finished" podID="8441fe24-a43e-4034-816d-f78a91f89025" containerID="7f5f1bf1f8907425d535b7fd83026f65aa3497e366a2ee5cc3447e0d0f62a55c" exitCode=0
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.046702 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfqc7" event={"ID":"8441fe24-a43e-4034-816d-f78a91f89025","Type":"ContainerDied","Data":"7f5f1bf1f8907425d535b7fd83026f65aa3497e366a2ee5cc3447e0d0f62a55c"}
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.114841 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mhk4l"]
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.116235 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhk4l"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.139332 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhk4l"]
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.227447 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-utilities\") pod \"redhat-marketplace-mhk4l\" (UID: \"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b\") " pod="openshift-marketplace/redhat-marketplace-mhk4l"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.227789 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rrwt\" (UniqueName: \"kubernetes.io/projected/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-kube-api-access-8rrwt\") pod \"redhat-marketplace-mhk4l\" (UID: \"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b\") " pod="openshift-marketplace/redhat-marketplace-mhk4l"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.227838 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-catalog-content\") pod \"redhat-marketplace-mhk4l\" (UID: \"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b\") " pod="openshift-marketplace/redhat-marketplace-mhk4l"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.297002 4747 patch_prober.go:28] interesting pod/router-default-5444994796-2p8p6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 20:44:43 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld
Dec 05 20:44:43 crc kubenswrapper[4747]: [+]process-running ok
Dec 05 20:44:43 crc kubenswrapper[4747]: healthz check failed
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.297077 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2p8p6" podUID="facd7fa0-d8a3-4fd4-a8c9-64578d2779aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.330245 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rrwt\" (UniqueName: \"kubernetes.io/projected/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-kube-api-access-8rrwt\") pod \"redhat-marketplace-mhk4l\" (UID: \"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b\") " pod="openshift-marketplace/redhat-marketplace-mhk4l"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.330630 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-catalog-content\") pod \"redhat-marketplace-mhk4l\" (UID: \"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b\") " pod="openshift-marketplace/redhat-marketplace-mhk4l"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.330659 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-utilities\") pod \"redhat-marketplace-mhk4l\" (UID: \"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b\") " pod="openshift-marketplace/redhat-marketplace-mhk4l"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.331489 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-utilities\") pod \"redhat-marketplace-mhk4l\" (UID: \"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b\") " pod="openshift-marketplace/redhat-marketplace-mhk4l"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.332080 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-catalog-content\") pod \"redhat-marketplace-mhk4l\" (UID: \"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b\") " pod="openshift-marketplace/redhat-marketplace-mhk4l"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.358020 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rrwt\" (UniqueName: \"kubernetes.io/projected/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-kube-api-access-8rrwt\") pod \"redhat-marketplace-mhk4l\" (UID: \"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b\") " pod="openshift-marketplace/redhat-marketplace-mhk4l"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.436702 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.446641 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhk4l"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.531894 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjlzb\" (UniqueName: \"kubernetes.io/projected/283b35ea-91c4-4185-a639-9a8a1a80aaa7-kube-api-access-pjlzb\") pod \"283b35ea-91c4-4185-a639-9a8a1a80aaa7\" (UID: \"283b35ea-91c4-4185-a639-9a8a1a80aaa7\") "
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.531981 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/283b35ea-91c4-4185-a639-9a8a1a80aaa7-config-volume\") pod \"283b35ea-91c4-4185-a639-9a8a1a80aaa7\" (UID: \"283b35ea-91c4-4185-a639-9a8a1a80aaa7\") "
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.532046 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/283b35ea-91c4-4185-a639-9a8a1a80aaa7-secret-volume\") pod \"283b35ea-91c4-4185-a639-9a8a1a80aaa7\" (UID: \"283b35ea-91c4-4185-a639-9a8a1a80aaa7\") "
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.533808 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/283b35ea-91c4-4185-a639-9a8a1a80aaa7-config-volume" (OuterVolumeSpecName: "config-volume") pod "283b35ea-91c4-4185-a639-9a8a1a80aaa7" (UID: "283b35ea-91c4-4185-a639-9a8a1a80aaa7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.537684 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/283b35ea-91c4-4185-a639-9a8a1a80aaa7-kube-api-access-pjlzb" (OuterVolumeSpecName: "kube-api-access-pjlzb") pod "283b35ea-91c4-4185-a639-9a8a1a80aaa7" (UID: "283b35ea-91c4-4185-a639-9a8a1a80aaa7"). InnerVolumeSpecName "kube-api-access-pjlzb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.537811 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/283b35ea-91c4-4185-a639-9a8a1a80aaa7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "283b35ea-91c4-4185-a639-9a8a1a80aaa7" (UID: "283b35ea-91c4-4185-a639-9a8a1a80aaa7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.634879 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/283b35ea-91c4-4185-a639-9a8a1a80aaa7-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.634965 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjlzb\" (UniqueName: \"kubernetes.io/projected/283b35ea-91c4-4185-a639-9a8a1a80aaa7-kube-api-access-pjlzb\") on node \"crc\" DevicePath \"\""
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.634975 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/283b35ea-91c4-4185-a639-9a8a1a80aaa7-config-volume\") on node \"crc\" DevicePath \"\""
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.682857 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q278l"]
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.750643 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhk4l"]
Dec 05 20:44:43 crc kubenswrapper[4747]: W1205 20:44:43.792218 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7ae41a7_9df3_4aeb_9ffe_b8bfb3bb413b.slice/crio-640ec5944e59587087fdd6351b44b440597e1e37288e0924ed24a8b0d5b5dbf4 WatchSource:0}: Error finding container 640ec5944e59587087fdd6351b44b440597e1e37288e0924ed24a8b0d5b5dbf4: Status 404 returned error can't find the container with id 640ec5944e59587087fdd6351b44b440597e1e37288e0924ed24a8b0d5b5dbf4
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.884959 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g28vx"]
Dec 05 20:44:43 crc kubenswrapper[4747]: E1205 20:44:43.885209 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="283b35ea-91c4-4185-a639-9a8a1a80aaa7" containerName="collect-profiles"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.885223 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="283b35ea-91c4-4185-a639-9a8a1a80aaa7" containerName="collect-profiles"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.885412 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="283b35ea-91c4-4185-a639-9a8a1a80aaa7" containerName="collect-profiles"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.888226 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g28vx"
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.892029 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g28vx"]
Dec 05 20:44:43 crc kubenswrapper[4747]: I1205 20:44:43.892942 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.040266 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a99c798-6879-43bb-817c-621364a56b5a-utilities\") pod \"redhat-operators-g28vx\" (UID: \"8a99c798-6879-43bb-817c-621364a56b5a\") " pod="openshift-marketplace/redhat-operators-g28vx"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.040442 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glj77\" (UniqueName: \"kubernetes.io/projected/8a99c798-6879-43bb-817c-621364a56b5a-kube-api-access-glj77\") pod \"redhat-operators-g28vx\" (UID: \"8a99c798-6879-43bb-817c-621364a56b5a\") " pod="openshift-marketplace/redhat-operators-g28vx"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.040475 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a99c798-6879-43bb-817c-621364a56b5a-catalog-content\") pod \"redhat-operators-g28vx\" (UID: \"8a99c798-6879-43bb-817c-621364a56b5a\") " pod="openshift-marketplace/redhat-operators-g28vx"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.081194 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr" event={"ID":"283b35ea-91c4-4185-a639-9a8a1a80aaa7","Type":"ContainerDied","Data":"f04ac640de5704c439d9e732ec6b862c19153c2f9116213d8f7842c70e575825"}
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.081237 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f04ac640de5704c439d9e732ec6b862c19153c2f9116213d8f7842c70e575825"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.081289 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.084485 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q278l" event={"ID":"2fca7ed3-fb68-43bd-8186-7d73d673098b","Type":"ContainerStarted","Data":"8ea76ffa0e0f9366fed4983dbfdbfab810d3894af2abcffd1a86de7e09892312"}
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.086406 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhk4l" event={"ID":"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b","Type":"ContainerStarted","Data":"640ec5944e59587087fdd6351b44b440597e1e37288e0924ed24a8b0d5b5dbf4"}
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.141574 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glj77\" (UniqueName: \"kubernetes.io/projected/8a99c798-6879-43bb-817c-621364a56b5a-kube-api-access-glj77\") pod \"redhat-operators-g28vx\" (UID: \"8a99c798-6879-43bb-817c-621364a56b5a\") " pod="openshift-marketplace/redhat-operators-g28vx"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.141664 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a99c798-6879-43bb-817c-621364a56b5a-catalog-content\") pod \"redhat-operators-g28vx\" (UID: \"8a99c798-6879-43bb-817c-621364a56b5a\") " pod="openshift-marketplace/redhat-operators-g28vx"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.142304 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a99c798-6879-43bb-817c-621364a56b5a-catalog-content\") pod \"redhat-operators-g28vx\" (UID: \"8a99c798-6879-43bb-817c-621364a56b5a\") " pod="openshift-marketplace/redhat-operators-g28vx"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.142321 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a99c798-6879-43bb-817c-621364a56b5a-utilities\") pod \"redhat-operators-g28vx\" (UID: \"8a99c798-6879-43bb-817c-621364a56b5a\") " pod="openshift-marketplace/redhat-operators-g28vx"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.142357 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a99c798-6879-43bb-817c-621364a56b5a-utilities\") pod \"redhat-operators-g28vx\" (UID: \"8a99c798-6879-43bb-817c-621364a56b5a\") " pod="openshift-marketplace/redhat-operators-g28vx"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.162818 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glj77\" (UniqueName: \"kubernetes.io/projected/8a99c798-6879-43bb-817c-621364a56b5a-kube-api-access-glj77\") pod \"redhat-operators-g28vx\" (UID: \"8a99c798-6879-43bb-817c-621364a56b5a\") " pod="openshift-marketplace/redhat-operators-g28vx"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.211294 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g28vx"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.288066 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jm5r9"]
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.289450 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jm5r9"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.297539 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-2p8p6"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.300483 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jm5r9"]
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.302701 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-2p8p6"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.446842 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd788fb-fa63-485f-bcfa-be929ef73305-utilities\") pod \"redhat-operators-jm5r9\" (UID: \"6fd788fb-fa63-485f-bcfa-be929ef73305\") " pod="openshift-marketplace/redhat-operators-jm5r9"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.447044 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjt5h\" (UniqueName: \"kubernetes.io/projected/6fd788fb-fa63-485f-bcfa-be929ef73305-kube-api-access-rjt5h\") pod \"redhat-operators-jm5r9\" (UID: \"6fd788fb-fa63-485f-bcfa-be929ef73305\") " pod="openshift-marketplace/redhat-operators-jm5r9"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.447082 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd788fb-fa63-485f-bcfa-be929ef73305-catalog-content\") pod \"redhat-operators-jm5r9\" (UID: \"6fd788fb-fa63-485f-bcfa-be929ef73305\") " pod="openshift-marketplace/redhat-operators-jm5r9"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.515138 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g28vx"]
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.547904 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd788fb-fa63-485f-bcfa-be929ef73305-catalog-content\") pod \"redhat-operators-jm5r9\" (UID: \"6fd788fb-fa63-485f-bcfa-be929ef73305\") " pod="openshift-marketplace/redhat-operators-jm5r9"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.547971 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd788fb-fa63-485f-bcfa-be929ef73305-utilities\") pod \"redhat-operators-jm5r9\" (UID: \"6fd788fb-fa63-485f-bcfa-be929ef73305\") " pod="openshift-marketplace/redhat-operators-jm5r9"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.548027 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjt5h\" (UniqueName: \"kubernetes.io/projected/6fd788fb-fa63-485f-bcfa-be929ef73305-kube-api-access-rjt5h\") pod \"redhat-operators-jm5r9\" (UID: \"6fd788fb-fa63-485f-bcfa-be929ef73305\") " pod="openshift-marketplace/redhat-operators-jm5r9"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.549397 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd788fb-fa63-485f-bcfa-be929ef73305-catalog-content\") pod \"redhat-operators-jm5r9\" (UID: \"6fd788fb-fa63-485f-bcfa-be929ef73305\") " pod="openshift-marketplace/redhat-operators-jm5r9"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.549661 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd788fb-fa63-485f-bcfa-be929ef73305-utilities\") pod \"redhat-operators-jm5r9\" (UID: \"6fd788fb-fa63-485f-bcfa-be929ef73305\") " pod="openshift-marketplace/redhat-operators-jm5r9"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.566994 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjt5h\" (UniqueName: \"kubernetes.io/projected/6fd788fb-fa63-485f-bcfa-be929ef73305-kube-api-access-rjt5h\") pod \"redhat-operators-jm5r9\" (UID: \"6fd788fb-fa63-485f-bcfa-be929ef73305\") " pod="openshift-marketplace/redhat-operators-jm5r9"
Dec 05 20:44:44 crc kubenswrapper[4747]: W1205 20:44:44.598873 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a99c798_6879_43bb_817c_621364a56b5a.slice/crio-8839d2bfcec0ae27284e273aa74a3e71e35b0f066cca165c63a794d018a6894c WatchSource:0}: Error finding container 8839d2bfcec0ae27284e273aa74a3e71e35b0f066cca165c63a794d018a6894c: Status 404 returned error can't find the container with id 8839d2bfcec0ae27284e273aa74a3e71e35b0f066cca165c63a794d018a6894c
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.618683 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jm5r9"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.645014 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.645779 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.648352 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.648480 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.657283 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.750396 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/775cd61a-7c0f-40f3-9723-df672e255d4e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"775cd61a-7c0f-40f3-9723-df672e255d4e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.750432 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/775cd61a-7c0f-40f3-9723-df672e255d4e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"775cd61a-7c0f-40f3-9723-df672e255d4e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.851324 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/775cd61a-7c0f-40f3-9723-df672e255d4e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"775cd61a-7c0f-40f3-9723-df672e255d4e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.851371 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/775cd61a-7c0f-40f3-9723-df672e255d4e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"775cd61a-7c0f-40f3-9723-df672e255d4e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.851455 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/775cd61a-7c0f-40f3-9723-df672e255d4e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"775cd61a-7c0f-40f3-9723-df672e255d4e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.874852 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jm5r9"]
Dec 05 20:44:44 crc kubenswrapper[4747]: I1205 20:44:44.884500 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/775cd61a-7c0f-40f3-9723-df672e255d4e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"775cd61a-7c0f-40f3-9723-df672e255d4e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:44:44 crc kubenswrapper[4747]: W1205 20:44:44.900379 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fd788fb_fa63_485f_bcfa_be929ef73305.slice/crio-024c7cea0e013199a941cda90192b2bee9a7d16ee1c84f211ea1b319b8cca95d WatchSource:0}: Error finding container 024c7cea0e013199a941cda90192b2bee9a7d16ee1c84f211ea1b319b8cca95d: Status 404 returned error can't find the container with id 024c7cea0e013199a941cda90192b2bee9a7d16ee1c84f211ea1b319b8cca95d
Dec 05 20:44:45 crc kubenswrapper[4747]: I1205 20:44:45.015677 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:44:45 crc kubenswrapper[4747]: I1205 20:44:45.100322 4747 generic.go:334] "Generic (PLEG): container finished" podID="a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b" containerID="935feb09f4e788f3f1205ba09819cb732b5a0d7cdd53e83c62d11a753587aa0e" exitCode=0
Dec 05 20:44:45 crc kubenswrapper[4747]: I1205 20:44:45.100802 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhk4l" event={"ID":"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b","Type":"ContainerDied","Data":"935feb09f4e788f3f1205ba09819cb732b5a0d7cdd53e83c62d11a753587aa0e"}
Dec 05 20:44:45 crc kubenswrapper[4747]: I1205 20:44:45.102050 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g28vx" event={"ID":"8a99c798-6879-43bb-817c-621364a56b5a","Type":"ContainerStarted","Data":"8839d2bfcec0ae27284e273aa74a3e71e35b0f066cca165c63a794d018a6894c"}
Dec 05 20:44:45 crc kubenswrapper[4747]: I1205 20:44:45.108183 4747 generic.go:334] "Generic (PLEG): container finished" podID="2fca7ed3-fb68-43bd-8186-7d73d673098b" containerID="74d2af20534657ec1dc752a6d005e2292bf9e00a4fa5faa654289b530fe98c31" exitCode=0
Dec 05 20:44:45 crc kubenswrapper[4747]: I1205 20:44:45.108277 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q278l" event={"ID":"2fca7ed3-fb68-43bd-8186-7d73d673098b","Type":"ContainerDied","Data":"74d2af20534657ec1dc752a6d005e2292bf9e00a4fa5faa654289b530fe98c31"}
Dec 05 20:44:45 crc kubenswrapper[4747]: I1205 20:44:45.111481 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jm5r9" event={"ID":"6fd788fb-fa63-485f-bcfa-be929ef73305","Type":"ContainerStarted","Data":"024c7cea0e013199a941cda90192b2bee9a7d16ee1c84f211ea1b319b8cca95d"}
Dec 05 20:44:45 crc kubenswrapper[4747]: I1205 20:44:45.124024 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-2p8p6"
Dec 05 20:44:45 crc kubenswrapper[4747]: I1205 20:44:45.354117 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 05 20:44:45 crc kubenswrapper[4747]: W1205 20:44:45.454302 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod775cd61a_7c0f_40f3_9723_df672e255d4e.slice/crio-dfbc10dbb329d45a181a38c477c8cffed86bd2732b160046258bffd384092461 WatchSource:0}: Error finding container dfbc10dbb329d45a181a38c477c8cffed86bd2732b160046258bffd384092461: Status 404 returned error can't find the container with id dfbc10dbb329d45a181a38c477c8cffed86bd2732b160046258bffd384092461
Dec 05 20:44:46 crc kubenswrapper[4747]: I1205 20:44:46.124227 4747 generic.go:334] "Generic (PLEG): container finished" podID="6fd788fb-fa63-485f-bcfa-be929ef73305" containerID="e57ca75d80c933984a6a5c8cec60e58a98050bc66c3f4441a769bb51ccf9f88e" exitCode=0
Dec 05 20:44:46 crc kubenswrapper[4747]: I1205 20:44:46.124315 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jm5r9" event={"ID":"6fd788fb-fa63-485f-bcfa-be929ef73305","Type":"ContainerDied","Data":"e57ca75d80c933984a6a5c8cec60e58a98050bc66c3f4441a769bb51ccf9f88e"}
Dec 05 20:44:46 crc kubenswrapper[4747]: I1205 20:44:46.130373 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"775cd61a-7c0f-40f3-9723-df672e255d4e","Type":"ContainerStarted","Data":"dfbc10dbb329d45a181a38c477c8cffed86bd2732b160046258bffd384092461"}
Dec 05 20:44:46 crc kubenswrapper[4747]: I1205 20:44:46.132180 4747 generic.go:334] "Generic (PLEG): container finished" podID="8a99c798-6879-43bb-817c-621364a56b5a" containerID="badf3934e97b029b0428fe3f40049e9d790a0000f0245107d393c6a31be2b973" exitCode=0
Dec 05 20:44:46 crc kubenswrapper[4747]: I1205 20:44:46.132268 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g28vx" event={"ID":"8a99c798-6879-43bb-817c-621364a56b5a","Type":"ContainerDied","Data":"badf3934e97b029b0428fe3f40049e9d790a0000f0245107d393c6a31be2b973"}
Dec 05 20:44:46 crc kubenswrapper[4747]: I1205 20:44:46.989375 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 05 20:44:46 crc kubenswrapper[4747]: I1205 20:44:46.990693 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:44:46 crc kubenswrapper[4747]: I1205 20:44:46.994002 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 05 20:44:46 crc kubenswrapper[4747]: I1205 20:44:46.994254 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 05 20:44:47 crc kubenswrapper[4747]: I1205 20:44:46.997044 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 05 20:44:47 crc kubenswrapper[4747]: I1205 20:44:47.095652 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f2c6617-7083-48a3-b1f9-b35880f70aeb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8f2c6617-7083-48a3-b1f9-b35880f70aeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:44:47 crc kubenswrapper[4747]: I1205 20:44:47.095842 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f2c6617-7083-48a3-b1f9-b35880f70aeb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8f2c6617-7083-48a3-b1f9-b35880f70aeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:44:47 crc kubenswrapper[4747]: I1205 20:44:47.139998 4747 generic.go:334] "Generic (PLEG): container finished" podID="775cd61a-7c0f-40f3-9723-df672e255d4e" containerID="0382172bd951c139862dd87fb3ea3edd848c4c3619cf735839a7f70ea11ef03f" exitCode=0
Dec 05 20:44:47 crc kubenswrapper[4747]: I1205 20:44:47.140045 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"775cd61a-7c0f-40f3-9723-df672e255d4e","Type":"ContainerDied","Data":"0382172bd951c139862dd87fb3ea3edd848c4c3619cf735839a7f70ea11ef03f"}
Dec 05 20:44:47 crc kubenswrapper[4747]: I1205 20:44:47.198190 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f2c6617-7083-48a3-b1f9-b35880f70aeb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8f2c6617-7083-48a3-b1f9-b35880f70aeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:44:47 crc kubenswrapper[4747]: I1205 20:44:47.198262 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f2c6617-7083-48a3-b1f9-b35880f70aeb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8f2c6617-7083-48a3-b1f9-b35880f70aeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:44:47 crc kubenswrapper[4747]: I1205 20:44:47.198668 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f2c6617-7083-48a3-b1f9-b35880f70aeb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8f2c6617-7083-48a3-b1f9-b35880f70aeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:44:47 crc kubenswrapper[4747]: I1205 20:44:47.225356 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f2c6617-7083-48a3-b1f9-b35880f70aeb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8f2c6617-7083-48a3-b1f9-b35880f70aeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:44:47 crc kubenswrapper[4747]: I1205 20:44:47.311947 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:44:47 crc kubenswrapper[4747]: I1205 20:44:47.746106 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 05 20:44:48 crc kubenswrapper[4747]: I1205 20:44:48.149697 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8f2c6617-7083-48a3-b1f9-b35880f70aeb","Type":"ContainerStarted","Data":"b225559ac5e2cea39af59987cde5c01137fbc69ed83114e2281475a527d62365"}
Dec 05 20:44:48 crc kubenswrapper[4747]: I1205 20:44:48.523114 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:44:48 crc kubenswrapper[4747]: I1205 20:44:48.634172 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/775cd61a-7c0f-40f3-9723-df672e255d4e-kubelet-dir\") pod \"775cd61a-7c0f-40f3-9723-df672e255d4e\" (UID: \"775cd61a-7c0f-40f3-9723-df672e255d4e\") "
Dec 05 20:44:48 crc kubenswrapper[4747]: I1205 20:44:48.634313 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/775cd61a-7c0f-40f3-9723-df672e255d4e-kube-api-access\") pod \"775cd61a-7c0f-40f3-9723-df672e255d4e\" (UID: \"775cd61a-7c0f-40f3-9723-df672e255d4e\") "
Dec 05 20:44:48 crc kubenswrapper[4747]: I1205 20:44:48.635293 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/775cd61a-7c0f-40f3-9723-df672e255d4e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "775cd61a-7c0f-40f3-9723-df672e255d4e" (UID: "775cd61a-7c0f-40f3-9723-df672e255d4e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:44:48 crc kubenswrapper[4747]: I1205 20:44:48.655003 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/775cd61a-7c0f-40f3-9723-df672e255d4e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "775cd61a-7c0f-40f3-9723-df672e255d4e" (UID: "775cd61a-7c0f-40f3-9723-df672e255d4e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:44:48 crc kubenswrapper[4747]: I1205 20:44:48.736469 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/775cd61a-7c0f-40f3-9723-df672e255d4e-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 05 20:44:48 crc kubenswrapper[4747]: I1205 20:44:48.736556 4747 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/775cd61a-7c0f-40f3-9723-df672e255d4e-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 05 20:44:49 crc kubenswrapper[4747]: I1205 20:44:49.159027 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"775cd61a-7c0f-40f3-9723-df672e255d4e","Type":"ContainerDied","Data":"dfbc10dbb329d45a181a38c477c8cffed86bd2732b160046258bffd384092461"}
Dec 05 20:44:49 crc kubenswrapper[4747]: I1205 20:44:49.159393 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfbc10dbb329d45a181a38c477c8cffed86bd2732b160046258bffd384092461"
Dec 05 20:44:49 crc kubenswrapper[4747]: I1205 20:44:49.159091 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:44:49 crc kubenswrapper[4747]: I1205 20:44:49.478853 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xx4lj"
Dec 05 20:44:50 crc kubenswrapper[4747]: I1205 20:44:50.166613 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8f2c6617-7083-48a3-b1f9-b35880f70aeb","Type":"ContainerStarted","Data":"c77f0f2ab6aadbdad5ae93f19cdcc0924c1b168a57a0a4b14b502c43052d4025"}
Dec 05 20:44:50 crc kubenswrapper[4747]: I1205 20:44:50.185080 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.185065579 podStartE2EDuration="4.185065579s" podCreationTimestamp="2025-12-05 20:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:44:50.183994839 +0000 UTC m=+160.651302327" watchObservedRunningTime="2025-12-05 20:44:50.185065579 +0000 UTC m=+160.652373067"
Dec 05 20:44:50 crc kubenswrapper[4747]: I1205 20:44:50.571606 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs\") pod \"network-metrics-daemon-dcr49\" (UID: \"1e860ee9-69f5-44a1-b414-deab4f78dd0d\") " pod="openshift-multus/network-metrics-daemon-dcr49"
Dec 05 20:44:50 crc kubenswrapper[4747]: I1205 20:44:50.576854 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e860ee9-69f5-44a1-b414-deab4f78dd0d-metrics-certs\") pod \"network-metrics-daemon-dcr49\" (UID: \"1e860ee9-69f5-44a1-b414-deab4f78dd0d\") " pod="openshift-multus/network-metrics-daemon-dcr49"
Dec 05 20:44:50 crc kubenswrapper[4747]: I1205 20:44:50.867529 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dcr49"
Dec 05 20:44:51 crc kubenswrapper[4747]: I1205 20:44:51.175474 4747 generic.go:334] "Generic (PLEG): container finished" podID="8f2c6617-7083-48a3-b1f9-b35880f70aeb" containerID="c77f0f2ab6aadbdad5ae93f19cdcc0924c1b168a57a0a4b14b502c43052d4025" exitCode=0
Dec 05 20:44:51 crc kubenswrapper[4747]: I1205 20:44:51.175522 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8f2c6617-7083-48a3-b1f9-b35880f70aeb","Type":"ContainerDied","Data":"c77f0f2ab6aadbdad5ae93f19cdcc0924c1b168a57a0a4b14b502c43052d4025"}
Dec 05 20:44:51 crc kubenswrapper[4747]: I1205 20:44:51.884048 4747 patch_prober.go:28] interesting pod/console-f9d7485db-rb4b9 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Dec 05 20:44:51 crc kubenswrapper[4747]: I1205 20:44:51.884115 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rb4b9" podUID="dab4218f-b06c-473f-8882-5f207a79f403" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused"
Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.134444 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth"]
Dec 05 20:45:00 crc kubenswrapper[4747]: E1205 20:45:00.135383 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="775cd61a-7c0f-40f3-9723-df672e255d4e" containerName="pruner"
Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.135398 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="775cd61a-7c0f-40f3-9723-df672e255d4e" containerName="pruner"
Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.135503 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="775cd61a-7c0f-40f3-9723-df672e255d4e" containerName="pruner"
Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.135960 4747 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.138103 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.138399 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.149032 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth"] Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.227746 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6dfcc851-a0fe-45be-9752-82a11a1d06fa-config-volume\") pod \"collect-profiles-29416125-jccth\" (UID: \"6dfcc851-a0fe-45be-9752-82a11a1d06fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.227805 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s49zq\" (UniqueName: \"kubernetes.io/projected/6dfcc851-a0fe-45be-9752-82a11a1d06fa-kube-api-access-s49zq\") pod \"collect-profiles-29416125-jccth\" (UID: \"6dfcc851-a0fe-45be-9752-82a11a1d06fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.228031 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6dfcc851-a0fe-45be-9752-82a11a1d06fa-secret-volume\") pod \"collect-profiles-29416125-jccth\" (UID: \"6dfcc851-a0fe-45be-9752-82a11a1d06fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.288680 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.328738 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6dfcc851-a0fe-45be-9752-82a11a1d06fa-secret-volume\") pod \"collect-profiles-29416125-jccth\" (UID: \"6dfcc851-a0fe-45be-9752-82a11a1d06fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.328818 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6dfcc851-a0fe-45be-9752-82a11a1d06fa-config-volume\") pod \"collect-profiles-29416125-jccth\" (UID: \"6dfcc851-a0fe-45be-9752-82a11a1d06fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.328848 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s49zq\" (UniqueName: \"kubernetes.io/projected/6dfcc851-a0fe-45be-9752-82a11a1d06fa-kube-api-access-s49zq\") pod \"collect-profiles-29416125-jccth\" (UID: \"6dfcc851-a0fe-45be-9752-82a11a1d06fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 
20:45:00.334992 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6dfcc851-a0fe-45be-9752-82a11a1d06fa-config-volume\") pod \"collect-profiles-29416125-jccth\" (UID: \"6dfcc851-a0fe-45be-9752-82a11a1d06fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.345457 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6dfcc851-a0fe-45be-9752-82a11a1d06fa-secret-volume\") pod \"collect-profiles-29416125-jccth\" (UID: \"6dfcc851-a0fe-45be-9752-82a11a1d06fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.348328 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s49zq\" (UniqueName: \"kubernetes.io/projected/6dfcc851-a0fe-45be-9752-82a11a1d06fa-kube-api-access-s49zq\") pod \"collect-profiles-29416125-jccth\" (UID: \"6dfcc851-a0fe-45be-9752-82a11a1d06fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" Dec 05 20:45:00 crc kubenswrapper[4747]: I1205 20:45:00.464153 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" Dec 05 20:45:01 crc kubenswrapper[4747]: I1205 20:45:01.804481 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:45:01 crc kubenswrapper[4747]: I1205 20:45:01.899160 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:45:01 crc kubenswrapper[4747]: I1205 20:45:01.902934 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:45:01 crc kubenswrapper[4747]: I1205 20:45:01.979556 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f2c6617-7083-48a3-b1f9-b35880f70aeb-kube-api-access\") pod \"8f2c6617-7083-48a3-b1f9-b35880f70aeb\" (UID: \"8f2c6617-7083-48a3-b1f9-b35880f70aeb\") " Dec 05 20:45:01 crc kubenswrapper[4747]: I1205 20:45:01.979620 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f2c6617-7083-48a3-b1f9-b35880f70aeb-kubelet-dir\") pod \"8f2c6617-7083-48a3-b1f9-b35880f70aeb\" (UID: \"8f2c6617-7083-48a3-b1f9-b35880f70aeb\") " Dec 05 20:45:01 crc kubenswrapper[4747]: I1205 20:45:01.979797 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f2c6617-7083-48a3-b1f9-b35880f70aeb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8f2c6617-7083-48a3-b1f9-b35880f70aeb" (UID: "8f2c6617-7083-48a3-b1f9-b35880f70aeb"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:45:01 crc kubenswrapper[4747]: I1205 20:45:01.979861 4747 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f2c6617-7083-48a3-b1f9-b35880f70aeb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:01 crc kubenswrapper[4747]: I1205 20:45:01.988181 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f2c6617-7083-48a3-b1f9-b35880f70aeb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8f2c6617-7083-48a3-b1f9-b35880f70aeb" (UID: "8f2c6617-7083-48a3-b1f9-b35880f70aeb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:45:02 crc kubenswrapper[4747]: I1205 20:45:02.080783 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f2c6617-7083-48a3-b1f9-b35880f70aeb-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:02 crc kubenswrapper[4747]: I1205 20:45:02.274732 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 20:45:02 crc kubenswrapper[4747]: I1205 20:45:02.283284 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8f2c6617-7083-48a3-b1f9-b35880f70aeb","Type":"ContainerDied","Data":"b225559ac5e2cea39af59987cde5c01137fbc69ed83114e2281475a527d62365"} Dec 05 20:45:02 crc kubenswrapper[4747]: I1205 20:45:02.283334 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b225559ac5e2cea39af59987cde5c01137fbc69ed83114e2281475a527d62365" Dec 05 20:45:06 crc kubenswrapper[4747]: I1205 20:45:06.224135 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:45:06 crc kubenswrapper[4747]: I1205 20:45:06.224603 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:45:11 crc kubenswrapper[4747]: I1205 20:45:11.088684 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth"] Dec 05 20:45:11 crc kubenswrapper[4747]: I1205 20:45:11.091523 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dcr49"] Dec 05 20:45:11 crc kubenswrapper[4747]: E1205 20:45:11.221665 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fca7ed3_fb68_43bd_8186_7d73d673098b.slice/crio-conmon-5b1dd639fc002fc05dd37613fe7282de80c0d6176544bdd1836a501611f3f1f6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7ae41a7_9df3_4aeb_9ffe_b8bfb3bb413b.slice/crio-e6161bd3b0d65f9ed856307303873a06cf97266ac7450738b351cd615ad95237.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7ae41a7_9df3_4aeb_9ffe_b8bfb3bb413b.slice/crio-conmon-e6161bd3b0d65f9ed856307303873a06cf97266ac7450738b351cd615ad95237.scope\": RecentStats: unable to find data in memory cache]" Dec 05 20:45:11 crc kubenswrapper[4747]: I1205 20:45:11.325826 4747 generic.go:334] "Generic (PLEG): container finished" podID="2fca7ed3-fb68-43bd-8186-7d73d673098b" containerID="5b1dd639fc002fc05dd37613fe7282de80c0d6176544bdd1836a501611f3f1f6" exitCode=0 Dec 05 20:45:11 crc kubenswrapper[4747]: I1205 20:45:11.325885 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q278l" event={"ID":"2fca7ed3-fb68-43bd-8186-7d73d673098b","Type":"ContainerDied","Data":"5b1dd639fc002fc05dd37613fe7282de80c0d6176544bdd1836a501611f3f1f6"} Dec 05 20:45:11 crc kubenswrapper[4747]: I1205 20:45:11.329630 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dcr49" event={"ID":"1e860ee9-69f5-44a1-b414-deab4f78dd0d","Type":"ContainerStarted","Data":"98674b6a9b975a65db8adb1deb1113836b2a9129134e7a85b31ca2b3b9f2a8ee"} Dec 05 20:45:11 crc kubenswrapper[4747]: I1205 20:45:11.337160 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jm5r9" event={"ID":"6fd788fb-fa63-485f-bcfa-be929ef73305","Type":"ContainerStarted","Data":"aa0f45c405df6a398978b1b872495a49d66a660df9d5ca96e2e2daad58b37442"} Dec 05 20:45:11 crc kubenswrapper[4747]: I1205 20:45:11.339542 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6p67" event={"ID":"6dbcd101-ff16-4970-8433-37b2576e551b","Type":"ContainerStarted","Data":"695e055b1af5cec2e9f9b6c8054a928c19ec95a32178940e7a99b25159037893"} Dec 05 20:45:11 crc kubenswrapper[4747]: I1205 20:45:11.345161 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfqc7" event={"ID":"8441fe24-a43e-4034-816d-f78a91f89025","Type":"ContainerStarted","Data":"97caefb62daef8167e2b3b8326305449dbca1360d7a2df6d967333eec408aa06"} Dec 05 20:45:11 crc kubenswrapper[4747]: I1205 20:45:11.347631 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" event={"ID":"6dfcc851-a0fe-45be-9752-82a11a1d06fa","Type":"ContainerStarted","Data":"212ab26ef8def01ba0c512ff8c82e6f40556de4c5edd52ae42e8939373a2c8fd"} Dec 05 20:45:11 crc kubenswrapper[4747]: I1205 20:45:11.354319 4747 generic.go:334] "Generic (PLEG): container finished" podID="a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b" containerID="e6161bd3b0d65f9ed856307303873a06cf97266ac7450738b351cd615ad95237" exitCode=0 Dec 05 20:45:11 crc kubenswrapper[4747]: I1205 20:45:11.354414 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhk4l" event={"ID":"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b","Type":"ContainerDied","Data":"e6161bd3b0d65f9ed856307303873a06cf97266ac7450738b351cd615ad95237"} Dec 05 20:45:11 crc kubenswrapper[4747]: I1205 20:45:11.362085 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szprt" event={"ID":"2c3df409-857b-4fcb-afcc-fd27e65990f8","Type":"ContainerStarted","Data":"619a15383361c33b9c5b4c688c55aa9d303ae03a7daed889a1704305c1b4c2e8"} Dec 05 20:45:11 crc kubenswrapper[4747]: I1205 20:45:11.366113 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hg9x" 
event={"ID":"5efe7146-f4a8-42c3-84d9-b974a2618203","Type":"ContainerStarted","Data":"a1c167233d836b3176eb825f574246e1a819c571d3e9562cbb6f63c82a5a64b8"} Dec 05 20:45:11 crc kubenswrapper[4747]: I1205 20:45:11.394698 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g28vx" event={"ID":"8a99c798-6879-43bb-817c-621364a56b5a","Type":"ContainerStarted","Data":"c200ce3495a99f197464134557fb7268689f92818767609da8d57c7493d4ec81"} Dec 05 20:45:12 crc kubenswrapper[4747]: I1205 20:45:12.401661 4747 generic.go:334] "Generic (PLEG): container finished" podID="6fd788fb-fa63-485f-bcfa-be929ef73305" containerID="aa0f45c405df6a398978b1b872495a49d66a660df9d5ca96e2e2daad58b37442" exitCode=0 Dec 05 20:45:12 crc kubenswrapper[4747]: I1205 20:45:12.402181 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jm5r9" event={"ID":"6fd788fb-fa63-485f-bcfa-be929ef73305","Type":"ContainerDied","Data":"aa0f45c405df6a398978b1b872495a49d66a660df9d5ca96e2e2daad58b37442"} Dec 05 20:45:12 crc kubenswrapper[4747]: I1205 20:45:12.405658 4747 generic.go:334] "Generic (PLEG): container finished" podID="6dbcd101-ff16-4970-8433-37b2576e551b" containerID="695e055b1af5cec2e9f9b6c8054a928c19ec95a32178940e7a99b25159037893" exitCode=0 Dec 05 20:45:12 crc kubenswrapper[4747]: I1205 20:45:12.405752 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6p67" event={"ID":"6dbcd101-ff16-4970-8433-37b2576e551b","Type":"ContainerDied","Data":"695e055b1af5cec2e9f9b6c8054a928c19ec95a32178940e7a99b25159037893"} Dec 05 20:45:12 crc kubenswrapper[4747]: I1205 20:45:12.409437 4747 generic.go:334] "Generic (PLEG): container finished" podID="8441fe24-a43e-4034-816d-f78a91f89025" containerID="97caefb62daef8167e2b3b8326305449dbca1360d7a2df6d967333eec408aa06" exitCode=0 Dec 05 20:45:12 crc kubenswrapper[4747]: I1205 20:45:12.409536 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfqc7" event={"ID":"8441fe24-a43e-4034-816d-f78a91f89025","Type":"ContainerDied","Data":"97caefb62daef8167e2b3b8326305449dbca1360d7a2df6d967333eec408aa06"} Dec 05 20:45:12 crc kubenswrapper[4747]: I1205 20:45:12.412332 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" event={"ID":"6dfcc851-a0fe-45be-9752-82a11a1d06fa","Type":"ContainerStarted","Data":"0f778b00b91e75dd01cfde3d64c4a954b25a568b6a159983c2f43dd16d423ea1"} Dec 05 20:45:12 crc kubenswrapper[4747]: I1205 20:45:12.414725 4747 generic.go:334] "Generic (PLEG): container finished" podID="2c3df409-857b-4fcb-afcc-fd27e65990f8" containerID="619a15383361c33b9c5b4c688c55aa9d303ae03a7daed889a1704305c1b4c2e8" exitCode=0 Dec 05 20:45:12 crc kubenswrapper[4747]: I1205 20:45:12.414836 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szprt" event={"ID":"2c3df409-857b-4fcb-afcc-fd27e65990f8","Type":"ContainerDied","Data":"619a15383361c33b9c5b4c688c55aa9d303ae03a7daed889a1704305c1b4c2e8"} Dec 05 20:45:12 crc kubenswrapper[4747]: I1205 20:45:12.418050 4747 generic.go:334] "Generic (PLEG): container finished" podID="5efe7146-f4a8-42c3-84d9-b974a2618203" containerID="a1c167233d836b3176eb825f574246e1a819c571d3e9562cbb6f63c82a5a64b8" exitCode=0 Dec 05 20:45:12 crc kubenswrapper[4747]: I1205 20:45:12.418114 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hg9x" 
event={"ID":"5efe7146-f4a8-42c3-84d9-b974a2618203","Type":"ContainerDied","Data":"a1c167233d836b3176eb825f574246e1a819c571d3e9562cbb6f63c82a5a64b8"} Dec 05 20:45:12 crc kubenswrapper[4747]: I1205 20:45:12.421142 4747 generic.go:334] "Generic (PLEG): container finished" podID="8a99c798-6879-43bb-817c-621364a56b5a" containerID="c200ce3495a99f197464134557fb7268689f92818767609da8d57c7493d4ec81" exitCode=0 Dec 05 20:45:12 crc kubenswrapper[4747]: I1205 20:45:12.421186 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g28vx" event={"ID":"8a99c798-6879-43bb-817c-621364a56b5a","Type":"ContainerDied","Data":"c200ce3495a99f197464134557fb7268689f92818767609da8d57c7493d4ec81"} Dec 05 20:45:12 crc kubenswrapper[4747]: I1205 20:45:12.568095 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qt2mg"] Dec 05 20:45:13 crc kubenswrapper[4747]: I1205 20:45:13.431051 4747 generic.go:334] "Generic (PLEG): container finished" podID="6dfcc851-a0fe-45be-9752-82a11a1d06fa" containerID="0f778b00b91e75dd01cfde3d64c4a954b25a568b6a159983c2f43dd16d423ea1" exitCode=0 Dec 05 20:45:13 crc kubenswrapper[4747]: I1205 20:45:13.431136 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" event={"ID":"6dfcc851-a0fe-45be-9752-82a11a1d06fa","Type":"ContainerDied","Data":"0f778b00b91e75dd01cfde3d64c4a954b25a568b6a159983c2f43dd16d423ea1"} Dec 05 20:45:13 crc kubenswrapper[4747]: I1205 20:45:13.435547 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dcr49" event={"ID":"1e860ee9-69f5-44a1-b414-deab4f78dd0d","Type":"ContainerStarted","Data":"6e061b0ff02fd8497253b5b930ce107c3d97625987b44fffaa2c8dde9fd4506b"} Dec 05 20:45:14 crc kubenswrapper[4747]: I1205 20:45:14.443593 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dcr49" event={"ID":"1e860ee9-69f5-44a1-b414-deab4f78dd0d","Type":"ContainerStarted","Data":"ef091efda4ca96f0c13f5124c21db78594737b6def610d77a4ba388e18db12ec"} Dec 05 20:45:14 crc kubenswrapper[4747]: I1205 20:45:14.463570 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dcr49" podStartSLOduration=166.463540998 podStartE2EDuration="2m46.463540998s" podCreationTimestamp="2025-12-05 20:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:45:14.462693424 +0000 UTC m=+184.930000922" watchObservedRunningTime="2025-12-05 20:45:14.463540998 +0000 UTC m=+184.930848486" Dec 05 20:45:14 crc kubenswrapper[4747]: I1205 20:45:14.687500 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" Dec 05 20:45:14 crc kubenswrapper[4747]: I1205 20:45:14.860944 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6dfcc851-a0fe-45be-9752-82a11a1d06fa-secret-volume\") pod \"6dfcc851-a0fe-45be-9752-82a11a1d06fa\" (UID: \"6dfcc851-a0fe-45be-9752-82a11a1d06fa\") " Dec 05 20:45:14 crc kubenswrapper[4747]: I1205 20:45:14.861104 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s49zq\" (UniqueName: \"kubernetes.io/projected/6dfcc851-a0fe-45be-9752-82a11a1d06fa-kube-api-access-s49zq\") pod \"6dfcc851-a0fe-45be-9752-82a11a1d06fa\" (UID: \"6dfcc851-a0fe-45be-9752-82a11a1d06fa\") " Dec 05 20:45:14 crc kubenswrapper[4747]: I1205 20:45:14.861170 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6dfcc851-a0fe-45be-9752-82a11a1d06fa-config-volume\") pod \"6dfcc851-a0fe-45be-9752-82a11a1d06fa\" (UID: \"6dfcc851-a0fe-45be-9752-82a11a1d06fa\") " Dec 05 20:45:14 crc kubenswrapper[4747]: I1205 20:45:14.861767 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dfcc851-a0fe-45be-9752-82a11a1d06fa-config-volume" (OuterVolumeSpecName: "config-volume") pod "6dfcc851-a0fe-45be-9752-82a11a1d06fa" (UID: "6dfcc851-a0fe-45be-9752-82a11a1d06fa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:45:14 crc kubenswrapper[4747]: I1205 20:45:14.866652 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dfcc851-a0fe-45be-9752-82a11a1d06fa-kube-api-access-s49zq" (OuterVolumeSpecName: "kube-api-access-s49zq") pod "6dfcc851-a0fe-45be-9752-82a11a1d06fa" (UID: "6dfcc851-a0fe-45be-9752-82a11a1d06fa"). InnerVolumeSpecName "kube-api-access-s49zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:45:14 crc kubenswrapper[4747]: I1205 20:45:14.868689 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfcc851-a0fe-45be-9752-82a11a1d06fa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6dfcc851-a0fe-45be-9752-82a11a1d06fa" (UID: "6dfcc851-a0fe-45be-9752-82a11a1d06fa"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:45:14 crc kubenswrapper[4747]: I1205 20:45:14.962398 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s49zq\" (UniqueName: \"kubernetes.io/projected/6dfcc851-a0fe-45be-9752-82a11a1d06fa-kube-api-access-s49zq\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:14 crc kubenswrapper[4747]: I1205 20:45:14.962451 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6dfcc851-a0fe-45be-9752-82a11a1d06fa-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:14 crc kubenswrapper[4747]: I1205 20:45:14.962478 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6dfcc851-a0fe-45be-9752-82a11a1d06fa-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:15 crc kubenswrapper[4747]: I1205 20:45:15.041383 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ncr5s" Dec 05 20:45:15 crc kubenswrapper[4747]: I1205 20:45:15.450925 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" Dec 05 20:45:15 crc kubenswrapper[4747]: I1205 20:45:15.452452 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth" event={"ID":"6dfcc851-a0fe-45be-9752-82a11a1d06fa","Type":"ContainerDied","Data":"212ab26ef8def01ba0c512ff8c82e6f40556de4c5edd52ae42e8939373a2c8fd"} Dec 05 20:45:15 crc kubenswrapper[4747]: I1205 20:45:15.452495 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="212ab26ef8def01ba0c512ff8c82e6f40556de4c5edd52ae42e8939373a2c8fd" Dec 05 20:45:16 crc kubenswrapper[4747]: I1205 20:45:16.141029 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:45:18 crc kubenswrapper[4747]: I1205 20:45:18.468678 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q278l" event={"ID":"2fca7ed3-fb68-43bd-8186-7d73d673098b","Type":"ContainerStarted","Data":"ead1e49f8085bb1c5fca6294600f16a84526de5c0e40ab1c440a788fa7f1bd95"} Dec 05 20:45:19 crc kubenswrapper[4747]: I1205 20:45:19.490964 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q278l" podStartSLOduration=4.964270525 podStartE2EDuration="37.490946207s" podCreationTimestamp="2025-12-05 20:44:42 +0000 UTC" firstStartedPulling="2025-12-05 20:44:45.109678268 +0000 UTC m=+155.576985756" lastFinishedPulling="2025-12-05 20:45:17.63635391 +0000 UTC m=+188.103661438" observedRunningTime="2025-12-05 20:45:19.490032492 +0000 UTC m=+189.957339990" watchObservedRunningTime="2025-12-05 20:45:19.490946207 +0000 UTC m=+189.958253705" Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.037368 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q278l" Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.037687 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q278l" Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.130721 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-q278l" Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.508946 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfqc7" event={"ID":"8441fe24-a43e-4034-816d-f78a91f89025","Type":"ContainerStarted","Data":"f3efdbfa7d1666b063e1a1a8328f4551a85604797b4003672be140eeda4f1e7c"} Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.511292 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szprt" event={"ID":"2c3df409-857b-4fcb-afcc-fd27e65990f8","Type":"ContainerStarted","Data":"a32e10379bf17b5ee06e42fd21d60f03d0b4e6065a1023bb745b75c04da1e97c"} Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.514288 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hg9x" event={"ID":"5efe7146-f4a8-42c3-84d9-b974a2618203","Type":"ContainerStarted","Data":"3cd46fe1ecb18ec2ded7b8ee6febfe1968fb9efac652d2c22215b1c95c3561c9"} Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.517177 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g28vx" event={"ID":"8a99c798-6879-43bb-817c-621364a56b5a","Type":"ContainerStarted","Data":"942667f8c50bc07db797a2535584fa180d6fc5154f7deb9dc692b902ce902834"} Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.528347 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jm5r9" event={"ID":"6fd788fb-fa63-485f-bcfa-be929ef73305","Type":"ContainerStarted","Data":"d0b9a9aeb01e3a4ce65b821d2b51290abb1e9624e70ff173b0d7bbb99a49d7af"} Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.536121 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6p67" event={"ID":"6dbcd101-ff16-4970-8433-37b2576e551b","Type":"ContainerStarted","Data":"45286baeaa7e47334adb49042feacd4d1e3c32b9b6280b6c9fb30314b8a80ea9"} Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.540668 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhk4l" event={"ID":"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b","Type":"ContainerStarted","Data":"4bd30eff53dec573e91807cc1b41afaf6424af7bf55585ca5c05bffc40add6c3"} Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.580963 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zfqc7" podStartSLOduration=3.003945032 podStartE2EDuration="42.580911465s" podCreationTimestamp="2025-12-05 20:44:41 +0000 UTC" firstStartedPulling="2025-12-05 20:44:43.055990746 +0000 UTC m=+153.523298234" lastFinishedPulling="2025-12-05 20:45:22.632957179 +0000 UTC m=+193.100264667" observedRunningTime="2025-12-05 20:45:23.546837772 +0000 UTC m=+194.014145290" watchObservedRunningTime="2025-12-05 20:45:23.580911465 +0000 UTC m=+194.048218953" Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.581474 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8hg9x" podStartSLOduration=3.053016933 podStartE2EDuration="43.58146886s" podCreationTimestamp="2025-12-05 20:44:40 +0000 UTC" firstStartedPulling="2025-12-05 20:44:41.997459319 +0000 UTC m=+152.464766807" lastFinishedPulling="2025-12-05 20:45:22.525911246 +0000 UTC m=+192.993218734" observedRunningTime="2025-12-05 20:45:23.58037669 +0000 UTC m=+194.047684188" watchObservedRunningTime="2025-12-05 20:45:23.58146886 +0000 
UTC m=+194.048776348" Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.596089 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q278l" Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.610436 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jm5r9" podStartSLOduration=3.237135592 podStartE2EDuration="39.610406729s" podCreationTimestamp="2025-12-05 20:44:44 +0000 UTC" firstStartedPulling="2025-12-05 20:44:46.12654443 +0000 UTC m=+156.593851918" lastFinishedPulling="2025-12-05 20:45:22.499815567 +0000 UTC m=+192.967123055" observedRunningTime="2025-12-05 20:45:23.606407737 +0000 UTC m=+194.073715225" watchObservedRunningTime="2025-12-05 20:45:23.610406729 +0000 UTC m=+194.077714217" Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.689195 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-szprt" podStartSLOduration=2.168891933 podStartE2EDuration="42.689170891s" podCreationTimestamp="2025-12-05 20:44:41 +0000 UTC" firstStartedPulling="2025-12-05 20:44:41.993390136 +0000 UTC m=+152.460697624" lastFinishedPulling="2025-12-05 20:45:22.513669094 +0000 UTC m=+192.980976582" observedRunningTime="2025-12-05 20:45:23.664169412 +0000 UTC m=+194.131476900" watchObservedRunningTime="2025-12-05 20:45:23.689170891 +0000 UTC m=+194.156478379" Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.691380 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g6p67" podStartSLOduration=2.93951158 podStartE2EDuration="43.691373812s" podCreationTimestamp="2025-12-05 20:44:40 +0000 UTC" firstStartedPulling="2025-12-05 20:44:42.00608102 +0000 UTC m=+152.473388508" lastFinishedPulling="2025-12-05 20:45:22.757943252 +0000 UTC m=+193.225250740" observedRunningTime="2025-12-05 20:45:23.689356086 +0000 UTC m=+194.156663574" watchObservedRunningTime="2025-12-05 20:45:23.691373812 +0000 UTC m=+194.158681290" Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.718007 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g28vx" podStartSLOduration=4.2452982 podStartE2EDuration="40.717988876s" podCreationTimestamp="2025-12-05 20:44:43 +0000 UTC" firstStartedPulling="2025-12-05 20:44:46.149108901 +0000 UTC m=+156.616416389" lastFinishedPulling="2025-12-05 20:45:22.621799557 +0000 UTC m=+193.089107065" observedRunningTime="2025-12-05 20:45:23.715551018 +0000 UTC m=+194.182858506" watchObservedRunningTime="2025-12-05 20:45:23.717988876 +0000 UTC m=+194.185296364" Dec 05 20:45:23 crc kubenswrapper[4747]: I1205 20:45:23.745706 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mhk4l" podStartSLOduration=3.390129567 podStartE2EDuration="40.74567782s" podCreationTimestamp="2025-12-05 20:44:43 +0000 UTC" firstStartedPulling="2025-12-05 20:44:45.102468236 +0000 UTC m=+155.569775724" lastFinishedPulling="2025-12-05 20:45:22.458016479 +0000 UTC m=+192.925323977" observedRunningTime="2025-12-05 20:45:23.743250762 +0000 UTC m=+194.210558250" watchObservedRunningTime="2025-12-05 20:45:23.74567782 +0000 UTC m=+194.212985308" Dec 05 20:45:24 crc kubenswrapper[4747]: I1205 20:45:24.212895 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g28vx" Dec 05 
20:45:24 crc kubenswrapper[4747]: I1205 20:45:24.212964 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g28vx" Dec 05 20:45:24 crc kubenswrapper[4747]: I1205 20:45:24.619196 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jm5r9" Dec 05 20:45:24 crc kubenswrapper[4747]: I1205 20:45:24.619255 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jm5r9" Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.256391 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g28vx" podUID="8a99c798-6879-43bb-817c-621364a56b5a" containerName="registry-server" probeResult="failure" output=< Dec 05 20:45:25 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Dec 05 20:45:25 crc kubenswrapper[4747]: > Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.500857 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 20:45:25 crc kubenswrapper[4747]: E1205 20:45:25.501111 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f2c6617-7083-48a3-b1f9-b35880f70aeb" containerName="pruner" Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.501127 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2c6617-7083-48a3-b1f9-b35880f70aeb" containerName="pruner" Dec 05 20:45:25 crc kubenswrapper[4747]: E1205 20:45:25.501141 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfcc851-a0fe-45be-9752-82a11a1d06fa" containerName="collect-profiles" Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.501150 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfcc851-a0fe-45be-9752-82a11a1d06fa" containerName="collect-profiles" Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.501276 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f2c6617-7083-48a3-b1f9-b35880f70aeb" containerName="pruner" Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.501296 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dfcc851-a0fe-45be-9752-82a11a1d06fa" containerName="collect-profiles" Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.501771 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.504626 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.504999 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.509187 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.617885 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93b006c1-39ff-4055-afbb-17046c6eed26-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"93b006c1-39ff-4055-afbb-17046c6eed26\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.617948 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93b006c1-39ff-4055-afbb-17046c6eed26-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"93b006c1-39ff-4055-afbb-17046c6eed26\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.662086 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jm5r9" podUID="6fd788fb-fa63-485f-bcfa-be929ef73305" containerName="registry-server" probeResult="failure" output=< Dec 05 20:45:25 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Dec 05 20:45:25 crc kubenswrapper[4747]: > Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.718991 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93b006c1-39ff-4055-afbb-17046c6eed26-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"93b006c1-39ff-4055-afbb-17046c6eed26\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.719126 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93b006c1-39ff-4055-afbb-17046c6eed26-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"93b006c1-39ff-4055-afbb-17046c6eed26\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.720976 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93b006c1-39ff-4055-afbb-17046c6eed26-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"93b006c1-39ff-4055-afbb-17046c6eed26\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.745962 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93b006c1-39ff-4055-afbb-17046c6eed26-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"93b006c1-39ff-4055-afbb-17046c6eed26\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:45:25 crc kubenswrapper[4747]: I1205 20:45:25.831165 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:45:26 crc kubenswrapper[4747]: I1205 20:45:26.143192 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 20:45:26 crc kubenswrapper[4747]: W1205 20:45:26.153913 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod93b006c1_39ff_4055_afbb_17046c6eed26.slice/crio-c59903dc3db5978f42f591cfb1e92783d225b753239087d3bf60417c17c0bc59 WatchSource:0}: Error finding container c59903dc3db5978f42f591cfb1e92783d225b753239087d3bf60417c17c0bc59: Status 404 returned error can't find the container with id c59903dc3db5978f42f591cfb1e92783d225b753239087d3bf60417c17c0bc59 Dec 05 20:45:26 crc kubenswrapper[4747]: I1205 20:45:26.556274 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"93b006c1-39ff-4055-afbb-17046c6eed26","Type":"ContainerStarted","Data":"c59903dc3db5978f42f591cfb1e92783d225b753239087d3bf60417c17c0bc59"} Dec 05 20:45:29 crc kubenswrapper[4747]: I1205 20:45:29.579575 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"93b006c1-39ff-4055-afbb-17046c6eed26","Type":"ContainerStarted","Data":"061ba7e8bed1d394dd31e3457bd5389c0730daa176521ac8d207e9640ccecfbd"} Dec 05 20:45:29 crc kubenswrapper[4747]: I1205 20:45:29.606699 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=4.6066685419999995 podStartE2EDuration="4.606668542s" podCreationTimestamp="2025-12-05 20:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:45:29.604252528 +0000 UTC m=+200.071560056" watchObservedRunningTime="2025-12-05 20:45:29.606668542 +0000 UTC m=+200.073976080" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.009346 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8hg9x" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.009401 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8hg9x" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.056309 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8hg9x" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.209221 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g6p67" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.209345 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g6p67" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.254551 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g6p67" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.400049 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-szprt" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.400392 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-szprt" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 
20:45:31.440387 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-szprt" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.601428 4747 generic.go:334] "Generic (PLEG): container finished" podID="93b006c1-39ff-4055-afbb-17046c6eed26" containerID="061ba7e8bed1d394dd31e3457bd5389c0730daa176521ac8d207e9640ccecfbd" exitCode=0 Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.601501 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"93b006c1-39ff-4055-afbb-17046c6eed26","Type":"ContainerDied","Data":"061ba7e8bed1d394dd31e3457bd5389c0730daa176521ac8d207e9640ccecfbd"} Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.609628 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zfqc7" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.609903 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zfqc7" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.639878 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-szprt" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.648350 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8hg9x" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.648471 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g6p67" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.676087 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zfqc7" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.745546 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.747135 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.764979 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.936640 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44967e00-5c0f-4467-844e-d7b885ce2c98-kube-api-access\") pod \"installer-9-crc\" (UID: \"44967e00-5c0f-4467-844e-d7b885ce2c98\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.936698 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/44967e00-5c0f-4467-844e-d7b885ce2c98-var-lock\") pod \"installer-9-crc\" (UID: \"44967e00-5c0f-4467-844e-d7b885ce2c98\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:45:31 crc kubenswrapper[4747]: I1205 20:45:31.936723 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44967e00-5c0f-4467-844e-d7b885ce2c98-kubelet-dir\") pod \"installer-9-crc\" (UID: \"44967e00-5c0f-4467-844e-d7b885ce2c98\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:45:32 crc kubenswrapper[4747]: I1205 20:45:32.038178 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44967e00-5c0f-4467-844e-d7b885ce2c98-kube-api-access\") pod \"installer-9-crc\" (UID: \"44967e00-5c0f-4467-844e-d7b885ce2c98\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:45:32 crc kubenswrapper[4747]: I1205 20:45:32.038311 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/44967e00-5c0f-4467-844e-d7b885ce2c98-var-lock\") pod \"installer-9-crc\" (UID: \"44967e00-5c0f-4467-844e-d7b885ce2c98\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:45:32 crc kubenswrapper[4747]: I1205 20:45:32.038338 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44967e00-5c0f-4467-844e-d7b885ce2c98-kubelet-dir\") pod \"installer-9-crc\" (UID: \"44967e00-5c0f-4467-844e-d7b885ce2c98\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:45:32 crc kubenswrapper[4747]: I1205 20:45:32.038380 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/44967e00-5c0f-4467-844e-d7b885ce2c98-var-lock\") pod \"installer-9-crc\" (UID: \"44967e00-5c0f-4467-844e-d7b885ce2c98\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:45:32 crc kubenswrapper[4747]: I1205 20:45:32.038441 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44967e00-5c0f-4467-844e-d7b885ce2c98-kubelet-dir\") pod \"installer-9-crc\" (UID: \"44967e00-5c0f-4467-844e-d7b885ce2c98\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:45:32 crc kubenswrapper[4747]: I1205 20:45:32.069437 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44967e00-5c0f-4467-844e-d7b885ce2c98-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"44967e00-5c0f-4467-844e-d7b885ce2c98\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:45:32 crc kubenswrapper[4747]: I1205 20:45:32.369534 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:45:32 crc kubenswrapper[4747]: I1205 20:45:32.607280 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 20:45:32 crc kubenswrapper[4747]: W1205 20:45:32.618161 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod44967e00_5c0f_4467_844e_d7b885ce2c98.slice/crio-ded6922d6213d7302478f4c02a8fb7bdf34d64b612f4ff36640379b3e0e5eb76 WatchSource:0}: Error finding container ded6922d6213d7302478f4c02a8fb7bdf34d64b612f4ff36640379b3e0e5eb76: Status 404 returned error can't find the container with id ded6922d6213d7302478f4c02a8fb7bdf34d64b612f4ff36640379b3e0e5eb76 Dec 05 20:45:32 crc kubenswrapper[4747]: I1205 20:45:32.672823 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zfqc7" Dec 05 20:45:32 crc kubenswrapper[4747]: I1205 20:45:32.830084 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:45:32 crc kubenswrapper[4747]: I1205 20:45:32.954005 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93b006c1-39ff-4055-afbb-17046c6eed26-kube-api-access\") pod \"93b006c1-39ff-4055-afbb-17046c6eed26\" (UID: \"93b006c1-39ff-4055-afbb-17046c6eed26\") " Dec 05 20:45:32 crc kubenswrapper[4747]: I1205 20:45:32.954145 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93b006c1-39ff-4055-afbb-17046c6eed26-kubelet-dir\") pod \"93b006c1-39ff-4055-afbb-17046c6eed26\" (UID: \"93b006c1-39ff-4055-afbb-17046c6eed26\") " Dec 05 20:45:32 crc kubenswrapper[4747]: I1205 20:45:32.954305 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93b006c1-39ff-4055-afbb-17046c6eed26-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "93b006c1-39ff-4055-afbb-17046c6eed26" (UID: "93b006c1-39ff-4055-afbb-17046c6eed26"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:45:32 crc kubenswrapper[4747]: I1205 20:45:32.958864 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93b006c1-39ff-4055-afbb-17046c6eed26-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "93b006c1-39ff-4055-afbb-17046c6eed26" (UID: "93b006c1-39ff-4055-afbb-17046c6eed26"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:45:33 crc kubenswrapper[4747]: I1205 20:45:33.056879 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93b006c1-39ff-4055-afbb-17046c6eed26-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:33 crc kubenswrapper[4747]: I1205 20:45:33.057043 4747 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93b006c1-39ff-4055-afbb-17046c6eed26-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:33 crc kubenswrapper[4747]: I1205 20:45:33.447975 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mhk4l" Dec 05 20:45:33 crc kubenswrapper[4747]: I1205 20:45:33.448027 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mhk4l" Dec 05 20:45:33 crc kubenswrapper[4747]: I1205 20:45:33.491014 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mhk4l" Dec 05 20:45:33 crc kubenswrapper[4747]: I1205 20:45:33.627627 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"93b006c1-39ff-4055-afbb-17046c6eed26","Type":"ContainerDied","Data":"c59903dc3db5978f42f591cfb1e92783d225b753239087d3bf60417c17c0bc59"} Dec 05 20:45:33 crc kubenswrapper[4747]: I1205 20:45:33.627678 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c59903dc3db5978f42f591cfb1e92783d225b753239087d3bf60417c17c0bc59" Dec 05 20:45:33 crc kubenswrapper[4747]: I1205 20:45:33.628485 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:45:33 crc kubenswrapper[4747]: I1205 20:45:33.629364 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"44967e00-5c0f-4467-844e-d7b885ce2c98","Type":"ContainerStarted","Data":"3acf9f6133cfee9471c6f6bbb23b7418cd229d8a6f4f5e84ccf66060c8d7a803"} Dec 05 20:45:33 crc kubenswrapper[4747]: I1205 20:45:33.629488 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"44967e00-5c0f-4467-844e-d7b885ce2c98","Type":"ContainerStarted","Data":"ded6922d6213d7302478f4c02a8fb7bdf34d64b612f4ff36640379b3e0e5eb76"} Dec 05 20:45:33 crc kubenswrapper[4747]: I1205 20:45:33.675157 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mhk4l" Dec 05 20:45:33 crc kubenswrapper[4747]: I1205 20:45:33.693278 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.693255424 podStartE2EDuration="2.693255424s" podCreationTimestamp="2025-12-05 20:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:45:33.662887239 +0000 UTC m=+204.130194757" watchObservedRunningTime="2025-12-05 20:45:33.693255424 +0000 UTC m=+204.160562922" Dec 05 20:45:34 crc kubenswrapper[4747]: I1205 20:45:34.284248 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g28vx" Dec 05 20:45:34 crc kubenswrapper[4747]: I1205 20:45:34.329066 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g28vx" Dec 05 20:45:34 crc kubenswrapper[4747]: I1205 20:45:34.671726 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jm5r9" Dec 05 20:45:34 crc kubenswrapper[4747]: I1205 20:45:34.724947 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jm5r9" Dec 05 20:45:34 crc kubenswrapper[4747]: I1205 20:45:34.982390 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zfqc7"] Dec 05 20:45:35 crc kubenswrapper[4747]: I1205 20:45:35.645998 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zfqc7" podUID="8441fe24-a43e-4034-816d-f78a91f89025" containerName="registry-server" containerID="cri-o://f3efdbfa7d1666b063e1a1a8328f4551a85604797b4003672be140eeda4f1e7c" gracePeriod=2 Dec 05 20:45:35 crc kubenswrapper[4747]: I1205 20:45:35.976758 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-szprt"] Dec 05 20:45:35 crc kubenswrapper[4747]: I1205 20:45:35.977018 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-szprt" podUID="2c3df409-857b-4fcb-afcc-fd27e65990f8" containerName="registry-server" containerID="cri-o://a32e10379bf17b5ee06e42fd21d60f03d0b4e6065a1023bb745b75c04da1e97c" gracePeriod=2 Dec 05 20:45:36 crc kubenswrapper[4747]: I1205 20:45:36.222068 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:45:36 crc kubenswrapper[4747]: I1205 20:45:36.222249 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:45:36 crc kubenswrapper[4747]: I1205 20:45:36.222343 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:45:36 crc kubenswrapper[4747]: I1205 20:45:36.224959 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:45:36 crc kubenswrapper[4747]: I1205 20:45:36.225165 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd" gracePeriod=600 Dec 05 20:45:37 crc kubenswrapper[4747]: I1205 20:45:37.379566 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhk4l"] Dec 05 20:45:37 crc kubenswrapper[4747]: I1205 20:45:37.379949 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mhk4l" podUID="a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b" containerName="registry-server" containerID="cri-o://4bd30eff53dec573e91807cc1b41afaf6424af7bf55585ca5c05bffc40add6c3" gracePeriod=2 Dec 05 20:45:37 crc kubenswrapper[4747]: I1205 20:45:37.610269 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" podUID="ae41eabe-e336-4ef9-9f65-022996a62860" containerName="oauth-openshift" containerID="cri-o://570ccc07b8765b8c7c1d6a09c7cc3018a067e1e96a57cc4ab3572f68fa3ce784" gracePeriod=15 Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.383771 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jm5r9"] Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.385055 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jm5r9" podUID="6fd788fb-fa63-485f-bcfa-be929ef73305" containerName="registry-server" containerID="cri-o://d0b9a9aeb01e3a4ce65b821d2b51290abb1e9624e70ff173b0d7bbb99a49d7af" gracePeriod=2 Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.672975 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd" exitCode=0 Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.673316 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" 
event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd"} Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.678383 4747 generic.go:334] "Generic (PLEG): container finished" podID="6fd788fb-fa63-485f-bcfa-be929ef73305" containerID="d0b9a9aeb01e3a4ce65b821d2b51290abb1e9624e70ff173b0d7bbb99a49d7af" exitCode=0 Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.678439 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jm5r9" event={"ID":"6fd788fb-fa63-485f-bcfa-be929ef73305","Type":"ContainerDied","Data":"d0b9a9aeb01e3a4ce65b821d2b51290abb1e9624e70ff173b0d7bbb99a49d7af"} Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.680666 4747 generic.go:334] "Generic (PLEG): container finished" podID="ae41eabe-e336-4ef9-9f65-022996a62860" containerID="570ccc07b8765b8c7c1d6a09c7cc3018a067e1e96a57cc4ab3572f68fa3ce784" exitCode=0 Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.680704 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" event={"ID":"ae41eabe-e336-4ef9-9f65-022996a62860","Type":"ContainerDied","Data":"570ccc07b8765b8c7c1d6a09c7cc3018a067e1e96a57cc4ab3572f68fa3ce784"} Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.684062 4747 generic.go:334] "Generic (PLEG): container finished" podID="a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b" containerID="4bd30eff53dec573e91807cc1b41afaf6424af7bf55585ca5c05bffc40add6c3" exitCode=0 Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.684110 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhk4l" event={"ID":"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b","Type":"ContainerDied","Data":"4bd30eff53dec573e91807cc1b41afaf6424af7bf55585ca5c05bffc40add6c3"} Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.695181 4747 generic.go:334] "Generic (PLEG): container finished" podID="8441fe24-a43e-4034-816d-f78a91f89025" containerID="f3efdbfa7d1666b063e1a1a8328f4551a85604797b4003672be140eeda4f1e7c" exitCode=0 Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.695260 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfqc7" event={"ID":"8441fe24-a43e-4034-816d-f78a91f89025","Type":"ContainerDied","Data":"f3efdbfa7d1666b063e1a1a8328f4551a85604797b4003672be140eeda4f1e7c"} Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.711858 4747 generic.go:334] "Generic (PLEG): container finished" podID="2c3df409-857b-4fcb-afcc-fd27e65990f8" containerID="a32e10379bf17b5ee06e42fd21d60f03d0b4e6065a1023bb745b75c04da1e97c" exitCode=0 Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.711940 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szprt" event={"ID":"2c3df409-857b-4fcb-afcc-fd27e65990f8","Type":"ContainerDied","Data":"a32e10379bf17b5ee06e42fd21d60f03d0b4e6065a1023bb745b75c04da1e97c"} Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.871136 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhk4l" Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.946445 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-utilities\") pod \"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b\" (UID: \"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b\") " Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.946509 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-catalog-content\") pod \"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b\" (UID: \"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b\") " Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.946557 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rrwt\" (UniqueName: \"kubernetes.io/projected/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-kube-api-access-8rrwt\") pod \"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b\" (UID: \"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b\") " Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.947167 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-utilities" (OuterVolumeSpecName: "utilities") pod "a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b" (UID: "a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.952142 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-kube-api-access-8rrwt" (OuterVolumeSpecName: "kube-api-access-8rrwt") pod "a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b" (UID: "a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b"). InnerVolumeSpecName "kube-api-access-8rrwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:45:38 crc kubenswrapper[4747]: I1205 20:45:38.974464 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b" (UID: "a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.015051 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-szprt" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.057122 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.057164 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rrwt\" (UniqueName: \"kubernetes.io/projected/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-kube-api-access-8rrwt\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.057185 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.088571 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.157719 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-session\") pod \"ae41eabe-e336-4ef9-9f65-022996a62860\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158103 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ae41eabe-e336-4ef9-9f65-022996a62860-audit-dir\") pod \"ae41eabe-e336-4ef9-9f65-022996a62860\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158173 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-cliconfig\") pod \"ae41eabe-e336-4ef9-9f65-022996a62860\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158208 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-trusted-ca-bundle\") pod \"ae41eabe-e336-4ef9-9f65-022996a62860\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158268 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-serving-cert\") pod \"ae41eabe-e336-4ef9-9f65-022996a62860\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158297 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3df409-857b-4fcb-afcc-fd27e65990f8-catalog-content\") pod \"2c3df409-857b-4fcb-afcc-fd27e65990f8\" (UID: \"2c3df409-857b-4fcb-afcc-fd27e65990f8\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158322 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-idp-0-file-data\") pod \"ae41eabe-e336-4ef9-9f65-022996a62860\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158347 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3df409-857b-4fcb-afcc-fd27e65990f8-utilities\") pod \"2c3df409-857b-4fcb-afcc-fd27e65990f8\" (UID: \"2c3df409-857b-4fcb-afcc-fd27e65990f8\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158376 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-provider-selection\") pod \"ae41eabe-e336-4ef9-9f65-022996a62860\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158400 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28rdj\" (UniqueName: \"kubernetes.io/projected/ae41eabe-e336-4ef9-9f65-022996a62860-kube-api-access-28rdj\") pod \"ae41eabe-e336-4ef9-9f65-022996a62860\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158433 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-router-certs\") pod \"ae41eabe-e336-4ef9-9f65-022996a62860\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158478 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-login\") pod \"ae41eabe-e336-4ef9-9f65-022996a62860\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158509 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-error\") pod \"ae41eabe-e336-4ef9-9f65-022996a62860\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158542 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-ocp-branding-template\") pod \"ae41eabe-e336-4ef9-9f65-022996a62860\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158571 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s45qz\" (UniqueName: \"kubernetes.io/projected/2c3df409-857b-4fcb-afcc-fd27e65990f8-kube-api-access-s45qz\") pod \"2c3df409-857b-4fcb-afcc-fd27e65990f8\" (UID: \"2c3df409-857b-4fcb-afcc-fd27e65990f8\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158621 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-audit-policies\") pod \"ae41eabe-e336-4ef9-9f65-022996a62860\" (UID: 
\"ae41eabe-e336-4ef9-9f65-022996a62860\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.158646 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-service-ca\") pod \"ae41eabe-e336-4ef9-9f65-022996a62860\" (UID: \"ae41eabe-e336-4ef9-9f65-022996a62860\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.159569 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ae41eabe-e336-4ef9-9f65-022996a62860" (UID: "ae41eabe-e336-4ef9-9f65-022996a62860"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.160896 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae41eabe-e336-4ef9-9f65-022996a62860-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ae41eabe-e336-4ef9-9f65-022996a62860" (UID: "ae41eabe-e336-4ef9-9f65-022996a62860"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.161860 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ae41eabe-e336-4ef9-9f65-022996a62860" (UID: "ae41eabe-e336-4ef9-9f65-022996a62860"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.162947 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ae41eabe-e336-4ef9-9f65-022996a62860" (UID: "ae41eabe-e336-4ef9-9f65-022996a62860"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.163271 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3df409-857b-4fcb-afcc-fd27e65990f8-utilities" (OuterVolumeSpecName: "utilities") pod "2c3df409-857b-4fcb-afcc-fd27e65990f8" (UID: "2c3df409-857b-4fcb-afcc-fd27e65990f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.163462 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "ae41eabe-e336-4ef9-9f65-022996a62860" (UID: "ae41eabe-e336-4ef9-9f65-022996a62860"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.163925 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ae41eabe-e336-4ef9-9f65-022996a62860" (UID: "ae41eabe-e336-4ef9-9f65-022996a62860"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.164742 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3df409-857b-4fcb-afcc-fd27e65990f8-kube-api-access-s45qz" (OuterVolumeSpecName: "kube-api-access-s45qz") pod "2c3df409-857b-4fcb-afcc-fd27e65990f8" (UID: "2c3df409-857b-4fcb-afcc-fd27e65990f8"). InnerVolumeSpecName "kube-api-access-s45qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.165228 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ae41eabe-e336-4ef9-9f65-022996a62860" (UID: "ae41eabe-e336-4ef9-9f65-022996a62860"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.166772 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "ae41eabe-e336-4ef9-9f65-022996a62860" (UID: "ae41eabe-e336-4ef9-9f65-022996a62860"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.167471 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ae41eabe-e336-4ef9-9f65-022996a62860" (UID: "ae41eabe-e336-4ef9-9f65-022996a62860"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.168557 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ae41eabe-e336-4ef9-9f65-022996a62860" (UID: "ae41eabe-e336-4ef9-9f65-022996a62860"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.168870 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae41eabe-e336-4ef9-9f65-022996a62860-kube-api-access-28rdj" (OuterVolumeSpecName: "kube-api-access-28rdj") pod "ae41eabe-e336-4ef9-9f65-022996a62860" (UID: "ae41eabe-e336-4ef9-9f65-022996a62860"). InnerVolumeSpecName "kube-api-access-28rdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.169819 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ae41eabe-e336-4ef9-9f65-022996a62860" (UID: "ae41eabe-e336-4ef9-9f65-022996a62860"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.170996 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ae41eabe-e336-4ef9-9f65-022996a62860" (UID: "ae41eabe-e336-4ef9-9f65-022996a62860"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.172272 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ae41eabe-e336-4ef9-9f65-022996a62860" (UID: "ae41eabe-e336-4ef9-9f65-022996a62860"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.203296 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jm5r9" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.238478 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3df409-857b-4fcb-afcc-fd27e65990f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c3df409-857b-4fcb-afcc-fd27e65990f8" (UID: "2c3df409-857b-4fcb-afcc-fd27e65990f8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260219 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd788fb-fa63-485f-bcfa-be929ef73305-catalog-content\") pod \"6fd788fb-fa63-485f-bcfa-be929ef73305\" (UID: \"6fd788fb-fa63-485f-bcfa-be929ef73305\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260262 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd788fb-fa63-485f-bcfa-be929ef73305-utilities\") pod \"6fd788fb-fa63-485f-bcfa-be929ef73305\" (UID: \"6fd788fb-fa63-485f-bcfa-be929ef73305\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260288 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjt5h\" (UniqueName: \"kubernetes.io/projected/6fd788fb-fa63-485f-bcfa-be929ef73305-kube-api-access-rjt5h\") pod \"6fd788fb-fa63-485f-bcfa-be929ef73305\" (UID: \"6fd788fb-fa63-485f-bcfa-be929ef73305\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260565 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260610 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3df409-857b-4fcb-afcc-fd27e65990f8-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260627 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260642 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28rdj\" (UniqueName: \"kubernetes.io/projected/ae41eabe-e336-4ef9-9f65-022996a62860-kube-api-access-28rdj\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260653 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260664 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260674 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260683 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 
20:45:39.260695 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s45qz\" (UniqueName: \"kubernetes.io/projected/2c3df409-857b-4fcb-afcc-fd27e65990f8-kube-api-access-s45qz\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260704 4747 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260712 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260720 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260731 4747 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ae41eabe-e336-4ef9-9f65-022996a62860-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260740 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260749 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260759 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae41eabe-e336-4ef9-9f65-022996a62860-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.260768 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3df409-857b-4fcb-afcc-fd27e65990f8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.261324 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd788fb-fa63-485f-bcfa-be929ef73305-utilities" (OuterVolumeSpecName: "utilities") pod "6fd788fb-fa63-485f-bcfa-be929ef73305" (UID: "6fd788fb-fa63-485f-bcfa-be929ef73305"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.263850 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd788fb-fa63-485f-bcfa-be929ef73305-kube-api-access-rjt5h" (OuterVolumeSpecName: "kube-api-access-rjt5h") pod "6fd788fb-fa63-485f-bcfa-be929ef73305" (UID: "6fd788fb-fa63-485f-bcfa-be929ef73305"). InnerVolumeSpecName "kube-api-access-rjt5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.362165 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd788fb-fa63-485f-bcfa-be929ef73305-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.362250 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjt5h\" (UniqueName: \"kubernetes.io/projected/6fd788fb-fa63-485f-bcfa-be929ef73305-kube-api-access-rjt5h\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.362726 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd788fb-fa63-485f-bcfa-be929ef73305-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fd788fb-fa63-485f-bcfa-be929ef73305" (UID: "6fd788fb-fa63-485f-bcfa-be929ef73305"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.463404 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd788fb-fa63-485f-bcfa-be929ef73305-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.693899 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zfqc7" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.727159 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhk4l" event={"ID":"a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b","Type":"ContainerDied","Data":"640ec5944e59587087fdd6351b44b440597e1e37288e0924ed24a8b0d5b5dbf4"} Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.727214 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhk4l" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.727228 4747 scope.go:117] "RemoveContainer" containerID="4bd30eff53dec573e91807cc1b41afaf6424af7bf55585ca5c05bffc40add6c3" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.729863 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zfqc7" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.729866 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfqc7" event={"ID":"8441fe24-a43e-4034-816d-f78a91f89025","Type":"ContainerDied","Data":"a0db2ab60287c8eb37c2e4d08f55e1ff620c53cfd6a64cf9e82909f2cae5f406"} Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.736428 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szprt" event={"ID":"2c3df409-857b-4fcb-afcc-fd27e65990f8","Type":"ContainerDied","Data":"5ed0d4b93bb9e47f01e3b9c0be692832a8c8464ab92e0bd44d3e9a27d5e2fc50"} Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.736558 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-szprt" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.740736 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"12e50c23a537be5927e47622f92414124530418f175ea6c0995459f9b65026ea"} Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.745206 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jm5r9" event={"ID":"6fd788fb-fa63-485f-bcfa-be929ef73305","Type":"ContainerDied","Data":"024c7cea0e013199a941cda90192b2bee9a7d16ee1c84f211ea1b319b8cca95d"} Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.745225 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jm5r9" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.747652 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" event={"ID":"ae41eabe-e336-4ef9-9f65-022996a62860","Type":"ContainerDied","Data":"83760a4ac4ab3954ec9a2b30c57147ed599cbd9137833e14dddc5ae30470a320"} Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.747727 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qt2mg" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.753715 4747 scope.go:117] "RemoveContainer" containerID="e6161bd3b0d65f9ed856307303873a06cf97266ac7450738b351cd615ad95237" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.766555 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8441fe24-a43e-4034-816d-f78a91f89025-catalog-content\") pod \"8441fe24-a43e-4034-816d-f78a91f89025\" (UID: \"8441fe24-a43e-4034-816d-f78a91f89025\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.766616 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnk5x\" (UniqueName: \"kubernetes.io/projected/8441fe24-a43e-4034-816d-f78a91f89025-kube-api-access-mnk5x\") pod \"8441fe24-a43e-4034-816d-f78a91f89025\" (UID: \"8441fe24-a43e-4034-816d-f78a91f89025\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.766713 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8441fe24-a43e-4034-816d-f78a91f89025-utilities\") pod \"8441fe24-a43e-4034-816d-f78a91f89025\" (UID: \"8441fe24-a43e-4034-816d-f78a91f89025\") " Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.771568 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8441fe24-a43e-4034-816d-f78a91f89025-utilities" (OuterVolumeSpecName: "utilities") pod "8441fe24-a43e-4034-816d-f78a91f89025" (UID: "8441fe24-a43e-4034-816d-f78a91f89025"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.776042 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhk4l"] Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.776087 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8441fe24-a43e-4034-816d-f78a91f89025-kube-api-access-mnk5x" (OuterVolumeSpecName: "kube-api-access-mnk5x") pod "8441fe24-a43e-4034-816d-f78a91f89025" (UID: "8441fe24-a43e-4034-816d-f78a91f89025"). InnerVolumeSpecName "kube-api-access-mnk5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.778997 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhk4l"] Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.817113 4747 scope.go:117] "RemoveContainer" containerID="935feb09f4e788f3f1205ba09819cb732b5a0d7cdd53e83c62d11a753587aa0e" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.821055 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8441fe24-a43e-4034-816d-f78a91f89025-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8441fe24-a43e-4034-816d-f78a91f89025" (UID: "8441fe24-a43e-4034-816d-f78a91f89025"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.843112 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jm5r9"] Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.846102 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jm5r9"] Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.854531 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd788fb-fa63-485f-bcfa-be929ef73305" path="/var/lib/kubelet/pods/6fd788fb-fa63-485f-bcfa-be929ef73305/volumes" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.855161 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b" path="/var/lib/kubelet/pods/a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b/volumes" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.861012 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-szprt"] Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.861374 4747 scope.go:117] "RemoveContainer" containerID="f3efdbfa7d1666b063e1a1a8328f4551a85604797b4003672be140eeda4f1e7c" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.865605 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-szprt"] Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.868237 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8441fe24-a43e-4034-816d-f78a91f89025-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.868320 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnk5x\" (UniqueName: \"kubernetes.io/projected/8441fe24-a43e-4034-816d-f78a91f89025-kube-api-access-mnk5x\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.868412 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8441fe24-a43e-4034-816d-f78a91f89025-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.880840 4747 scope.go:117] "RemoveContainer" containerID="97caefb62daef8167e2b3b8326305449dbca1360d7a2df6d967333eec408aa06" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.880947 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qt2mg"] Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.883445 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qt2mg"] Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.894697 4747 scope.go:117] "RemoveContainer" containerID="7f5f1bf1f8907425d535b7fd83026f65aa3497e366a2ee5cc3447e0d0f62a55c" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.943203 4747 scope.go:117] "RemoveContainer" containerID="a32e10379bf17b5ee06e42fd21d60f03d0b4e6065a1023bb745b75c04da1e97c" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.959875 4747 scope.go:117] "RemoveContainer" containerID="619a15383361c33b9c5b4c688c55aa9d303ae03a7daed889a1704305c1b4c2e8" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.978983 4747 scope.go:117] "RemoveContainer" containerID="5d5af78e50cca733543bb264d72af56a61d3b100a52adcc8da8f4f1ef1515a6b" Dec 05 20:45:39 crc kubenswrapper[4747]: I1205 20:45:39.995350 4747 scope.go:117] "RemoveContainer" containerID="d0b9a9aeb01e3a4ce65b821d2b51290abb1e9624e70ff173b0d7bbb99a49d7af" Dec 05 20:45:40 crc kubenswrapper[4747]: I1205 20:45:40.012026 4747 scope.go:117] "RemoveContainer" containerID="aa0f45c405df6a398978b1b872495a49d66a660df9d5ca96e2e2daad58b37442" Dec 05 20:45:40 crc kubenswrapper[4747]: I1205 20:45:40.031575 4747 scope.go:117] "RemoveContainer" containerID="e57ca75d80c933984a6a5c8cec60e58a98050bc66c3f4441a769bb51ccf9f88e" Dec 05 20:45:40 crc kubenswrapper[4747]: I1205 20:45:40.049835 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zfqc7"] Dec 05 20:45:40 crc kubenswrapper[4747]: I1205 20:45:40.051775 4747 scope.go:117] "RemoveContainer" containerID="570ccc07b8765b8c7c1d6a09c7cc3018a067e1e96a57cc4ab3572f68fa3ce784" Dec 05 20:45:40 crc kubenswrapper[4747]: I1205 20:45:40.054138 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zfqc7"] Dec 05 20:45:41 crc kubenswrapper[4747]: I1205 20:45:41.852011 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3df409-857b-4fcb-afcc-fd27e65990f8" path="/var/lib/kubelet/pods/2c3df409-857b-4fcb-afcc-fd27e65990f8/volumes" Dec 05 20:45:41 crc kubenswrapper[4747]: I1205 20:45:41.855973 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8441fe24-a43e-4034-816d-f78a91f89025" path="/var/lib/kubelet/pods/8441fe24-a43e-4034-816d-f78a91f89025/volumes" Dec 05 20:45:41 crc kubenswrapper[4747]: I1205 20:45:41.857637 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae41eabe-e336-4ef9-9f65-022996a62860" path="/var/lib/kubelet/pods/ae41eabe-e336-4ef9-9f65-022996a62860/volumes" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.816375 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5db964fdbd-4h4x7"] Dec 05 20:45:47 crc kubenswrapper[4747]: E1205 20:45:47.817036 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd788fb-fa63-485f-bcfa-be929ef73305" 
containerName="registry-server" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817057 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd788fb-fa63-485f-bcfa-be929ef73305" containerName="registry-server" Dec 05 20:45:47 crc kubenswrapper[4747]: E1205 20:45:47.817079 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b" containerName="extract-content" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817092 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b" containerName="extract-content" Dec 05 20:45:47 crc kubenswrapper[4747]: E1205 20:45:47.817114 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3df409-857b-4fcb-afcc-fd27e65990f8" containerName="extract-utilities" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817127 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3df409-857b-4fcb-afcc-fd27e65990f8" containerName="extract-utilities" Dec 05 20:45:47 crc kubenswrapper[4747]: E1205 20:45:47.817144 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8441fe24-a43e-4034-816d-f78a91f89025" containerName="extract-utilities" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817156 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8441fe24-a43e-4034-816d-f78a91f89025" containerName="extract-utilities" Dec 05 20:45:47 crc kubenswrapper[4747]: E1205 20:45:47.817176 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3df409-857b-4fcb-afcc-fd27e65990f8" containerName="registry-server" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817188 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3df409-857b-4fcb-afcc-fd27e65990f8" containerName="registry-server" Dec 05 20:45:47 crc kubenswrapper[4747]: E1205 20:45:47.817206 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd788fb-fa63-485f-bcfa-be929ef73305" containerName="extract-utilities" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817218 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd788fb-fa63-485f-bcfa-be929ef73305" containerName="extract-utilities" Dec 05 20:45:47 crc kubenswrapper[4747]: E1205 20:45:47.817230 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8441fe24-a43e-4034-816d-f78a91f89025" containerName="extract-content" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817242 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8441fe24-a43e-4034-816d-f78a91f89025" containerName="extract-content" Dec 05 20:45:47 crc kubenswrapper[4747]: E1205 20:45:47.817259 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b006c1-39ff-4055-afbb-17046c6eed26" containerName="pruner" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817271 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b006c1-39ff-4055-afbb-17046c6eed26" containerName="pruner" Dec 05 20:45:47 crc kubenswrapper[4747]: E1205 20:45:47.817284 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3df409-857b-4fcb-afcc-fd27e65990f8" containerName="extract-content" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817298 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3df409-857b-4fcb-afcc-fd27e65990f8" containerName="extract-content" Dec 05 20:45:47 crc kubenswrapper[4747]: E1205 20:45:47.817316 4747 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b" containerName="extract-utilities" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817328 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b" containerName="extract-utilities" Dec 05 20:45:47 crc kubenswrapper[4747]: E1205 20:45:47.817347 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae41eabe-e336-4ef9-9f65-022996a62860" containerName="oauth-openshift" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817360 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae41eabe-e336-4ef9-9f65-022996a62860" containerName="oauth-openshift" Dec 05 20:45:47 crc kubenswrapper[4747]: E1205 20:45:47.817379 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8441fe24-a43e-4034-816d-f78a91f89025" containerName="registry-server" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817391 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8441fe24-a43e-4034-816d-f78a91f89025" containerName="registry-server" Dec 05 20:45:47 crc kubenswrapper[4747]: E1205 20:45:47.817410 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b" containerName="registry-server" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817421 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b" containerName="registry-server" Dec 05 20:45:47 crc kubenswrapper[4747]: E1205 20:45:47.817435 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd788fb-fa63-485f-bcfa-be929ef73305" containerName="extract-content" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817450 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd788fb-fa63-485f-bcfa-be929ef73305" containerName="extract-content" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817689 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3df409-857b-4fcb-afcc-fd27e65990f8" containerName="registry-server" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817706 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae41eabe-e336-4ef9-9f65-022996a62860" containerName="oauth-openshift" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817727 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fd788fb-fa63-485f-bcfa-be929ef73305" containerName="registry-server" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817747 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ae41a7-9df3-4aeb-9ffe-b8bfb3bb413b" containerName="registry-server" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817766 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8441fe24-a43e-4034-816d-f78a91f89025" containerName="registry-server" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.817785 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b006c1-39ff-4055-afbb-17046c6eed26" containerName="pruner" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.818355 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.821946 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.822299 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.826340 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.826845 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.827115 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.827998 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.828278 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.828453 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.828863 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.829091 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.829146 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.830173 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.841267 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.843048 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.851438 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5db964fdbd-4h4x7"] Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.879599 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-session\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.879703 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.879764 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/41331068-7afc-4067-915c-59080b9ea3ad-audit-dir\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.879764 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.879826 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.879881 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.879931 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-user-template-error\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.879963 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/41331068-7afc-4067-915c-59080b9ea3ad-audit-policies\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.879998 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-user-template-login\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.880033 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.880083 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.880162 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.881651 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52qb8\" (UniqueName: \"kubernetes.io/projected/41331068-7afc-4067-915c-59080b9ea3ad-kube-api-access-52qb8\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.881805 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.881896 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.983254 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.983342 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.983389 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-user-template-error\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.983415 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/41331068-7afc-4067-915c-59080b9ea3ad-audit-policies\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.983438 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-user-template-login\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.983459 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-router-certs\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.983479 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.983509 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.983532 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52qb8\" (UniqueName: \"kubernetes.io/projected/41331068-7afc-4067-915c-59080b9ea3ad-kube-api-access-52qb8\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.983557 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.983594 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.983643 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-session\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.983677 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.983707 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/41331068-7afc-4067-915c-59080b9ea3ad-audit-dir\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.983793 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/41331068-7afc-4067-915c-59080b9ea3ad-audit-dir\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.984974 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.985087 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/41331068-7afc-4067-915c-59080b9ea3ad-audit-policies\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.985654 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.985726 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: 
\"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.992359 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.992430 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.993136 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.994443 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-router-certs\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.995179 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-session\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:47 crc kubenswrapper[4747]: I1205 20:45:47.999442 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-user-template-error\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:48 crc kubenswrapper[4747]: I1205 20:45:48.001045 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:48 crc kubenswrapper[4747]: I1205 20:45:48.001263 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/41331068-7afc-4067-915c-59080b9ea3ad-v4-0-config-user-template-login\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: 
\"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:48 crc kubenswrapper[4747]: I1205 20:45:48.013557 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52qb8\" (UniqueName: \"kubernetes.io/projected/41331068-7afc-4067-915c-59080b9ea3ad-kube-api-access-52qb8\") pod \"oauth-openshift-5db964fdbd-4h4x7\" (UID: \"41331068-7afc-4067-915c-59080b9ea3ad\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:48 crc kubenswrapper[4747]: I1205 20:45:48.148926 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:48 crc kubenswrapper[4747]: I1205 20:45:48.587542 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5db964fdbd-4h4x7"] Dec 05 20:45:48 crc kubenswrapper[4747]: W1205 20:45:48.597049 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41331068_7afc_4067_915c_59080b9ea3ad.slice/crio-a33d767fd1b73945a60d5c3812dd62b0dc8c0993ab34545a3f6f8579fd19b47f WatchSource:0}: Error finding container a33d767fd1b73945a60d5c3812dd62b0dc8c0993ab34545a3f6f8579fd19b47f: Status 404 returned error can't find the container with id a33d767fd1b73945a60d5c3812dd62b0dc8c0993ab34545a3f6f8579fd19b47f Dec 05 20:45:48 crc kubenswrapper[4747]: I1205 20:45:48.811666 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" event={"ID":"41331068-7afc-4067-915c-59080b9ea3ad","Type":"ContainerStarted","Data":"a33d767fd1b73945a60d5c3812dd62b0dc8c0993ab34545a3f6f8579fd19b47f"} Dec 05 20:45:49 crc kubenswrapper[4747]: I1205 20:45:49.819247 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" event={"ID":"41331068-7afc-4067-915c-59080b9ea3ad","Type":"ContainerStarted","Data":"475ee4aebc540180aba9eb51dccc6b20cd5500d7e68f11095c4616a27339e7ab"} Dec 05 20:45:49 crc kubenswrapper[4747]: I1205 20:45:49.819705 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:49 crc kubenswrapper[4747]: I1205 20:45:49.833268 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" Dec 05 20:45:49 crc kubenswrapper[4747]: I1205 20:45:49.850636 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5db964fdbd-4h4x7" podStartSLOduration=37.850612171 podStartE2EDuration="37.850612171s" podCreationTimestamp="2025-12-05 20:45:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:45:49.848752001 +0000 UTC m=+220.316059539" watchObservedRunningTime="2025-12-05 20:45:49.850612171 +0000 UTC m=+220.317919669" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.187423 4747 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.189660 4747 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.189846 4747 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.190257 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220" gracePeriod=15 Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.190413 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a" gracePeriod=15 Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.190388 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9" gracePeriod=15 Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.190462 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f" gracePeriod=15 Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.190746 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f" gracePeriod=15 Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.193570 4747 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 20:46:11 crc kubenswrapper[4747]: E1205 20:46:11.194049 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.194080 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 20:46:11 crc kubenswrapper[4747]: E1205 20:46:11.194108 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.194125 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:46:11 crc kubenswrapper[4747]: E1205 20:46:11.194153 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.194170 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 20:46:11 crc kubenswrapper[4747]: E1205 20:46:11.194194 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.194210 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 20:46:11 crc kubenswrapper[4747]: E1205 20:46:11.194232 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.194250 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 20:46:11 crc kubenswrapper[4747]: E1205 20:46:11.194292 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.194310 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:46:11 crc kubenswrapper[4747]: E1205 20:46:11.194337 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.194356 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.194641 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.194679 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.194706 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.194726 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.194746 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.194762 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.234434 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.245907 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.246015 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.246074 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.246101 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.246127 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.246151 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.246179 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.246200 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347275 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347377 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347414 4747 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347439 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347464 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347486 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347509 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347538 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347641 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347684 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347717 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347745 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347773 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347800 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347857 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.347888 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.536450 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:46:11 crc kubenswrapper[4747]: E1205 20:46:11.567131 4747 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e6ca7b736bbd9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 20:46:11.564968921 +0000 UTC m=+242.032276419,LastTimestamp:2025-12-05 20:46:11.564968921 +0000 UTC m=+242.032276419,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.952335 4747 generic.go:334] "Generic (PLEG): container finished" podID="44967e00-5c0f-4467-844e-d7b885ce2c98" containerID="3acf9f6133cfee9471c6f6bbb23b7418cd229d8a6f4f5e84ccf66060c8d7a803" exitCode=0 Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.952453 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"44967e00-5c0f-4467-844e-d7b885ce2c98","Type":"ContainerDied","Data":"3acf9f6133cfee9471c6f6bbb23b7418cd229d8a6f4f5e84ccf66060c8d7a803"} Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.953358 4747 status_manager.go:851] "Failed to get status for pod" podUID="44967e00-5c0f-4467-844e-d7b885ce2c98" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.953716 4747 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.955074 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.956641 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.957466 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f" exitCode=0 Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.957482 4747 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a" exitCode=0 Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.957491 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9" exitCode=0 Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.957500 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f" exitCode=2 Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.957542 4747 scope.go:117] "RemoveContainer" containerID="303f6cb96518e0f0d25a217d3cf3d24997abd62f28a7f95bf15678f1da380a36" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.959883 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f3bd5567d323b08de7ac034fd3d50f2032f4a4934b01235a2b915dfb376d2e34"} Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.959910 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1cb67ba7e254cb98cf0476657c164c1d9ad550e495f9836980c85214121f6f53"} Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.960813 4747 status_manager.go:851] "Failed to get status for pod" podUID="44967e00-5c0f-4467-844e-d7b885ce2c98" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:11 crc kubenswrapper[4747]: I1205 20:46:11.961270 4747 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:12 crc kubenswrapper[4747]: I1205 20:46:12.970014 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.236323 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.237202 4747 status_manager.go:851] "Failed to get status for pod" podUID="44967e00-5c0f-4467-844e-d7b885ce2c98" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.237572 4747 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.269293 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44967e00-5c0f-4467-844e-d7b885ce2c98-kubelet-dir\") pod \"44967e00-5c0f-4467-844e-d7b885ce2c98\" (UID: \"44967e00-5c0f-4467-844e-d7b885ce2c98\") " Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.269345 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44967e00-5c0f-4467-844e-d7b885ce2c98-kube-api-access\") pod \"44967e00-5c0f-4467-844e-d7b885ce2c98\" (UID: \"44967e00-5c0f-4467-844e-d7b885ce2c98\") " Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.269395 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/44967e00-5c0f-4467-844e-d7b885ce2c98-var-lock\") pod \"44967e00-5c0f-4467-844e-d7b885ce2c98\" (UID: \"44967e00-5c0f-4467-844e-d7b885ce2c98\") " Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.269669 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44967e00-5c0f-4467-844e-d7b885ce2c98-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "44967e00-5c0f-4467-844e-d7b885ce2c98" (UID: "44967e00-5c0f-4467-844e-d7b885ce2c98"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.269687 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44967e00-5c0f-4467-844e-d7b885ce2c98-var-lock" (OuterVolumeSpecName: "var-lock") pod "44967e00-5c0f-4467-844e-d7b885ce2c98" (UID: "44967e00-5c0f-4467-844e-d7b885ce2c98"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.297423 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44967e00-5c0f-4467-844e-d7b885ce2c98-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "44967e00-5c0f-4467-844e-d7b885ce2c98" (UID: "44967e00-5c0f-4467-844e-d7b885ce2c98"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.371078 4747 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/44967e00-5c0f-4467-844e-d7b885ce2c98-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.371359 4747 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44967e00-5c0f-4467-844e-d7b885ce2c98-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.371451 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44967e00-5c0f-4467-844e-d7b885ce2c98-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.585213 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.586064 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.587023 4747 status_manager.go:851] "Failed to get status for pod" podUID="44967e00-5c0f-4467-844e-d7b885ce2c98" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.587529 4747 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.588246 4747 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.673898 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.674046 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.674118 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.674294 4747 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.674340 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.674438 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.674493 4747 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.777108 4747 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.777159 4747 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.851791 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.983429 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.983408 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"44967e00-5c0f-4467-844e-d7b885ce2c98","Type":"ContainerDied","Data":"ded6922d6213d7302478f4c02a8fb7bdf34d64b612f4ff36640379b3e0e5eb76"} Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.984784 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ded6922d6213d7302478f4c02a8fb7bdf34d64b612f4ff36640379b3e0e5eb76" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.989143 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.990175 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220" exitCode=0 Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.990251 4747 scope.go:117] "RemoveContainer" containerID="d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.990460 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.990915 4747 status_manager.go:851] "Failed to get status for pod" podUID="44967e00-5c0f-4467-844e-d7b885ce2c98" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.991888 4747 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.992625 4747 status_manager.go:851] "Failed to get status for pod" podUID="44967e00-5c0f-4467-844e-d7b885ce2c98" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.993209 4747 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.993968 4747 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.994571 4747 status_manager.go:851] "Failed to 
get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.995062 4747 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:13 crc kubenswrapper[4747]: I1205 20:46:13.995548 4747 status_manager.go:851] "Failed to get status for pod" podUID="44967e00-5c0f-4467-844e-d7b885ce2c98" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.013053 4747 scope.go:117] "RemoveContainer" containerID="e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.034151 4747 scope.go:117] "RemoveContainer" containerID="93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.047831 4747 scope.go:117] "RemoveContainer" containerID="4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.065186 4747 scope.go:117] "RemoveContainer" containerID="7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.084773 4747 scope.go:117] "RemoveContainer" containerID="36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.104472 4747 scope.go:117] "RemoveContainer" containerID="d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f" Dec 05 20:46:14 crc kubenswrapper[4747]: E1205 20:46:14.105432 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\": container with ID starting with d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f not found: ID does not exist" containerID="d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.105475 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f"} err="failed to get container status \"d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\": rpc error: code = NotFound desc = could not find container \"d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f\": container with ID starting with d543bd73d3cd4e3c90822a8542442c5122e58395bdc075e3a893917612ebd15f not found: ID does not exist" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.105506 4747 scope.go:117] "RemoveContainer" containerID="e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a" Dec 05 20:46:14 crc kubenswrapper[4747]: E1205 20:46:14.105935 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\": container with ID starting with e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a not found: ID does not exist" containerID="e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.105965 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a"} err="failed to get container status \"e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\": rpc error: code = NotFound desc = could not find container \"e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a\": container with ID starting with e2c7db3cc34bb2c7bb1c1b510245333afa6360a2cb8fc8cf1e8ecacad5fadf4a not found: ID does not exist" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.105983 4747 scope.go:117] "RemoveContainer" containerID="93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9" Dec 05 20:46:14 crc kubenswrapper[4747]: E1205 20:46:14.106282 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\": container with ID starting with 93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9 not found: ID does not exist" containerID="93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.106316 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9"} err="failed to get container status \"93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\": rpc error: code = NotFound desc = could not find container \"93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9\": container with ID starting with 93ba334060a02dc31cfd6730fd9a62e05f246fe0d71da7370d8261244a5f78a9 not found: ID does not exist" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.106335 4747 scope.go:117] "RemoveContainer" containerID="4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f" Dec 05 20:46:14 crc kubenswrapper[4747]: E1205 20:46:14.106782 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\": container with ID starting with 4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f not found: ID does not exist" containerID="4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.106893 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f"} err="failed to get container status \"4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\": rpc error: code = NotFound desc = could not find container \"4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f\": container with ID starting with 4bbb73024aad13e19ba671d4d62f4a85c3a2ece9ae3a716ed5b7319678f2c56f not found: ID does not exist" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.106979 4747 scope.go:117] "RemoveContainer" containerID="7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220" 
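[Editor's note] The RemoveContainer calls above, each followed by "ContainerStatus from runtime service failed ... NotFound", are a benign race: by the time the kubelet re-queries CRI-O for a container it has just removed, the runtime has already forgotten the ID, and the gRPC call returns codes.NotFound. Cleanup code stays idempotent by treating NotFound as "already deleted". A minimal sketch of that pattern, assuming an illustrative Runtime interface and fake runtime (not the kubelet's actual types):

package main

import (
	"context"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// Runtime abstracts the one CRI call used below (illustrative interface,
// not the real CRI client; the kubelet sees the same NotFound when it
// re-queries ContainerStatus after a successful delete).
type Runtime interface {
	RemoveContainer(ctx context.Context, id string) error
}

// removeIfPresent deletes a container but treats NotFound as success,
// so repeated or racing cleanups do not surface spurious errors.
func removeIfPresent(ctx context.Context, rt Runtime, id string) error {
	err := rt.RemoveContainer(ctx, id)
	if status.Code(err) == codes.NotFound {
		// Already gone: the same outcome the log shows for each
		// containerID on the second RemoveContainer pass.
		return nil
	}
	return err
}

// fakeRuntime always reports NotFound, mimicking the log's error text.
type fakeRuntime struct{}

func (fakeRuntime) RemoveContainer(ctx context.Context, id string) error {
	return status.Error(codes.NotFound, "could not find container "+id)
}

func main() {
	// Shortened container ID from the log, for illustration only.
	err := removeIfPresent(context.TODO(), fakeRuntime{}, "d543bd73d3cd")
	fmt.Println("cleanup error:", err) // prints "cleanup error: <nil>"
}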
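[Editor's note] Interleaved with the cleanup, the node-lease controller cannot reach https://api-int.crc.testing:6443 while the kube-apiserver static pod is being replaced, and the "Failed to ensure lease exists, will retry" entries that follow show the retry interval doubling: 200ms, 400ms, 800ms, 1.6s, 3.2s, 6.4s. A minimal sketch of that doubling backoff; the 7s cap is an assumption, since the capture recovers before any cap becomes observable:

package main

import (
	"fmt"
	"time"
)

// nextInterval doubles the retry delay up to a limit, matching the
// 200ms -> 400ms -> 800ms -> 1.6s -> 3.2s -> 6.4s progression in the
// "Failed to ensure lease exists, will retry" entries below.
func nextInterval(cur, limit time.Duration) time.Duration {
	next := cur * 2
	if next > limit {
		return limit
	}
	return next
}

func main() {
	interval := 200 * time.Millisecond // first logged interval
	for i := 0; i < 6; i++ {
		fmt.Printf("retry in %v\n", interval)
		interval = nextInterval(interval, 7*time.Second) // assumed cap
	}
}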
Dec 05 20:46:14 crc kubenswrapper[4747]: E1205 20:46:14.107520 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\": container with ID starting with 7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220 not found: ID does not exist" containerID="7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.107556 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220"} err="failed to get container status \"7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\": rpc error: code = NotFound desc = could not find container \"7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220\": container with ID starting with 7b9798c7a22872a5039c70bfcc81fc93a2e8d54723bea5b631d278640e554220 not found: ID does not exist" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.107591 4747 scope.go:117] "RemoveContainer" containerID="36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56" Dec 05 20:46:14 crc kubenswrapper[4747]: E1205 20:46:14.107892 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\": container with ID starting with 36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56 not found: ID does not exist" containerID="36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56" Dec 05 20:46:14 crc kubenswrapper[4747]: I1205 20:46:14.107916 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56"} err="failed to get container status \"36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\": rpc error: code = NotFound desc = could not find container \"36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56\": container with ID starting with 36891c77508176637034203b0b45e642a419e05690bdcd783518e8ee64a8bf56 not found: ID does not exist" Dec 05 20:46:15 crc kubenswrapper[4747]: E1205 20:46:15.250488 4747 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:15 crc kubenswrapper[4747]: E1205 20:46:15.252441 4747 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:15 crc kubenswrapper[4747]: E1205 20:46:15.253209 4747 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:15 crc kubenswrapper[4747]: E1205 20:46:15.253632 4747 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:15 crc kubenswrapper[4747]: E1205 20:46:15.254106 4747 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:15 crc kubenswrapper[4747]: I1205 20:46:15.254161 4747 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 05 20:46:15 crc kubenswrapper[4747]: E1205 20:46:15.254451 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="200ms" Dec 05 20:46:15 crc kubenswrapper[4747]: E1205 20:46:15.455473 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="400ms" Dec 05 20:46:15 crc kubenswrapper[4747]: E1205 20:46:15.856536 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="800ms" Dec 05 20:46:16 crc kubenswrapper[4747]: E1205 20:46:16.658233 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="1.6s" Dec 05 20:46:17 crc kubenswrapper[4747]: E1205 20:46:17.676421 4747 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e6ca7b736bbd9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 20:46:11.564968921 +0000 UTC m=+242.032276419,LastTimestamp:2025-12-05 20:46:11.564968921 +0000 UTC m=+242.032276419,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 20:46:18 crc kubenswrapper[4747]: E1205 20:46:18.259992 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="3.2s" Dec 05 20:46:19 crc kubenswrapper[4747]: I1205 20:46:19.842663 4747 status_manager.go:851] "Failed to get status for pod" podUID="44967e00-5c0f-4467-844e-d7b885ce2c98" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:19 crc kubenswrapper[4747]: I1205 20:46:19.843432 4747 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:21 crc kubenswrapper[4747]: E1205 20:46:21.460741 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="6.4s" Dec 05 20:46:24 crc kubenswrapper[4747]: I1205 20:46:24.839850 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:24 crc kubenswrapper[4747]: I1205 20:46:24.841303 4747 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:24 crc kubenswrapper[4747]: I1205 20:46:24.841912 4747 status_manager.go:851] "Failed to get status for pod" podUID="44967e00-5c0f-4467-844e-d7b885ce2c98" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:24 crc kubenswrapper[4747]: I1205 20:46:24.856013 4747 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaa31191-71db-4021-96c8-0080ef901122" Dec 05 20:46:24 crc kubenswrapper[4747]: I1205 20:46:24.856046 4747 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaa31191-71db-4021-96c8-0080ef901122" Dec 05 20:46:24 crc kubenswrapper[4747]: E1205 20:46:24.856493 4747 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:24 crc kubenswrapper[4747]: I1205 20:46:24.856947 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:25 crc kubenswrapper[4747]: I1205 20:46:25.088273 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 20:46:25 crc kubenswrapper[4747]: I1205 20:46:25.088353 4747 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54" exitCode=1 Dec 05 20:46:25 crc kubenswrapper[4747]: I1205 20:46:25.088442 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54"} Dec 05 20:46:25 crc kubenswrapper[4747]: I1205 20:46:25.089358 4747 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:25 crc kubenswrapper[4747]: I1205 20:46:25.089523 4747 scope.go:117] "RemoveContainer" containerID="c2a592fd1387e17e84658abf3e7af592fff735413c3dbd2f33d62357cb045f54" Dec 05 20:46:25 crc kubenswrapper[4747]: I1205 20:46:25.089820 4747 status_manager.go:851] "Failed to get status for pod" podUID="44967e00-5c0f-4467-844e-d7b885ce2c98" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:25 crc kubenswrapper[4747]: I1205 20:46:25.089912 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b2efaca2287867e90e98fcd03a8844130d8c81d4af8bb754d1ae3c286699c988"} Dec 05 20:46:25 crc kubenswrapper[4747]: I1205 20:46:25.090370 4747 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:26 crc kubenswrapper[4747]: I1205 20:46:26.099743 4747 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="663e4aaf2a2f400c22edbb54242b4443f9c8acb6eb80a9c8e1a2001e7d00c80a" exitCode=0 Dec 05 20:46:26 crc kubenswrapper[4747]: I1205 20:46:26.099832 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"663e4aaf2a2f400c22edbb54242b4443f9c8acb6eb80a9c8e1a2001e7d00c80a"} Dec 05 20:46:26 crc kubenswrapper[4747]: I1205 20:46:26.100240 4747 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaa31191-71db-4021-96c8-0080ef901122" Dec 05 20:46:26 crc kubenswrapper[4747]: I1205 20:46:26.100280 4747 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaa31191-71db-4021-96c8-0080ef901122" Dec 05 20:46:26 crc kubenswrapper[4747]: I1205 20:46:26.100695 4747 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:26 crc kubenswrapper[4747]: E1205 20:46:26.100908 4747 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:26 crc kubenswrapper[4747]: I1205 20:46:26.100978 4747 status_manager.go:851] "Failed to get status for pod" podUID="44967e00-5c0f-4467-844e-d7b885ce2c98" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:26 crc kubenswrapper[4747]: I1205 20:46:26.101230 4747 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:26 crc kubenswrapper[4747]: I1205 20:46:26.103947 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 20:46:26 crc kubenswrapper[4747]: I1205 20:46:26.104026 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f3f38a87351ad7b737f01c6abfe06fd1a4cc88ca7729ff104e15e82f0ad54e28"} Dec 05 20:46:26 crc kubenswrapper[4747]: I1205 20:46:26.104756 4747 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:26 crc kubenswrapper[4747]: I1205 20:46:26.105154 4747 status_manager.go:851] "Failed to get status for pod" podUID="44967e00-5c0f-4467-844e-d7b885ce2c98" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:26 crc kubenswrapper[4747]: I1205 20:46:26.105642 4747 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Dec 05 20:46:27 crc kubenswrapper[4747]: I1205 20:46:27.115064 4747 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b7c3a7641e01878129414f47f219611daf5857ab32156ad6d58fbe6ea82c248f"} Dec 05 20:46:27 crc kubenswrapper[4747]: I1205 20:46:27.115128 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f82d596c5788086b3b62a84f603275fb3d51f9e60d4b8670fb776e1e56dbe0a2"} Dec 05 20:46:27 crc kubenswrapper[4747]: I1205 20:46:27.115139 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"91568493e2ea605b1b69dc313bf1ab3d9b39dc828f7e46848dc4818f97d3c30c"} Dec 05 20:46:27 crc kubenswrapper[4747]: I1205 20:46:27.115148 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2bd7f7f0cc9081d17d63598762af86010d974fef0c891316b78209d89949b7f8"} Dec 05 20:46:28 crc kubenswrapper[4747]: I1205 20:46:28.125775 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5132555d04958b610585eaeab02fcce3ccc83941992ed14391a968b81ffa305c"} Dec 05 20:46:28 crc kubenswrapper[4747]: I1205 20:46:28.126202 4747 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaa31191-71db-4021-96c8-0080ef901122" Dec 05 20:46:28 crc kubenswrapper[4747]: I1205 20:46:28.126236 4747 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaa31191-71db-4021-96c8-0080ef901122" Dec 05 20:46:28 crc kubenswrapper[4747]: I1205 20:46:28.126277 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:29 crc kubenswrapper[4747]: I1205 20:46:29.857627 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:29 crc kubenswrapper[4747]: I1205 20:46:29.857895 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:29 crc kubenswrapper[4747]: I1205 20:46:29.863877 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:31 crc kubenswrapper[4747]: I1205 20:46:31.420460 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:46:31 crc kubenswrapper[4747]: I1205 20:46:31.427059 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:46:32 crc kubenswrapper[4747]: I1205 20:46:32.152270 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:46:33 crc kubenswrapper[4747]: I1205 20:46:33.135418 4747 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:33 crc kubenswrapper[4747]: I1205 20:46:33.161099 4747 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaa31191-71db-4021-96c8-0080ef901122" Dec 05 20:46:33 crc kubenswrapper[4747]: I1205 20:46:33.161127 4747 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaa31191-71db-4021-96c8-0080ef901122" Dec 05 20:46:33 crc kubenswrapper[4747]: I1205 20:46:33.166087 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:33 crc kubenswrapper[4747]: I1205 20:46:33.224659 4747 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7185c44d-dcf5-42c8-95a6-8fe9557c2fbe" Dec 05 20:46:34 crc kubenswrapper[4747]: I1205 20:46:34.166127 4747 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaa31191-71db-4021-96c8-0080ef901122" Dec 05 20:46:34 crc kubenswrapper[4747]: I1205 20:46:34.167332 4747 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaa31191-71db-4021-96c8-0080ef901122" Dec 05 20:46:34 crc kubenswrapper[4747]: I1205 20:46:34.168230 4747 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7185c44d-dcf5-42c8-95a6-8fe9557c2fbe" Dec 05 20:46:42 crc kubenswrapper[4747]: I1205 20:46:42.678553 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:46:43 crc kubenswrapper[4747]: I1205 20:46:43.208277 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 20:46:43 crc kubenswrapper[4747]: I1205 20:46:43.295919 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 20:46:43 crc kubenswrapper[4747]: I1205 20:46:43.378179 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 20:46:43 crc kubenswrapper[4747]: I1205 20:46:43.783872 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 20:46:44 crc kubenswrapper[4747]: I1205 20:46:44.114625 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 20:46:44 crc kubenswrapper[4747]: I1205 20:46:44.117465 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 20:46:44 crc kubenswrapper[4747]: I1205 20:46:44.228823 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 20:46:44 crc kubenswrapper[4747]: I1205 20:46:44.249802 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 20:46:44 crc kubenswrapper[4747]: I1205 20:46:44.466662 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 20:46:44 crc kubenswrapper[4747]: I1205 20:46:44.502689 4747 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 20:46:44 crc kubenswrapper[4747]: I1205 20:46:44.561053 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 20:46:44 crc kubenswrapper[4747]: I1205 20:46:44.909552 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 20:46:44 crc kubenswrapper[4747]: I1205 20:46:44.928191 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 20:46:44 crc kubenswrapper[4747]: I1205 20:46:44.937885 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 20:46:45 crc kubenswrapper[4747]: I1205 20:46:45.055704 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 20:46:45 crc kubenswrapper[4747]: I1205 20:46:45.057524 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 20:46:45 crc kubenswrapper[4747]: I1205 20:46:45.144488 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 20:46:45 crc kubenswrapper[4747]: I1205 20:46:45.294387 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 20:46:45 crc kubenswrapper[4747]: I1205 20:46:45.394082 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 20:46:45 crc kubenswrapper[4747]: I1205 20:46:45.517725 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 20:46:45 crc kubenswrapper[4747]: I1205 20:46:45.584786 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 20:46:45 crc kubenswrapper[4747]: I1205 20:46:45.777906 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 20:46:45 crc kubenswrapper[4747]: I1205 20:46:45.868029 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 20:46:45 crc kubenswrapper[4747]: I1205 20:46:45.894960 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 20:46:45 crc kubenswrapper[4747]: I1205 20:46:45.949599 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 20:46:45 crc kubenswrapper[4747]: I1205 20:46:45.977306 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.099795 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.228973 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 20:46:46 crc kubenswrapper[4747]: 
I1205 20:46:46.268887 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.297286 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.306496 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.363467 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.466169 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.485463 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.513272 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.557679 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.580356 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.582755 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.599188 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.613448 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.647488 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.661032 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.705355 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.715133 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.752380 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.909496 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 20:46:46 crc kubenswrapper[4747]: I1205 20:46:46.932449 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 20:46:47 crc 
kubenswrapper[4747]: I1205 20:46:47.069400 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.167108 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.205754 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.295349 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.304877 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.439357 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.504052 4747 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.511315 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=36.511291622 podStartE2EDuration="36.511291622s" podCreationTimestamp="2025-12-05 20:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:46:33.191757943 +0000 UTC m=+263.659065431" watchObservedRunningTime="2025-12-05 20:46:47.511291622 +0000 UTC m=+277.978599150" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.512909 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.513025 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.516739 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.540860 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.557281 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.558500 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.558480584 podStartE2EDuration="14.558480584s" podCreationTimestamp="2025-12-05 20:46:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:46:47.537002059 +0000 UTC m=+278.004309587" watchObservedRunningTime="2025-12-05 20:46:47.558480584 +0000 UTC m=+278.025788072" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.573571 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.583872 4747 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.693262 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.743162 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.811348 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.830461 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.894270 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 20:46:47 crc kubenswrapper[4747]: I1205 20:46:47.926290 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.085068 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.097011 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.104898 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.112902 4747 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.276974 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.334840 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.391727 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.473718 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.502003 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.606781 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.617618 4747 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.650656 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.707786 4747 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.733931 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.855856 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.875442 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.885359 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.942852 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.948687 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 20:46:48 crc kubenswrapper[4747]: I1205 20:46:48.989258 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.100078 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.145824 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.235722 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.249495 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.264658 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.294838 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.406834 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.439022 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.462413 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.510861 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.615892 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.651255 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" 
Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.826043 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.911094 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.978105 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 20:46:49 crc kubenswrapper[4747]: I1205 20:46:49.993809 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.023206 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.154235 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.248090 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.265736 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.449795 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.465265 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.712091 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.761439 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.793758 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.805610 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.814735 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.817792 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.827937 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.846470 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.903871 4747 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 20:46:50 crc kubenswrapper[4747]: I1205 20:46:50.963708 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.023611 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.064354 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.107040 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.151939 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.262368 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.276832 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.280285 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.357408 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.383056 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.391996 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.516488 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.534061 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.631970 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.644391 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.662878 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.738436 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.754967 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.820139 
4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.876681 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.923262 4747 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 20:46:51 crc kubenswrapper[4747]: I1205 20:46:51.995239 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.002232 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.069037 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.151759 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.191979 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.216617 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.247720 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.303361 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.331074 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.370977 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.399972 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.418299 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.468645 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.530477 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.531493 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.889571 4747 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.907535 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.930145 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.966416 4747 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 20:46:52 crc kubenswrapper[4747]: I1205 20:46:52.969866 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.033911 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.063606 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.095023 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.106650 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.111550 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.127263 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.210018 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.275329 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.401174 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.438773 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.475947 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.505902 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.545862 4747 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.547750 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.602404 4747 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.660569 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.698996 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.807728 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.824108 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.879329 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.935358 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 20:46:53 crc kubenswrapper[4747]: I1205 20:46:53.946251 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.040379 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.059389 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.126035 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.126487 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.148839 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.173952 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.179904 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.234308 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.236461 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.261978 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8hg9x"] Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.262234 4747 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-8hg9x" podUID="5efe7146-f4a8-42c3-84d9-b974a2618203" containerName="registry-server" containerID="cri-o://3cd46fe1ecb18ec2ded7b8ee6febfe1968fb9efac652d2c22215b1c95c3561c9" gracePeriod=30 Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.268529 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g6p67"] Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.268971 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g6p67" podUID="6dbcd101-ff16-4970-8433-37b2576e551b" containerName="registry-server" containerID="cri-o://45286baeaa7e47334adb49042feacd4d1e3c32b9b6280b6c9fb30314b8a80ea9" gracePeriod=30 Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.290172 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svs8d"] Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.290433 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" podUID="c7a9c2fc-a6db-4def-9938-f1da651566c8" containerName="marketplace-operator" containerID="cri-o://5be40beabbf1a9fec4a9cf0e2a7e3daf3bdea4800a7ade444c8e2db3997135a9" gracePeriod=30 Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.293977 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q278l"] Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.294333 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q278l" podUID="2fca7ed3-fb68-43bd-8186-7d73d673098b" containerName="registry-server" containerID="cri-o://ead1e49f8085bb1c5fca6294600f16a84526de5c0e40ab1c440a788fa7f1bd95" gracePeriod=30 Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.296555 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g28vx"] Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.296808 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g28vx" podUID="8a99c798-6879-43bb-817c-621364a56b5a" containerName="registry-server" containerID="cri-o://942667f8c50bc07db797a2535584fa180d6fc5154f7deb9dc692b902ce902834" gracePeriod=30 Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.310256 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m4nzv"] Dec 05 20:46:54 crc kubenswrapper[4747]: E1205 20:46:54.310775 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44967e00-5c0f-4467-844e-d7b885ce2c98" containerName="installer" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.310884 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="44967e00-5c0f-4467-844e-d7b885ce2c98" containerName="installer" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.311112 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="44967e00-5c0f-4467-844e-d7b885ce2c98" containerName="installer" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.317241 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m4nzv"] Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.317630 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.332216 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.348571 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79pvw\" (UniqueName: \"kubernetes.io/projected/6ce1891f-9919-4ec9-bf2f-9662d075b240-kube-api-access-79pvw\") pod \"marketplace-operator-79b997595-m4nzv\" (UID: \"6ce1891f-9919-4ec9-bf2f-9662d075b240\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.348672 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ce1891f-9919-4ec9-bf2f-9662d075b240-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m4nzv\" (UID: \"6ce1891f-9919-4ec9-bf2f-9662d075b240\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.348709 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6ce1891f-9919-4ec9-bf2f-9662d075b240-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m4nzv\" (UID: \"6ce1891f-9919-4ec9-bf2f-9662d075b240\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.390789 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.425638 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.450993 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ce1891f-9919-4ec9-bf2f-9662d075b240-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m4nzv\" (UID: \"6ce1891f-9919-4ec9-bf2f-9662d075b240\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.451062 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6ce1891f-9919-4ec9-bf2f-9662d075b240-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m4nzv\" (UID: \"6ce1891f-9919-4ec9-bf2f-9662d075b240\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.451116 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79pvw\" (UniqueName: \"kubernetes.io/projected/6ce1891f-9919-4ec9-bf2f-9662d075b240-kube-api-access-79pvw\") pod \"marketplace-operator-79b997595-m4nzv\" (UID: \"6ce1891f-9919-4ec9-bf2f-9662d075b240\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.452783 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6ce1891f-9919-4ec9-bf2f-9662d075b240-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m4nzv\" (UID: \"6ce1891f-9919-4ec9-bf2f-9662d075b240\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.459058 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6ce1891f-9919-4ec9-bf2f-9662d075b240-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m4nzv\" (UID: \"6ce1891f-9919-4ec9-bf2f-9662d075b240\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.481558 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79pvw\" (UniqueName: \"kubernetes.io/projected/6ce1891f-9919-4ec9-bf2f-9662d075b240-kube-api-access-79pvw\") pod \"marketplace-operator-79b997595-m4nzv\" (UID: \"6ce1891f-9919-4ec9-bf2f-9662d075b240\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.524448 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.579623 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.653143 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.676000 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.723159 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8hg9x" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.725520 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.731029 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6p67" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.739794 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g28vx" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.754393 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q278l" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.754777 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glj77\" (UniqueName: \"kubernetes.io/projected/8a99c798-6879-43bb-817c-621364a56b5a-kube-api-access-glj77\") pod \"8a99c798-6879-43bb-817c-621364a56b5a\" (UID: \"8a99c798-6879-43bb-817c-621364a56b5a\") " Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.754887 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5efe7146-f4a8-42c3-84d9-b974a2618203-catalog-content\") pod \"5efe7146-f4a8-42c3-84d9-b974a2618203\" (UID: \"5efe7146-f4a8-42c3-84d9-b974a2618203\") " Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.754916 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbdkj\" (UniqueName: \"kubernetes.io/projected/c7a9c2fc-a6db-4def-9938-f1da651566c8-kube-api-access-tbdkj\") pod \"c7a9c2fc-a6db-4def-9938-f1da651566c8\" (UID: \"c7a9c2fc-a6db-4def-9938-f1da651566c8\") " Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.754960 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7a9c2fc-a6db-4def-9938-f1da651566c8-marketplace-operator-metrics\") pod \"c7a9c2fc-a6db-4def-9938-f1da651566c8\" (UID: \"c7a9c2fc-a6db-4def-9938-f1da651566c8\") " Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.754995 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a99c798-6879-43bb-817c-621364a56b5a-utilities\") pod \"8a99c798-6879-43bb-817c-621364a56b5a\" (UID: \"8a99c798-6879-43bb-817c-621364a56b5a\") " Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.755017 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7a9c2fc-a6db-4def-9938-f1da651566c8-marketplace-trusted-ca\") pod \"c7a9c2fc-a6db-4def-9938-f1da651566c8\" (UID: \"c7a9c2fc-a6db-4def-9938-f1da651566c8\") " Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.755046 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5efe7146-f4a8-42c3-84d9-b974a2618203-utilities\") pod \"5efe7146-f4a8-42c3-84d9-b974a2618203\" (UID: \"5efe7146-f4a8-42c3-84d9-b974a2618203\") " Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.755088 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbcd101-ff16-4970-8433-37b2576e551b-utilities\") pod \"6dbcd101-ff16-4970-8433-37b2576e551b\" (UID: \"6dbcd101-ff16-4970-8433-37b2576e551b\") " Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.755119 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7blmr\" (UniqueName: \"kubernetes.io/projected/6dbcd101-ff16-4970-8433-37b2576e551b-kube-api-access-7blmr\") pod \"6dbcd101-ff16-4970-8433-37b2576e551b\" (UID: \"6dbcd101-ff16-4970-8433-37b2576e551b\") " Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.755145 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8a99c798-6879-43bb-817c-621364a56b5a-catalog-content\") pod \"8a99c798-6879-43bb-817c-621364a56b5a\" (UID: \"8a99c798-6879-43bb-817c-621364a56b5a\") " Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.755161 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzbg5\" (UniqueName: \"kubernetes.io/projected/5efe7146-f4a8-42c3-84d9-b974a2618203-kube-api-access-fzbg5\") pod \"5efe7146-f4a8-42c3-84d9-b974a2618203\" (UID: \"5efe7146-f4a8-42c3-84d9-b974a2618203\") " Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.755202 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbcd101-ff16-4970-8433-37b2576e551b-catalog-content\") pod \"6dbcd101-ff16-4970-8433-37b2576e551b\" (UID: \"6dbcd101-ff16-4970-8433-37b2576e551b\") " Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.757048 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5efe7146-f4a8-42c3-84d9-b974a2618203-utilities" (OuterVolumeSpecName: "utilities") pod "5efe7146-f4a8-42c3-84d9-b974a2618203" (UID: "5efe7146-f4a8-42c3-84d9-b974a2618203"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.757709 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7a9c2fc-a6db-4def-9938-f1da651566c8-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c7a9c2fc-a6db-4def-9938-f1da651566c8" (UID: "c7a9c2fc-a6db-4def-9938-f1da651566c8"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.757941 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dbcd101-ff16-4970-8433-37b2576e551b-utilities" (OuterVolumeSpecName: "utilities") pod "6dbcd101-ff16-4970-8433-37b2576e551b" (UID: "6dbcd101-ff16-4970-8433-37b2576e551b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.757963 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a99c798-6879-43bb-817c-621364a56b5a-utilities" (OuterVolumeSpecName: "utilities") pod "8a99c798-6879-43bb-817c-621364a56b5a" (UID: "8a99c798-6879-43bb-817c-621364a56b5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.759763 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7a9c2fc-a6db-4def-9938-f1da651566c8-kube-api-access-tbdkj" (OuterVolumeSpecName: "kube-api-access-tbdkj") pod "c7a9c2fc-a6db-4def-9938-f1da651566c8" (UID: "c7a9c2fc-a6db-4def-9938-f1da651566c8"). InnerVolumeSpecName "kube-api-access-tbdkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.760593 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dbcd101-ff16-4970-8433-37b2576e551b-kube-api-access-7blmr" (OuterVolumeSpecName: "kube-api-access-7blmr") pod "6dbcd101-ff16-4970-8433-37b2576e551b" (UID: "6dbcd101-ff16-4970-8433-37b2576e551b"). InnerVolumeSpecName "kube-api-access-7blmr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.761177 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7a9c2fc-a6db-4def-9938-f1da651566c8-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c7a9c2fc-a6db-4def-9938-f1da651566c8" (UID: "c7a9c2fc-a6db-4def-9938-f1da651566c8"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.771597 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5efe7146-f4a8-42c3-84d9-b974a2618203-kube-api-access-fzbg5" (OuterVolumeSpecName: "kube-api-access-fzbg5") pod "5efe7146-f4a8-42c3-84d9-b974a2618203" (UID: "5efe7146-f4a8-42c3-84d9-b974a2618203"). InnerVolumeSpecName "kube-api-access-fzbg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.794450 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.801030 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a99c798-6879-43bb-817c-621364a56b5a-kube-api-access-glj77" (OuterVolumeSpecName: "kube-api-access-glj77") pod "8a99c798-6879-43bb-817c-621364a56b5a" (UID: "8a99c798-6879-43bb-817c-621364a56b5a"). InnerVolumeSpecName "kube-api-access-glj77". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.829175 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dbcd101-ff16-4970-8433-37b2576e551b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dbcd101-ff16-4970-8433-37b2576e551b" (UID: "6dbcd101-ff16-4970-8433-37b2576e551b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.842779 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5efe7146-f4a8-42c3-84d9-b974a2618203-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5efe7146-f4a8-42c3-84d9-b974a2618203" (UID: "5efe7146-f4a8-42c3-84d9-b974a2618203"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.856018 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fca7ed3-fb68-43bd-8186-7d73d673098b-catalog-content\") pod \"2fca7ed3-fb68-43bd-8186-7d73d673098b\" (UID: \"2fca7ed3-fb68-43bd-8186-7d73d673098b\") " Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.856087 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fca7ed3-fb68-43bd-8186-7d73d673098b-utilities\") pod \"2fca7ed3-fb68-43bd-8186-7d73d673098b\" (UID: \"2fca7ed3-fb68-43bd-8186-7d73d673098b\") " Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.856175 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rssn\" (UniqueName: \"kubernetes.io/projected/2fca7ed3-fb68-43bd-8186-7d73d673098b-kube-api-access-7rssn\") pod \"2fca7ed3-fb68-43bd-8186-7d73d673098b\" (UID: \"2fca7ed3-fb68-43bd-8186-7d73d673098b\") " Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.856488 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a99c798-6879-43bb-817c-621364a56b5a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.856508 4747 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7a9c2fc-a6db-4def-9938-f1da651566c8-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.856518 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5efe7146-f4a8-42c3-84d9-b974a2618203-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.856527 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbcd101-ff16-4970-8433-37b2576e551b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.856539 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7blmr\" (UniqueName: \"kubernetes.io/projected/6dbcd101-ff16-4970-8433-37b2576e551b-kube-api-access-7blmr\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.856547 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzbg5\" (UniqueName: \"kubernetes.io/projected/5efe7146-f4a8-42c3-84d9-b974a2618203-kube-api-access-fzbg5\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.856555 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbcd101-ff16-4970-8433-37b2576e551b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.856564 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glj77\" (UniqueName: \"kubernetes.io/projected/8a99c798-6879-43bb-817c-621364a56b5a-kube-api-access-glj77\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.856572 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5efe7146-f4a8-42c3-84d9-b974a2618203-catalog-content\") on node \"crc\" 
DevicePath \"\"" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.856594 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbdkj\" (UniqueName: \"kubernetes.io/projected/c7a9c2fc-a6db-4def-9938-f1da651566c8-kube-api-access-tbdkj\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.856604 4747 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7a9c2fc-a6db-4def-9938-f1da651566c8-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.861528 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fca7ed3-fb68-43bd-8186-7d73d673098b-utilities" (OuterVolumeSpecName: "utilities") pod "2fca7ed3-fb68-43bd-8186-7d73d673098b" (UID: "2fca7ed3-fb68-43bd-8186-7d73d673098b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.866303 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fca7ed3-fb68-43bd-8186-7d73d673098b-kube-api-access-7rssn" (OuterVolumeSpecName: "kube-api-access-7rssn") pod "2fca7ed3-fb68-43bd-8186-7d73d673098b" (UID: "2fca7ed3-fb68-43bd-8186-7d73d673098b"). InnerVolumeSpecName "kube-api-access-7rssn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.868859 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.886484 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fca7ed3-fb68-43bd-8186-7d73d673098b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fca7ed3-fb68-43bd-8186-7d73d673098b" (UID: "2fca7ed3-fb68-43bd-8186-7d73d673098b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.907027 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m4nzv"] Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.945290 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a99c798-6879-43bb-817c-621364a56b5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a99c798-6879-43bb-817c-621364a56b5a" (UID: "8a99c798-6879-43bb-817c-621364a56b5a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.957763 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fca7ed3-fb68-43bd-8186-7d73d673098b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.957792 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fca7ed3-fb68-43bd-8186-7d73d673098b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.957802 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a99c798-6879-43bb-817c-621364a56b5a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.957811 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rssn\" (UniqueName: \"kubernetes.io/projected/2fca7ed3-fb68-43bd-8186-7d73d673098b-kube-api-access-7rssn\") on node \"crc\" DevicePath \"\"" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.959925 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 20:46:54 crc kubenswrapper[4747]: I1205 20:46:54.989831 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.012563 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.013733 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.079983 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.093567 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.105607 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.294971 4747 generic.go:334] "Generic (PLEG): container finished" podID="c7a9c2fc-a6db-4def-9938-f1da651566c8" containerID="5be40beabbf1a9fec4a9cf0e2a7e3daf3bdea4800a7ade444c8e2db3997135a9" exitCode=0 Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.295029 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.295027 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" event={"ID":"c7a9c2fc-a6db-4def-9938-f1da651566c8","Type":"ContainerDied","Data":"5be40beabbf1a9fec4a9cf0e2a7e3daf3bdea4800a7ade444c8e2db3997135a9"} Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.295691 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-svs8d" event={"ID":"c7a9c2fc-a6db-4def-9938-f1da651566c8","Type":"ContainerDied","Data":"ca29204fd4b307545b2042c0e04a088767b3c22226a8f6a820896be352dc550d"} Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.295730 4747 scope.go:117] "RemoveContainer" containerID="5be40beabbf1a9fec4a9cf0e2a7e3daf3bdea4800a7ade444c8e2db3997135a9" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.296488 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" event={"ID":"6ce1891f-9919-4ec9-bf2f-9662d075b240","Type":"ContainerStarted","Data":"cca128f5ded938629643de7fb2fe62fd5fc2d12dbfaa913847479486919ee2b9"} Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.296530 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" event={"ID":"6ce1891f-9919-4ec9-bf2f-9662d075b240","Type":"ContainerStarted","Data":"fecdf94b0f38c33af8632a221274e7db0cf15b17ee842997710fbe2ccd168e1f"} Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.297595 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.298296 4747 generic.go:334] "Generic (PLEG): container finished" podID="5efe7146-f4a8-42c3-84d9-b974a2618203" containerID="3cd46fe1ecb18ec2ded7b8ee6febfe1968fb9efac652d2c22215b1c95c3561c9" exitCode=0 Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.298352 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8hg9x" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.298364 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hg9x" event={"ID":"5efe7146-f4a8-42c3-84d9-b974a2618203","Type":"ContainerDied","Data":"3cd46fe1ecb18ec2ded7b8ee6febfe1968fb9efac652d2c22215b1c95c3561c9"} Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.298396 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hg9x" event={"ID":"5efe7146-f4a8-42c3-84d9-b974a2618203","Type":"ContainerDied","Data":"587a55990fca71b41a4e94eae5fb85c13acfeafa524c0999f5a090c01586a8da"} Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.300831 4747 generic.go:334] "Generic (PLEG): container finished" podID="8a99c798-6879-43bb-817c-621364a56b5a" containerID="942667f8c50bc07db797a2535584fa180d6fc5154f7deb9dc692b902ce902834" exitCode=0 Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.300912 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g28vx" event={"ID":"8a99c798-6879-43bb-817c-621364a56b5a","Type":"ContainerDied","Data":"942667f8c50bc07db797a2535584fa180d6fc5154f7deb9dc692b902ce902834"} Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.300928 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g28vx" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.300945 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g28vx" event={"ID":"8a99c798-6879-43bb-817c-621364a56b5a","Type":"ContainerDied","Data":"8839d2bfcec0ae27284e273aa74a3e71e35b0f066cca165c63a794d018a6894c"} Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.302335 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.305665 4747 generic.go:334] "Generic (PLEG): container finished" podID="2fca7ed3-fb68-43bd-8186-7d73d673098b" containerID="ead1e49f8085bb1c5fca6294600f16a84526de5c0e40ab1c440a788fa7f1bd95" exitCode=0 Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.305732 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q278l" event={"ID":"2fca7ed3-fb68-43bd-8186-7d73d673098b","Type":"ContainerDied","Data":"ead1e49f8085bb1c5fca6294600f16a84526de5c0e40ab1c440a788fa7f1bd95"} Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.305758 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q278l" event={"ID":"2fca7ed3-fb68-43bd-8186-7d73d673098b","Type":"ContainerDied","Data":"8ea76ffa0e0f9366fed4983dbfdbfab810d3894af2abcffd1a86de7e09892312"} Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.305776 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q278l" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.308386 4747 generic.go:334] "Generic (PLEG): container finished" podID="6dbcd101-ff16-4970-8433-37b2576e551b" containerID="45286baeaa7e47334adb49042feacd4d1e3c32b9b6280b6c9fb30314b8a80ea9" exitCode=0 Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.308417 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6p67" event={"ID":"6dbcd101-ff16-4970-8433-37b2576e551b","Type":"ContainerDied","Data":"45286baeaa7e47334adb49042feacd4d1e3c32b9b6280b6c9fb30314b8a80ea9"} Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.308439 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6p67" event={"ID":"6dbcd101-ff16-4970-8433-37b2576e551b","Type":"ContainerDied","Data":"2d51a3edaea1e4d19cd0208b85d37176ef750c7c0421ef151cc4a1c1d83dc1f7"} Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.308490 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6p67" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.309777 4747 scope.go:117] "RemoveContainer" containerID="5be40beabbf1a9fec4a9cf0e2a7e3daf3bdea4800a7ade444c8e2db3997135a9" Dec 05 20:46:55 crc kubenswrapper[4747]: E1205 20:46:55.312048 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5be40beabbf1a9fec4a9cf0e2a7e3daf3bdea4800a7ade444c8e2db3997135a9\": container with ID starting with 5be40beabbf1a9fec4a9cf0e2a7e3daf3bdea4800a7ade444c8e2db3997135a9 not found: ID does not exist" containerID="5be40beabbf1a9fec4a9cf0e2a7e3daf3bdea4800a7ade444c8e2db3997135a9" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.312086 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5be40beabbf1a9fec4a9cf0e2a7e3daf3bdea4800a7ade444c8e2db3997135a9"} err="failed to get container status \"5be40beabbf1a9fec4a9cf0e2a7e3daf3bdea4800a7ade444c8e2db3997135a9\": rpc error: code = NotFound desc = could not find container \"5be40beabbf1a9fec4a9cf0e2a7e3daf3bdea4800a7ade444c8e2db3997135a9\": container with ID starting with 5be40beabbf1a9fec4a9cf0e2a7e3daf3bdea4800a7ade444c8e2db3997135a9 not found: ID does not exist" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.312113 4747 scope.go:117] "RemoveContainer" containerID="3cd46fe1ecb18ec2ded7b8ee6febfe1968fb9efac652d2c22215b1c95c3561c9" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.327062 4747 scope.go:117] "RemoveContainer" containerID="a1c167233d836b3176eb825f574246e1a819c571d3e9562cbb6f63c82a5a64b8" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.342499 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-m4nzv" podStartSLOduration=1.342480838 podStartE2EDuration="1.342480838s" podCreationTimestamp="2025-12-05 20:46:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:46:55.326105029 +0000 UTC m=+285.793412527" watchObservedRunningTime="2025-12-05 20:46:55.342480838 +0000 UTC m=+285.809788326" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.351366 4747 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.359893 4747 scope.go:117] "RemoveContainer" containerID="807d68d1ef3b54b0fe740ca5a53cc1d51eb388d9219085eb4655a1eb65ae426b" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.376844 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.383958 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svs8d"] Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.391915 4747 scope.go:117] "RemoveContainer" containerID="3cd46fe1ecb18ec2ded7b8ee6febfe1968fb9efac652d2c22215b1c95c3561c9" Dec 05 20:46:55 crc kubenswrapper[4747]: E1205 20:46:55.394248 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cd46fe1ecb18ec2ded7b8ee6febfe1968fb9efac652d2c22215b1c95c3561c9\": container with ID starting with 3cd46fe1ecb18ec2ded7b8ee6febfe1968fb9efac652d2c22215b1c95c3561c9 not found: ID does not exist" containerID="3cd46fe1ecb18ec2ded7b8ee6febfe1968fb9efac652d2c22215b1c95c3561c9" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.394408 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cd46fe1ecb18ec2ded7b8ee6febfe1968fb9efac652d2c22215b1c95c3561c9"} err="failed to get container status \"3cd46fe1ecb18ec2ded7b8ee6febfe1968fb9efac652d2c22215b1c95c3561c9\": rpc error: code = NotFound desc = could not find container \"3cd46fe1ecb18ec2ded7b8ee6febfe1968fb9efac652d2c22215b1c95c3561c9\": container with ID starting with 3cd46fe1ecb18ec2ded7b8ee6febfe1968fb9efac652d2c22215b1c95c3561c9 not found: ID does not exist" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.394549 4747 scope.go:117] "RemoveContainer" containerID="a1c167233d836b3176eb825f574246e1a819c571d3e9562cbb6f63c82a5a64b8" Dec 05 20:46:55 crc kubenswrapper[4747]: E1205 20:46:55.395323 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1c167233d836b3176eb825f574246e1a819c571d3e9562cbb6f63c82a5a64b8\": container with ID starting with a1c167233d836b3176eb825f574246e1a819c571d3e9562cbb6f63c82a5a64b8 not found: ID does not exist" containerID="a1c167233d836b3176eb825f574246e1a819c571d3e9562cbb6f63c82a5a64b8" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.395356 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1c167233d836b3176eb825f574246e1a819c571d3e9562cbb6f63c82a5a64b8"} err="failed to get container status \"a1c167233d836b3176eb825f574246e1a819c571d3e9562cbb6f63c82a5a64b8\": rpc error: code = NotFound desc = could not find container \"a1c167233d836b3176eb825f574246e1a819c571d3e9562cbb6f63c82a5a64b8\": container with ID starting with a1c167233d836b3176eb825f574246e1a819c571d3e9562cbb6f63c82a5a64b8 not found: ID does not exist" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.395375 4747 scope.go:117] "RemoveContainer" containerID="807d68d1ef3b54b0fe740ca5a53cc1d51eb388d9219085eb4655a1eb65ae426b" Dec 05 20:46:55 crc kubenswrapper[4747]: E1205 20:46:55.395651 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"807d68d1ef3b54b0fe740ca5a53cc1d51eb388d9219085eb4655a1eb65ae426b\": container 
with ID starting with 807d68d1ef3b54b0fe740ca5a53cc1d51eb388d9219085eb4655a1eb65ae426b not found: ID does not exist" containerID="807d68d1ef3b54b0fe740ca5a53cc1d51eb388d9219085eb4655a1eb65ae426b" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.395697 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"807d68d1ef3b54b0fe740ca5a53cc1d51eb388d9219085eb4655a1eb65ae426b"} err="failed to get container status \"807d68d1ef3b54b0fe740ca5a53cc1d51eb388d9219085eb4655a1eb65ae426b\": rpc error: code = NotFound desc = could not find container \"807d68d1ef3b54b0fe740ca5a53cc1d51eb388d9219085eb4655a1eb65ae426b\": container with ID starting with 807d68d1ef3b54b0fe740ca5a53cc1d51eb388d9219085eb4655a1eb65ae426b not found: ID does not exist" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.395714 4747 scope.go:117] "RemoveContainer" containerID="942667f8c50bc07db797a2535584fa180d6fc5154f7deb9dc692b902ce902834" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.397392 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svs8d"] Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.405542 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g6p67"] Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.411845 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g6p67"] Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.413598 4747 scope.go:117] "RemoveContainer" containerID="c200ce3495a99f197464134557fb7268689f92818767609da8d57c7493d4ec81" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.419340 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g28vx"] Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.424861 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.427890 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g28vx"] Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.435323 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q278l"] Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.435386 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q278l"] Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.438384 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8hg9x"] Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.441385 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8hg9x"] Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.456842 4747 scope.go:117] "RemoveContainer" containerID="badf3934e97b029b0428fe3f40049e9d790a0000f0245107d393c6a31be2b973" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.469342 4747 scope.go:117] "RemoveContainer" containerID="942667f8c50bc07db797a2535584fa180d6fc5154f7deb9dc692b902ce902834" Dec 05 20:46:55 crc kubenswrapper[4747]: E1205 20:46:55.469929 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"942667f8c50bc07db797a2535584fa180d6fc5154f7deb9dc692b902ce902834\": container with ID starting with 
942667f8c50bc07db797a2535584fa180d6fc5154f7deb9dc692b902ce902834 not found: ID does not exist" containerID="942667f8c50bc07db797a2535584fa180d6fc5154f7deb9dc692b902ce902834" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.469962 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"942667f8c50bc07db797a2535584fa180d6fc5154f7deb9dc692b902ce902834"} err="failed to get container status \"942667f8c50bc07db797a2535584fa180d6fc5154f7deb9dc692b902ce902834\": rpc error: code = NotFound desc = could not find container \"942667f8c50bc07db797a2535584fa180d6fc5154f7deb9dc692b902ce902834\": container with ID starting with 942667f8c50bc07db797a2535584fa180d6fc5154f7deb9dc692b902ce902834 not found: ID does not exist" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.470032 4747 scope.go:117] "RemoveContainer" containerID="c200ce3495a99f197464134557fb7268689f92818767609da8d57c7493d4ec81" Dec 05 20:46:55 crc kubenswrapper[4747]: E1205 20:46:55.470335 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c200ce3495a99f197464134557fb7268689f92818767609da8d57c7493d4ec81\": container with ID starting with c200ce3495a99f197464134557fb7268689f92818767609da8d57c7493d4ec81 not found: ID does not exist" containerID="c200ce3495a99f197464134557fb7268689f92818767609da8d57c7493d4ec81" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.470395 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c200ce3495a99f197464134557fb7268689f92818767609da8d57c7493d4ec81"} err="failed to get container status \"c200ce3495a99f197464134557fb7268689f92818767609da8d57c7493d4ec81\": rpc error: code = NotFound desc = could not find container \"c200ce3495a99f197464134557fb7268689f92818767609da8d57c7493d4ec81\": container with ID starting with c200ce3495a99f197464134557fb7268689f92818767609da8d57c7493d4ec81 not found: ID does not exist" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.470432 4747 scope.go:117] "RemoveContainer" containerID="badf3934e97b029b0428fe3f40049e9d790a0000f0245107d393c6a31be2b973" Dec 05 20:46:55 crc kubenswrapper[4747]: E1205 20:46:55.470836 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"badf3934e97b029b0428fe3f40049e9d790a0000f0245107d393c6a31be2b973\": container with ID starting with badf3934e97b029b0428fe3f40049e9d790a0000f0245107d393c6a31be2b973 not found: ID does not exist" containerID="badf3934e97b029b0428fe3f40049e9d790a0000f0245107d393c6a31be2b973" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.470860 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"badf3934e97b029b0428fe3f40049e9d790a0000f0245107d393c6a31be2b973"} err="failed to get container status \"badf3934e97b029b0428fe3f40049e9d790a0000f0245107d393c6a31be2b973\": rpc error: code = NotFound desc = could not find container \"badf3934e97b029b0428fe3f40049e9d790a0000f0245107d393c6a31be2b973\": container with ID starting with badf3934e97b029b0428fe3f40049e9d790a0000f0245107d393c6a31be2b973 not found: ID does not exist" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.470874 4747 scope.go:117] "RemoveContainer" containerID="ead1e49f8085bb1c5fca6294600f16a84526de5c0e40ab1c440a788fa7f1bd95" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.482676 4747 scope.go:117] "RemoveContainer" 
containerID="5b1dd639fc002fc05dd37613fe7282de80c0d6176544bdd1836a501611f3f1f6" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.494413 4747 scope.go:117] "RemoveContainer" containerID="74d2af20534657ec1dc752a6d005e2292bf9e00a4fa5faa654289b530fe98c31" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.513220 4747 scope.go:117] "RemoveContainer" containerID="ead1e49f8085bb1c5fca6294600f16a84526de5c0e40ab1c440a788fa7f1bd95" Dec 05 20:46:55 crc kubenswrapper[4747]: E1205 20:46:55.513727 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ead1e49f8085bb1c5fca6294600f16a84526de5c0e40ab1c440a788fa7f1bd95\": container with ID starting with ead1e49f8085bb1c5fca6294600f16a84526de5c0e40ab1c440a788fa7f1bd95 not found: ID does not exist" containerID="ead1e49f8085bb1c5fca6294600f16a84526de5c0e40ab1c440a788fa7f1bd95" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.513756 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead1e49f8085bb1c5fca6294600f16a84526de5c0e40ab1c440a788fa7f1bd95"} err="failed to get container status \"ead1e49f8085bb1c5fca6294600f16a84526de5c0e40ab1c440a788fa7f1bd95\": rpc error: code = NotFound desc = could not find container \"ead1e49f8085bb1c5fca6294600f16a84526de5c0e40ab1c440a788fa7f1bd95\": container with ID starting with ead1e49f8085bb1c5fca6294600f16a84526de5c0e40ab1c440a788fa7f1bd95 not found: ID does not exist" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.513779 4747 scope.go:117] "RemoveContainer" containerID="5b1dd639fc002fc05dd37613fe7282de80c0d6176544bdd1836a501611f3f1f6" Dec 05 20:46:55 crc kubenswrapper[4747]: E1205 20:46:55.514204 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b1dd639fc002fc05dd37613fe7282de80c0d6176544bdd1836a501611f3f1f6\": container with ID starting with 5b1dd639fc002fc05dd37613fe7282de80c0d6176544bdd1836a501611f3f1f6 not found: ID does not exist" containerID="5b1dd639fc002fc05dd37613fe7282de80c0d6176544bdd1836a501611f3f1f6" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.514226 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b1dd639fc002fc05dd37613fe7282de80c0d6176544bdd1836a501611f3f1f6"} err="failed to get container status \"5b1dd639fc002fc05dd37613fe7282de80c0d6176544bdd1836a501611f3f1f6\": rpc error: code = NotFound desc = could not find container \"5b1dd639fc002fc05dd37613fe7282de80c0d6176544bdd1836a501611f3f1f6\": container with ID starting with 5b1dd639fc002fc05dd37613fe7282de80c0d6176544bdd1836a501611f3f1f6 not found: ID does not exist" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.514258 4747 scope.go:117] "RemoveContainer" containerID="74d2af20534657ec1dc752a6d005e2292bf9e00a4fa5faa654289b530fe98c31" Dec 05 20:46:55 crc kubenswrapper[4747]: E1205 20:46:55.514670 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74d2af20534657ec1dc752a6d005e2292bf9e00a4fa5faa654289b530fe98c31\": container with ID starting with 74d2af20534657ec1dc752a6d005e2292bf9e00a4fa5faa654289b530fe98c31 not found: ID does not exist" containerID="74d2af20534657ec1dc752a6d005e2292bf9e00a4fa5faa654289b530fe98c31" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.514688 4747 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"74d2af20534657ec1dc752a6d005e2292bf9e00a4fa5faa654289b530fe98c31"} err="failed to get container status \"74d2af20534657ec1dc752a6d005e2292bf9e00a4fa5faa654289b530fe98c31\": rpc error: code = NotFound desc = could not find container \"74d2af20534657ec1dc752a6d005e2292bf9e00a4fa5faa654289b530fe98c31\": container with ID starting with 74d2af20534657ec1dc752a6d005e2292bf9e00a4fa5faa654289b530fe98c31 not found: ID does not exist" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.514700 4747 scope.go:117] "RemoveContainer" containerID="45286baeaa7e47334adb49042feacd4d1e3c32b9b6280b6c9fb30314b8a80ea9" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.527543 4747 scope.go:117] "RemoveContainer" containerID="695e055b1af5cec2e9f9b6c8054a928c19ec95a32178940e7a99b25159037893" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.542329 4747 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.542417 4747 scope.go:117] "RemoveContainer" containerID="ff6850cb37a533c06e08cdd4d6b1d3aa5e2da73393d4a6c7d778a3a682fbc989" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.542745 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f3bd5567d323b08de7ac034fd3d50f2032f4a4934b01235a2b915dfb376d2e34" gracePeriod=5 Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.557835 4747 scope.go:117] "RemoveContainer" containerID="45286baeaa7e47334adb49042feacd4d1e3c32b9b6280b6c9fb30314b8a80ea9" Dec 05 20:46:55 crc kubenswrapper[4747]: E1205 20:46:55.558302 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45286baeaa7e47334adb49042feacd4d1e3c32b9b6280b6c9fb30314b8a80ea9\": container with ID starting with 45286baeaa7e47334adb49042feacd4d1e3c32b9b6280b6c9fb30314b8a80ea9 not found: ID does not exist" containerID="45286baeaa7e47334adb49042feacd4d1e3c32b9b6280b6c9fb30314b8a80ea9" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.558346 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45286baeaa7e47334adb49042feacd4d1e3c32b9b6280b6c9fb30314b8a80ea9"} err="failed to get container status \"45286baeaa7e47334adb49042feacd4d1e3c32b9b6280b6c9fb30314b8a80ea9\": rpc error: code = NotFound desc = could not find container \"45286baeaa7e47334adb49042feacd4d1e3c32b9b6280b6c9fb30314b8a80ea9\": container with ID starting with 45286baeaa7e47334adb49042feacd4d1e3c32b9b6280b6c9fb30314b8a80ea9 not found: ID does not exist" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.558376 4747 scope.go:117] "RemoveContainer" containerID="695e055b1af5cec2e9f9b6c8054a928c19ec95a32178940e7a99b25159037893" Dec 05 20:46:55 crc kubenswrapper[4747]: E1205 20:46:55.558806 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"695e055b1af5cec2e9f9b6c8054a928c19ec95a32178940e7a99b25159037893\": container with ID starting with 695e055b1af5cec2e9f9b6c8054a928c19ec95a32178940e7a99b25159037893 not found: ID does not exist" containerID="695e055b1af5cec2e9f9b6c8054a928c19ec95a32178940e7a99b25159037893" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.558845 4747 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"695e055b1af5cec2e9f9b6c8054a928c19ec95a32178940e7a99b25159037893"} err="failed to get container status \"695e055b1af5cec2e9f9b6c8054a928c19ec95a32178940e7a99b25159037893\": rpc error: code = NotFound desc = could not find container \"695e055b1af5cec2e9f9b6c8054a928c19ec95a32178940e7a99b25159037893\": container with ID starting with 695e055b1af5cec2e9f9b6c8054a928c19ec95a32178940e7a99b25159037893 not found: ID does not exist" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.558859 4747 scope.go:117] "RemoveContainer" containerID="ff6850cb37a533c06e08cdd4d6b1d3aa5e2da73393d4a6c7d778a3a682fbc989" Dec 05 20:46:55 crc kubenswrapper[4747]: E1205 20:46:55.559079 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff6850cb37a533c06e08cdd4d6b1d3aa5e2da73393d4a6c7d778a3a682fbc989\": container with ID starting with ff6850cb37a533c06e08cdd4d6b1d3aa5e2da73393d4a6c7d778a3a682fbc989 not found: ID does not exist" containerID="ff6850cb37a533c06e08cdd4d6b1d3aa5e2da73393d4a6c7d778a3a682fbc989" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.559101 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff6850cb37a533c06e08cdd4d6b1d3aa5e2da73393d4a6c7d778a3a682fbc989"} err="failed to get container status \"ff6850cb37a533c06e08cdd4d6b1d3aa5e2da73393d4a6c7d778a3a682fbc989\": rpc error: code = NotFound desc = could not find container \"ff6850cb37a533c06e08cdd4d6b1d3aa5e2da73393d4a6c7d778a3a682fbc989\": container with ID starting with ff6850cb37a533c06e08cdd4d6b1d3aa5e2da73393d4a6c7d778a3a682fbc989 not found: ID does not exist" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.565003 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.608107 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.644294 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.724852 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.732163 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.804576 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.845632 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fca7ed3-fb68-43bd-8186-7d73d673098b" path="/var/lib/kubelet/pods/2fca7ed3-fb68-43bd-8186-7d73d673098b/volumes" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.846226 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5efe7146-f4a8-42c3-84d9-b974a2618203" path="/var/lib/kubelet/pods/5efe7146-f4a8-42c3-84d9-b974a2618203/volumes" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.846843 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dbcd101-ff16-4970-8433-37b2576e551b" 
path="/var/lib/kubelet/pods/6dbcd101-ff16-4970-8433-37b2576e551b/volumes" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.847823 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a99c798-6879-43bb-817c-621364a56b5a" path="/var/lib/kubelet/pods/8a99c798-6879-43bb-817c-621364a56b5a/volumes" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.848404 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7a9c2fc-a6db-4def-9938-f1da651566c8" path="/var/lib/kubelet/pods/c7a9c2fc-a6db-4def-9938-f1da651566c8/volumes" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.903879 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 20:46:55 crc kubenswrapper[4747]: I1205 20:46:55.949097 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.053182 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.083321 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.087026 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.117179 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.212350 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.251334 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.331078 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.451825 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.516969 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.548338 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.660222 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.726542 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.735859 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.752515 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.779792 
4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.854970 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.885680 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 20:46:56 crc kubenswrapper[4747]: I1205 20:46:56.954008 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 20:46:57 crc kubenswrapper[4747]: I1205 20:46:57.159124 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 20:46:57 crc kubenswrapper[4747]: I1205 20:46:57.182300 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 20:46:57 crc kubenswrapper[4747]: I1205 20:46:57.215398 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 20:46:57 crc kubenswrapper[4747]: I1205 20:46:57.239905 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 20:46:57 crc kubenswrapper[4747]: I1205 20:46:57.459802 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 20:46:57 crc kubenswrapper[4747]: I1205 20:46:57.642806 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 20:46:57 crc kubenswrapper[4747]: I1205 20:46:57.848503 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 20:46:58 crc kubenswrapper[4747]: I1205 20:46:58.020422 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 20:46:58 crc kubenswrapper[4747]: I1205 20:46:58.025211 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 20:46:58 crc kubenswrapper[4747]: I1205 20:46:58.052115 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 20:46:58 crc kubenswrapper[4747]: I1205 20:46:58.057242 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 20:46:58 crc kubenswrapper[4747]: I1205 20:46:58.246180 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 20:46:58 crc kubenswrapper[4747]: I1205 20:46:58.268060 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 20:46:58 crc kubenswrapper[4747]: I1205 20:46:58.336337 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 20:46:58 crc kubenswrapper[4747]: I1205 20:46:58.405557 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 20:46:58 crc kubenswrapper[4747]: I1205 20:46:58.524937 4747 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 20:46:58 crc kubenswrapper[4747]: I1205 20:46:58.816527 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 20:46:58 crc kubenswrapper[4747]: I1205 20:46:58.866895 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 20:46:59 crc kubenswrapper[4747]: I1205 20:46:59.011505 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 20:46:59 crc kubenswrapper[4747]: I1205 20:46:59.490435 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.117371 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.117759 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.236704 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.236798 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.236787 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.238041 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.238084 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.238111 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.238416 4747 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.238462 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.238501 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.238626 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.249208 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.339241 4747 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.339288 4747 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.339307 4747 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.339323 4747 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.350988 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.351046 4747 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f3bd5567d323b08de7ac034fd3d50f2032f4a4934b01235a2b915dfb376d2e34" exitCode=137 Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.351093 4747 scope.go:117] "RemoveContainer" containerID="f3bd5567d323b08de7ac034fd3d50f2032f4a4934b01235a2b915dfb376d2e34" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.351170 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.410348 4747 scope.go:117] "RemoveContainer" containerID="f3bd5567d323b08de7ac034fd3d50f2032f4a4934b01235a2b915dfb376d2e34" Dec 05 20:47:01 crc kubenswrapper[4747]: E1205 20:47:01.411678 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3bd5567d323b08de7ac034fd3d50f2032f4a4934b01235a2b915dfb376d2e34\": container with ID starting with f3bd5567d323b08de7ac034fd3d50f2032f4a4934b01235a2b915dfb376d2e34 not found: ID does not exist" containerID="f3bd5567d323b08de7ac034fd3d50f2032f4a4934b01235a2b915dfb376d2e34" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.411785 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3bd5567d323b08de7ac034fd3d50f2032f4a4934b01235a2b915dfb376d2e34"} err="failed to get container status \"f3bd5567d323b08de7ac034fd3d50f2032f4a4934b01235a2b915dfb376d2e34\": rpc error: code = NotFound desc = could not find container \"f3bd5567d323b08de7ac034fd3d50f2032f4a4934b01235a2b915dfb376d2e34\": container with ID starting with f3bd5567d323b08de7ac034fd3d50f2032f4a4934b01235a2b915dfb376d2e34 not found: ID does not exist" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.846956 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.847319 4747 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.856189 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.856220 4747 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c18c1a92-c25e-4364-b5d1-3540694a0e32" Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.859158 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 20:47:01 crc kubenswrapper[4747]: I1205 20:47:01.859193 4747 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c18c1a92-c25e-4364-b5d1-3540694a0e32" Dec 05 20:47:18 crc kubenswrapper[4747]: I1205 20:47:18.715824 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 20:47:19 crc kubenswrapper[4747]: I1205 20:47:19.667424 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sntqs"] Dec 05 20:47:19 crc kubenswrapper[4747]: I1205 20:47:19.667671 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" podUID="ec0e9905-637c-4a71-b816-12fefbb801d2" containerName="controller-manager" containerID="cri-o://56995b51bd9d2635e9aef2720b9f0e9863c017af87f034d9042e1a8eff18de6d" gracePeriod=30 Dec 05 20:47:19 crc kubenswrapper[4747]: I1205 20:47:19.757560 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5"] Dec 05 20:47:19 crc kubenswrapper[4747]: I1205 20:47:19.757808 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" podUID="1018779a-353b-4530-84e1-b52a044d69d5" containerName="route-controller-manager" containerID="cri-o://3eb0d0adfe8d278dcdcc069ddd2235463286944d2b75a358a7baeb8e5e6900f5" gracePeriod=30 Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.185111 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.223818 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1018779a-353b-4530-84e1-b52a044d69d5-serving-cert\") pod \"1018779a-353b-4530-84e1-b52a044d69d5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.223921 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1018779a-353b-4530-84e1-b52a044d69d5-config\") pod \"1018779a-353b-4530-84e1-b52a044d69d5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.223950 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prmgd\" (UniqueName: \"kubernetes.io/projected/1018779a-353b-4530-84e1-b52a044d69d5-kube-api-access-prmgd\") pod \"1018779a-353b-4530-84e1-b52a044d69d5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.224002 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1018779a-353b-4530-84e1-b52a044d69d5-client-ca\") pod \"1018779a-353b-4530-84e1-b52a044d69d5\" (UID: \"1018779a-353b-4530-84e1-b52a044d69d5\") " Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.224952 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1018779a-353b-4530-84e1-b52a044d69d5-client-ca" (OuterVolumeSpecName: "client-ca") pod "1018779a-353b-4530-84e1-b52a044d69d5" (UID: "1018779a-353b-4530-84e1-b52a044d69d5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.225425 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1018779a-353b-4530-84e1-b52a044d69d5-config" (OuterVolumeSpecName: "config") pod "1018779a-353b-4530-84e1-b52a044d69d5" (UID: "1018779a-353b-4530-84e1-b52a044d69d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.230353 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1018779a-353b-4530-84e1-b52a044d69d5-kube-api-access-prmgd" (OuterVolumeSpecName: "kube-api-access-prmgd") pod "1018779a-353b-4530-84e1-b52a044d69d5" (UID: "1018779a-353b-4530-84e1-b52a044d69d5"). InnerVolumeSpecName "kube-api-access-prmgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.233429 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1018779a-353b-4530-84e1-b52a044d69d5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1018779a-353b-4530-84e1-b52a044d69d5" (UID: "1018779a-353b-4530-84e1-b52a044d69d5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.325685 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1018779a-353b-4530-84e1-b52a044d69d5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.325720 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1018779a-353b-4530-84e1-b52a044d69d5-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.325730 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prmgd\" (UniqueName: \"kubernetes.io/projected/1018779a-353b-4530-84e1-b52a044d69d5-kube-api-access-prmgd\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.325739 4747 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1018779a-353b-4530-84e1-b52a044d69d5-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.461428 4747 generic.go:334] "Generic (PLEG): container finished" podID="1018779a-353b-4530-84e1-b52a044d69d5" containerID="3eb0d0adfe8d278dcdcc069ddd2235463286944d2b75a358a7baeb8e5e6900f5" exitCode=0 Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.461476 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" event={"ID":"1018779a-353b-4530-84e1-b52a044d69d5","Type":"ContainerDied","Data":"3eb0d0adfe8d278dcdcc069ddd2235463286944d2b75a358a7baeb8e5e6900f5"} Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.461499 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.461514 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5" event={"ID":"1018779a-353b-4530-84e1-b52a044d69d5","Type":"ContainerDied","Data":"1fec1e715976632c6c419ba3b2ddb58dbf9d0a3eb43230f2da12e4f196c82757"} Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.461554 4747 scope.go:117] "RemoveContainer" containerID="3eb0d0adfe8d278dcdcc069ddd2235463286944d2b75a358a7baeb8e5e6900f5" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.463629 4747 generic.go:334] "Generic (PLEG): container finished" podID="ec0e9905-637c-4a71-b816-12fefbb801d2" containerID="56995b51bd9d2635e9aef2720b9f0e9863c017af87f034d9042e1a8eff18de6d" exitCode=0 Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.463662 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" event={"ID":"ec0e9905-637c-4a71-b816-12fefbb801d2","Type":"ContainerDied","Data":"56995b51bd9d2635e9aef2720b9f0e9863c017af87f034d9042e1a8eff18de6d"} Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.468912 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.474147 4747 scope.go:117] "RemoveContainer" containerID="3eb0d0adfe8d278dcdcc069ddd2235463286944d2b75a358a7baeb8e5e6900f5" Dec 05 20:47:20 crc kubenswrapper[4747]: E1205 20:47:20.474517 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eb0d0adfe8d278dcdcc069ddd2235463286944d2b75a358a7baeb8e5e6900f5\": container with ID starting with 3eb0d0adfe8d278dcdcc069ddd2235463286944d2b75a358a7baeb8e5e6900f5 not found: ID does not exist" containerID="3eb0d0adfe8d278dcdcc069ddd2235463286944d2b75a358a7baeb8e5e6900f5" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.474544 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb0d0adfe8d278dcdcc069ddd2235463286944d2b75a358a7baeb8e5e6900f5"} err="failed to get container status \"3eb0d0adfe8d278dcdcc069ddd2235463286944d2b75a358a7baeb8e5e6900f5\": rpc error: code = NotFound desc = could not find container \"3eb0d0adfe8d278dcdcc069ddd2235463286944d2b75a358a7baeb8e5e6900f5\": container with ID starting with 3eb0d0adfe8d278dcdcc069ddd2235463286944d2b75a358a7baeb8e5e6900f5 not found: ID does not exist" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.502434 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5"] Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.504882 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l4dc5"] Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.528128 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-client-ca\") pod \"ec0e9905-637c-4a71-b816-12fefbb801d2\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.528196 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-config\") pod \"ec0e9905-637c-4a71-b816-12fefbb801d2\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.528259 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-proxy-ca-bundles\") pod \"ec0e9905-637c-4a71-b816-12fefbb801d2\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.528316 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq2fm\" (UniqueName: \"kubernetes.io/projected/ec0e9905-637c-4a71-b816-12fefbb801d2-kube-api-access-xq2fm\") pod \"ec0e9905-637c-4a71-b816-12fefbb801d2\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.528342 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec0e9905-637c-4a71-b816-12fefbb801d2-serving-cert\") pod \"ec0e9905-637c-4a71-b816-12fefbb801d2\" (UID: \"ec0e9905-637c-4a71-b816-12fefbb801d2\") " Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.528999 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-client-ca" (OuterVolumeSpecName: "client-ca") pod "ec0e9905-637c-4a71-b816-12fefbb801d2" (UID: "ec0e9905-637c-4a71-b816-12fefbb801d2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.529032 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ec0e9905-637c-4a71-b816-12fefbb801d2" (UID: "ec0e9905-637c-4a71-b816-12fefbb801d2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.529085 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-config" (OuterVolumeSpecName: "config") pod "ec0e9905-637c-4a71-b816-12fefbb801d2" (UID: "ec0e9905-637c-4a71-b816-12fefbb801d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.532437 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec0e9905-637c-4a71-b816-12fefbb801d2-kube-api-access-xq2fm" (OuterVolumeSpecName: "kube-api-access-xq2fm") pod "ec0e9905-637c-4a71-b816-12fefbb801d2" (UID: "ec0e9905-637c-4a71-b816-12fefbb801d2"). InnerVolumeSpecName "kube-api-access-xq2fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.533006 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec0e9905-637c-4a71-b816-12fefbb801d2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ec0e9905-637c-4a71-b816-12fefbb801d2" (UID: "ec0e9905-637c-4a71-b816-12fefbb801d2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.629809 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.629847 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq2fm\" (UniqueName: \"kubernetes.io/projected/ec0e9905-637c-4a71-b816-12fefbb801d2-kube-api-access-xq2fm\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.629858 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec0e9905-637c-4a71-b816-12fefbb801d2-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.629867 4747 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:20 crc kubenswrapper[4747]: I1205 20:47:20.629876 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec0e9905-637c-4a71-b816-12fefbb801d2-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.471424 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" event={"ID":"ec0e9905-637c-4a71-b816-12fefbb801d2","Type":"ContainerDied","Data":"54d8f408ddc8b683cc555741bb53001f9eaf42b542ac78598e17aae65e55c3d3"} Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.471770 4747 scope.go:117] "RemoveContainer" containerID="56995b51bd9d2635e9aef2720b9f0e9863c017af87f034d9042e1a8eff18de6d" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.471479 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sntqs" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.499679 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sntqs"] Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.501421 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sntqs"] Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.850295 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1018779a-353b-4530-84e1-b52a044d69d5" path="/var/lib/kubelet/pods/1018779a-353b-4530-84e1-b52a044d69d5/volumes" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.851057 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec0e9905-637c-4a71-b816-12fefbb801d2" path="/var/lib/kubelet/pods/ec0e9905-637c-4a71-b816-12fefbb801d2/volumes" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.873886 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-856f74b64f-rlfnl"] Dec 05 20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874335 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbcd101-ff16-4970-8433-37b2576e551b" containerName="extract-utilities" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874377 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbcd101-ff16-4970-8433-37b2576e551b" containerName="extract-utilities" Dec 05 20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874409 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a99c798-6879-43bb-817c-621364a56b5a" containerName="registry-server" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874428 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a99c798-6879-43bb-817c-621364a56b5a" containerName="registry-server" Dec 05 20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874450 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874467 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874501 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5efe7146-f4a8-42c3-84d9-b974a2618203" containerName="extract-content" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874518 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5efe7146-f4a8-42c3-84d9-b974a2618203" containerName="extract-content" Dec 05 20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874537 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fca7ed3-fb68-43bd-8186-7d73d673098b" containerName="extract-utilities" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874552 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fca7ed3-fb68-43bd-8186-7d73d673098b" containerName="extract-utilities" Dec 05 20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874570 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5efe7146-f4a8-42c3-84d9-b974a2618203" containerName="registry-server" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874618 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5efe7146-f4a8-42c3-84d9-b974a2618203" containerName="registry-server" Dec 05 
20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874641 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5efe7146-f4a8-42c3-84d9-b974a2618203" containerName="extract-utilities" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874657 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5efe7146-f4a8-42c3-84d9-b974a2618203" containerName="extract-utilities" Dec 05 20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874674 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbcd101-ff16-4970-8433-37b2576e551b" containerName="extract-content" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874691 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbcd101-ff16-4970-8433-37b2576e551b" containerName="extract-content" Dec 05 20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874712 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fca7ed3-fb68-43bd-8186-7d73d673098b" containerName="extract-content" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874729 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fca7ed3-fb68-43bd-8186-7d73d673098b" containerName="extract-content" Dec 05 20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874750 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec0e9905-637c-4a71-b816-12fefbb801d2" containerName="controller-manager" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874766 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec0e9905-637c-4a71-b816-12fefbb801d2" containerName="controller-manager" Dec 05 20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874786 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a9c2fc-a6db-4def-9938-f1da651566c8" containerName="marketplace-operator" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874801 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a9c2fc-a6db-4def-9938-f1da651566c8" containerName="marketplace-operator" Dec 05 20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874830 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a99c798-6879-43bb-817c-621364a56b5a" containerName="extract-utilities" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874846 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a99c798-6879-43bb-817c-621364a56b5a" containerName="extract-utilities" Dec 05 20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874867 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a99c798-6879-43bb-817c-621364a56b5a" containerName="extract-content" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874883 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a99c798-6879-43bb-817c-621364a56b5a" containerName="extract-content" Dec 05 20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874901 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1018779a-353b-4530-84e1-b52a044d69d5" containerName="route-controller-manager" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874918 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1018779a-353b-4530-84e1-b52a044d69d5" containerName="route-controller-manager" Dec 05 20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874936 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fca7ed3-fb68-43bd-8186-7d73d673098b" containerName="registry-server" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874957 4747 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2fca7ed3-fb68-43bd-8186-7d73d673098b" containerName="registry-server" Dec 05 20:47:21 crc kubenswrapper[4747]: E1205 20:47:21.874981 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbcd101-ff16-4970-8433-37b2576e551b" containerName="registry-server" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.874998 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbcd101-ff16-4970-8433-37b2576e551b" containerName="registry-server" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.875199 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fca7ed3-fb68-43bd-8186-7d73d673098b" containerName="registry-server" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.875228 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5efe7146-f4a8-42c3-84d9-b974a2618203" containerName="registry-server" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.875254 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a99c798-6879-43bb-817c-621364a56b5a" containerName="registry-server" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.875278 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1018779a-353b-4530-84e1-b52a044d69d5" containerName="route-controller-manager" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.875300 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7a9c2fc-a6db-4def-9938-f1da651566c8" containerName="marketplace-operator" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.875324 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec0e9905-637c-4a71-b816-12fefbb801d2" containerName="controller-manager" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.875345 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dbcd101-ff16-4970-8433-37b2576e551b" containerName="registry-server" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.875360 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.876399 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.876565 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5"] Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.877279 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.881403 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.882850 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.883121 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.883406 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.885021 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.885111 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.885803 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.886275 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.886504 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.886932 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.887141 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.887265 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.898866 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.902398 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5"] Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.904915 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-856f74b64f-rlfnl"] Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.947070 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d3e8f11-d15e-4ab9-bd2d-80c288da67dd-client-ca\") pod \"route-controller-manager-58f7f46d9f-j4hr5\" (UID: \"3d3e8f11-d15e-4ab9-bd2d-80c288da67dd\") " pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.947189 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-proxy-ca-bundles\") pod \"controller-manager-856f74b64f-rlfnl\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.947288 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-config\") pod \"controller-manager-856f74b64f-rlfnl\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.947340 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtcnk\" (UniqueName: \"kubernetes.io/projected/aed1d1c9-d7a2-410c-acb3-1c380394ea54-kube-api-access-wtcnk\") pod \"controller-manager-856f74b64f-rlfnl\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.947416 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-client-ca\") pod \"controller-manager-856f74b64f-rlfnl\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.947644 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8cth\" (UniqueName: \"kubernetes.io/projected/3d3e8f11-d15e-4ab9-bd2d-80c288da67dd-kube-api-access-q8cth\") pod \"route-controller-manager-58f7f46d9f-j4hr5\" (UID: \"3d3e8f11-d15e-4ab9-bd2d-80c288da67dd\") " pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.947730 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d3e8f11-d15e-4ab9-bd2d-80c288da67dd-serving-cert\") pod \"route-controller-manager-58f7f46d9f-j4hr5\" (UID: \"3d3e8f11-d15e-4ab9-bd2d-80c288da67dd\") " pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.947832 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aed1d1c9-d7a2-410c-acb3-1c380394ea54-serving-cert\") pod \"controller-manager-856f74b64f-rlfnl\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:21 crc kubenswrapper[4747]: I1205 20:47:21.947861 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d3e8f11-d15e-4ab9-bd2d-80c288da67dd-config\") pod \"route-controller-manager-58f7f46d9f-j4hr5\" (UID: \"3d3e8f11-d15e-4ab9-bd2d-80c288da67dd\") " pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.048679 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-config\") pod \"controller-manager-856f74b64f-rlfnl\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.048732 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtcnk\" (UniqueName: \"kubernetes.io/projected/aed1d1c9-d7a2-410c-acb3-1c380394ea54-kube-api-access-wtcnk\") pod \"controller-manager-856f74b64f-rlfnl\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.048785 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-client-ca\") pod \"controller-manager-856f74b64f-rlfnl\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.048816 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8cth\" (UniqueName: \"kubernetes.io/projected/3d3e8f11-d15e-4ab9-bd2d-80c288da67dd-kube-api-access-q8cth\") pod \"route-controller-manager-58f7f46d9f-j4hr5\" (UID: \"3d3e8f11-d15e-4ab9-bd2d-80c288da67dd\") " pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.048842 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d3e8f11-d15e-4ab9-bd2d-80c288da67dd-serving-cert\") pod \"route-controller-manager-58f7f46d9f-j4hr5\" (UID: \"3d3e8f11-d15e-4ab9-bd2d-80c288da67dd\") " pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.048880 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aed1d1c9-d7a2-410c-acb3-1c380394ea54-serving-cert\") pod \"controller-manager-856f74b64f-rlfnl\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.048899 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d3e8f11-d15e-4ab9-bd2d-80c288da67dd-config\") pod \"route-controller-manager-58f7f46d9f-j4hr5\" (UID: \"3d3e8f11-d15e-4ab9-bd2d-80c288da67dd\") " pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.048918 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d3e8f11-d15e-4ab9-bd2d-80c288da67dd-client-ca\") pod \"route-controller-manager-58f7f46d9f-j4hr5\" (UID: \"3d3e8f11-d15e-4ab9-bd2d-80c288da67dd\") " pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.048955 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-proxy-ca-bundles\") pod 
\"controller-manager-856f74b64f-rlfnl\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.050330 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-client-ca\") pod \"controller-manager-856f74b64f-rlfnl\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.051183 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d3e8f11-d15e-4ab9-bd2d-80c288da67dd-config\") pod \"route-controller-manager-58f7f46d9f-j4hr5\" (UID: \"3d3e8f11-d15e-4ab9-bd2d-80c288da67dd\") " pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.051216 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-proxy-ca-bundles\") pod \"controller-manager-856f74b64f-rlfnl\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.051425 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-config\") pod \"controller-manager-856f74b64f-rlfnl\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.051565 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d3e8f11-d15e-4ab9-bd2d-80c288da67dd-client-ca\") pod \"route-controller-manager-58f7f46d9f-j4hr5\" (UID: \"3d3e8f11-d15e-4ab9-bd2d-80c288da67dd\") " pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.055418 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aed1d1c9-d7a2-410c-acb3-1c380394ea54-serving-cert\") pod \"controller-manager-856f74b64f-rlfnl\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.055802 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d3e8f11-d15e-4ab9-bd2d-80c288da67dd-serving-cert\") pod \"route-controller-manager-58f7f46d9f-j4hr5\" (UID: \"3d3e8f11-d15e-4ab9-bd2d-80c288da67dd\") " pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.065118 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtcnk\" (UniqueName: \"kubernetes.io/projected/aed1d1c9-d7a2-410c-acb3-1c380394ea54-kube-api-access-wtcnk\") pod \"controller-manager-856f74b64f-rlfnl\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.067812 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8cth\" (UniqueName: \"kubernetes.io/projected/3d3e8f11-d15e-4ab9-bd2d-80c288da67dd-kube-api-access-q8cth\") pod \"route-controller-manager-58f7f46d9f-j4hr5\" (UID: \"3d3e8f11-d15e-4ab9-bd2d-80c288da67dd\") " pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.210361 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.238610 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.482215 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-856f74b64f-rlfnl"] Dec 05 20:47:22 crc kubenswrapper[4747]: I1205 20:47:22.517195 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5"] Dec 05 20:47:22 crc kubenswrapper[4747]: W1205 20:47:22.522485 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d3e8f11_d15e_4ab9_bd2d_80c288da67dd.slice/crio-eb9e7daddbf3284a424a0822c5a7b279f334f9f8ab729dba5b5a0848f4066ea6 WatchSource:0}: Error finding container eb9e7daddbf3284a424a0822c5a7b279f334f9f8ab729dba5b5a0848f4066ea6: Status 404 returned error can't find the container with id eb9e7daddbf3284a424a0822c5a7b279f334f9f8ab729dba5b5a0848f4066ea6 Dec 05 20:47:23 crc kubenswrapper[4747]: I1205 20:47:23.486265 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" event={"ID":"3d3e8f11-d15e-4ab9-bd2d-80c288da67dd","Type":"ContainerStarted","Data":"20818402969bf4a52adc44d28fa610ff0baa58396a9cd01be200267a6ab2b21c"} Dec 05 20:47:23 crc kubenswrapper[4747]: I1205 20:47:23.486767 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" event={"ID":"3d3e8f11-d15e-4ab9-bd2d-80c288da67dd","Type":"ContainerStarted","Data":"eb9e7daddbf3284a424a0822c5a7b279f334f9f8ab729dba5b5a0848f4066ea6"} Dec 05 20:47:23 crc kubenswrapper[4747]: I1205 20:47:23.486794 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:23 crc kubenswrapper[4747]: I1205 20:47:23.487936 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" event={"ID":"aed1d1c9-d7a2-410c-acb3-1c380394ea54","Type":"ContainerStarted","Data":"5e5d14b8309629786092dab66bb1308a0164e6d0c908077f52f4ced40338ac14"} Dec 05 20:47:23 crc kubenswrapper[4747]: I1205 20:47:23.487968 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" event={"ID":"aed1d1c9-d7a2-410c-acb3-1c380394ea54","Type":"ContainerStarted","Data":"130988fb1768ba53405d26f5da2bfd5dc3ae3e0222d02cf31411d77e2404fe6f"} Dec 05 20:47:23 crc kubenswrapper[4747]: I1205 20:47:23.488203 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 
20:47:23 crc kubenswrapper[4747]: I1205 20:47:23.492051 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" Dec 05 20:47:23 crc kubenswrapper[4747]: I1205 20:47:23.494564 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:23 crc kubenswrapper[4747]: I1205 20:47:23.507300 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58f7f46d9f-j4hr5" podStartSLOduration=4.507275153 podStartE2EDuration="4.507275153s" podCreationTimestamp="2025-12-05 20:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:47:23.503802514 +0000 UTC m=+313.971110042" watchObservedRunningTime="2025-12-05 20:47:23.507275153 +0000 UTC m=+313.974582671" Dec 05 20:47:23 crc kubenswrapper[4747]: I1205 20:47:23.524935 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" podStartSLOduration=4.524901448 podStartE2EDuration="4.524901448s" podCreationTimestamp="2025-12-05 20:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:47:23.523390405 +0000 UTC m=+313.990697893" watchObservedRunningTime="2025-12-05 20:47:23.524901448 +0000 UTC m=+313.992209006" Dec 05 20:47:39 crc kubenswrapper[4747]: I1205 20:47:39.663230 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-856f74b64f-rlfnl"] Dec 05 20:47:39 crc kubenswrapper[4747]: I1205 20:47:39.664197 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" podUID="aed1d1c9-d7a2-410c-acb3-1c380394ea54" containerName="controller-manager" containerID="cri-o://5e5d14b8309629786092dab66bb1308a0164e6d0c908077f52f4ced40338ac14" gracePeriod=30 Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.577544 4747 generic.go:334] "Generic (PLEG): container finished" podID="aed1d1c9-d7a2-410c-acb3-1c380394ea54" containerID="5e5d14b8309629786092dab66bb1308a0164e6d0c908077f52f4ced40338ac14" exitCode=0 Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.577883 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" event={"ID":"aed1d1c9-d7a2-410c-acb3-1c380394ea54","Type":"ContainerDied","Data":"5e5d14b8309629786092dab66bb1308a0164e6d0c908077f52f4ced40338ac14"} Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.734750 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.770181 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd"] Dec 05 20:47:40 crc kubenswrapper[4747]: E1205 20:47:40.770429 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed1d1c9-d7a2-410c-acb3-1c380394ea54" containerName="controller-manager" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.770444 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed1d1c9-d7a2-410c-acb3-1c380394ea54" containerName="controller-manager" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.770568 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed1d1c9-d7a2-410c-acb3-1c380394ea54" containerName="controller-manager" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.771035 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.793701 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd"] Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.833872 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54820c10-b7f8-472e-a8d9-5ef4815b53a0-config\") pod \"controller-manager-54d5cc9f99-dsqxd\" (UID: \"54820c10-b7f8-472e-a8d9-5ef4815b53a0\") " pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.834108 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/54820c10-b7f8-472e-a8d9-5ef4815b53a0-proxy-ca-bundles\") pod \"controller-manager-54d5cc9f99-dsqxd\" (UID: \"54820c10-b7f8-472e-a8d9-5ef4815b53a0\") " pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.834220 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9hq5\" (UniqueName: \"kubernetes.io/projected/54820c10-b7f8-472e-a8d9-5ef4815b53a0-kube-api-access-s9hq5\") pod \"controller-manager-54d5cc9f99-dsqxd\" (UID: \"54820c10-b7f8-472e-a8d9-5ef4815b53a0\") " pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.834272 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54820c10-b7f8-472e-a8d9-5ef4815b53a0-serving-cert\") pod \"controller-manager-54d5cc9f99-dsqxd\" (UID: \"54820c10-b7f8-472e-a8d9-5ef4815b53a0\") " pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.834378 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54820c10-b7f8-472e-a8d9-5ef4815b53a0-client-ca\") pod \"controller-manager-54d5cc9f99-dsqxd\" (UID: \"54820c10-b7f8-472e-a8d9-5ef4815b53a0\") " pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.935279 4747 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aed1d1c9-d7a2-410c-acb3-1c380394ea54-serving-cert\") pod \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.935397 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-client-ca\") pod \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.935455 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-config\") pod \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.935517 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-proxy-ca-bundles\") pod \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.935613 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtcnk\" (UniqueName: \"kubernetes.io/projected/aed1d1c9-d7a2-410c-acb3-1c380394ea54-kube-api-access-wtcnk\") pod \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\" (UID: \"aed1d1c9-d7a2-410c-acb3-1c380394ea54\") " Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.935817 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9hq5\" (UniqueName: \"kubernetes.io/projected/54820c10-b7f8-472e-a8d9-5ef4815b53a0-kube-api-access-s9hq5\") pod \"controller-manager-54d5cc9f99-dsqxd\" (UID: \"54820c10-b7f8-472e-a8d9-5ef4815b53a0\") " pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.935880 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54820c10-b7f8-472e-a8d9-5ef4815b53a0-serving-cert\") pod \"controller-manager-54d5cc9f99-dsqxd\" (UID: \"54820c10-b7f8-472e-a8d9-5ef4815b53a0\") " pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.935975 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54820c10-b7f8-472e-a8d9-5ef4815b53a0-client-ca\") pod \"controller-manager-54d5cc9f99-dsqxd\" (UID: \"54820c10-b7f8-472e-a8d9-5ef4815b53a0\") " pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.936075 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54820c10-b7f8-472e-a8d9-5ef4815b53a0-config\") pod \"controller-manager-54d5cc9f99-dsqxd\" (UID: \"54820c10-b7f8-472e-a8d9-5ef4815b53a0\") " pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.936354 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/54820c10-b7f8-472e-a8d9-5ef4815b53a0-proxy-ca-bundles\") pod \"controller-manager-54d5cc9f99-dsqxd\" (UID: \"54820c10-b7f8-472e-a8d9-5ef4815b53a0\") " pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.936367 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-client-ca" (OuterVolumeSpecName: "client-ca") pod "aed1d1c9-d7a2-410c-acb3-1c380394ea54" (UID: "aed1d1c9-d7a2-410c-acb3-1c380394ea54"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.936971 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "aed1d1c9-d7a2-410c-acb3-1c380394ea54" (UID: "aed1d1c9-d7a2-410c-acb3-1c380394ea54"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.938636 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54820c10-b7f8-472e-a8d9-5ef4815b53a0-client-ca\") pod \"controller-manager-54d5cc9f99-dsqxd\" (UID: \"54820c10-b7f8-472e-a8d9-5ef4815b53a0\") " pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.937694 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-config" (OuterVolumeSpecName: "config") pod "aed1d1c9-d7a2-410c-acb3-1c380394ea54" (UID: "aed1d1c9-d7a2-410c-acb3-1c380394ea54"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.938986 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/54820c10-b7f8-472e-a8d9-5ef4815b53a0-proxy-ca-bundles\") pod \"controller-manager-54d5cc9f99-dsqxd\" (UID: \"54820c10-b7f8-472e-a8d9-5ef4815b53a0\") " pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.941219 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54820c10-b7f8-472e-a8d9-5ef4815b53a0-config\") pod \"controller-manager-54d5cc9f99-dsqxd\" (UID: \"54820c10-b7f8-472e-a8d9-5ef4815b53a0\") " pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.948828 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed1d1c9-d7a2-410c-acb3-1c380394ea54-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "aed1d1c9-d7a2-410c-acb3-1c380394ea54" (UID: "aed1d1c9-d7a2-410c-acb3-1c380394ea54"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.949640 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aed1d1c9-d7a2-410c-acb3-1c380394ea54-kube-api-access-wtcnk" (OuterVolumeSpecName: "kube-api-access-wtcnk") pod "aed1d1c9-d7a2-410c-acb3-1c380394ea54" (UID: "aed1d1c9-d7a2-410c-acb3-1c380394ea54"). 
InnerVolumeSpecName "kube-api-access-wtcnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.957941 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54820c10-b7f8-472e-a8d9-5ef4815b53a0-serving-cert\") pod \"controller-manager-54d5cc9f99-dsqxd\" (UID: \"54820c10-b7f8-472e-a8d9-5ef4815b53a0\") " pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:40 crc kubenswrapper[4747]: I1205 20:47:40.966796 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9hq5\" (UniqueName: \"kubernetes.io/projected/54820c10-b7f8-472e-a8d9-5ef4815b53a0-kube-api-access-s9hq5\") pod \"controller-manager-54d5cc9f99-dsqxd\" (UID: \"54820c10-b7f8-472e-a8d9-5ef4815b53a0\") " pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.038141 4747 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.038463 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.038478 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aed1d1c9-d7a2-410c-acb3-1c380394ea54-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.038490 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtcnk\" (UniqueName: \"kubernetes.io/projected/aed1d1c9-d7a2-410c-acb3-1c380394ea54-kube-api-access-wtcnk\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.038530 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aed1d1c9-d7a2-410c-acb3-1c380394ea54-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.098014 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.349472 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd"] Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.583965 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" event={"ID":"aed1d1c9-d7a2-410c-acb3-1c380394ea54","Type":"ContainerDied","Data":"130988fb1768ba53405d26f5da2bfd5dc3ae3e0222d02cf31411d77e2404fe6f"} Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.584076 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-856f74b64f-rlfnl" Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.584533 4747 scope.go:117] "RemoveContainer" containerID="5e5d14b8309629786092dab66bb1308a0164e6d0c908077f52f4ced40338ac14" Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.586020 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" event={"ID":"54820c10-b7f8-472e-a8d9-5ef4815b53a0","Type":"ContainerStarted","Data":"149569a50c58e8ec3348aa87e87822e03b5b1e90a454a74d8eb0283299c3de2d"} Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.586083 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" event={"ID":"54820c10-b7f8-472e-a8d9-5ef4815b53a0","Type":"ContainerStarted","Data":"de719d5ffabb6a5f98eb886745915c46c88e4bc7d032b822d0e78bfda0278ed2"} Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.586344 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.591096 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.612708 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54d5cc9f99-dsqxd" podStartSLOduration=2.612689545 podStartE2EDuration="2.612689545s" podCreationTimestamp="2025-12-05 20:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:47:41.609590376 +0000 UTC m=+332.076897864" watchObservedRunningTime="2025-12-05 20:47:41.612689545 +0000 UTC m=+332.079997033" Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.645019 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-856f74b64f-rlfnl"] Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.650532 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-856f74b64f-rlfnl"] Dec 05 20:47:41 crc kubenswrapper[4747]: I1205 20:47:41.846396 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aed1d1c9-d7a2-410c-acb3-1c380394ea54" path="/var/lib/kubelet/pods/aed1d1c9-d7a2-410c-acb3-1c380394ea54/volumes" Dec 05 20:47:59 crc kubenswrapper[4747]: I1205 20:47:59.776817 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cnfnj"] Dec 05 20:47:59 crc kubenswrapper[4747]: I1205 20:47:59.779027 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cnfnj" Dec 05 20:47:59 crc kubenswrapper[4747]: I1205 20:47:59.781633 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 20:47:59 crc kubenswrapper[4747]: I1205 20:47:59.788507 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cnfnj"] Dec 05 20:47:59 crc kubenswrapper[4747]: I1205 20:47:59.909770 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a86291-ce6c-4ab0-afc5-a87fbbc0fa78-catalog-content\") pod \"community-operators-cnfnj\" (UID: \"96a86291-ce6c-4ab0-afc5-a87fbbc0fa78\") " pod="openshift-marketplace/community-operators-cnfnj" Dec 05 20:47:59 crc kubenswrapper[4747]: I1205 20:47:59.909814 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwf2k\" (UniqueName: \"kubernetes.io/projected/96a86291-ce6c-4ab0-afc5-a87fbbc0fa78-kube-api-access-nwf2k\") pod \"community-operators-cnfnj\" (UID: \"96a86291-ce6c-4ab0-afc5-a87fbbc0fa78\") " pod="openshift-marketplace/community-operators-cnfnj" Dec 05 20:47:59 crc kubenswrapper[4747]: I1205 20:47:59.909883 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a86291-ce6c-4ab0-afc5-a87fbbc0fa78-utilities\") pod \"community-operators-cnfnj\" (UID: \"96a86291-ce6c-4ab0-afc5-a87fbbc0fa78\") " pod="openshift-marketplace/community-operators-cnfnj" Dec 05 20:47:59 crc kubenswrapper[4747]: I1205 20:47:59.971458 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d74r5"] Dec 05 20:47:59 crc kubenswrapper[4747]: I1205 20:47:59.972369 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d74r5" Dec 05 20:47:59 crc kubenswrapper[4747]: I1205 20:47:59.975048 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 20:47:59 crc kubenswrapper[4747]: I1205 20:47:59.988137 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d74r5"] Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.013710 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a86291-ce6c-4ab0-afc5-a87fbbc0fa78-catalog-content\") pod \"community-operators-cnfnj\" (UID: \"96a86291-ce6c-4ab0-afc5-a87fbbc0fa78\") " pod="openshift-marketplace/community-operators-cnfnj" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.013782 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwf2k\" (UniqueName: \"kubernetes.io/projected/96a86291-ce6c-4ab0-afc5-a87fbbc0fa78-kube-api-access-nwf2k\") pod \"community-operators-cnfnj\" (UID: \"96a86291-ce6c-4ab0-afc5-a87fbbc0fa78\") " pod="openshift-marketplace/community-operators-cnfnj" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.013860 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a86291-ce6c-4ab0-afc5-a87fbbc0fa78-utilities\") pod \"community-operators-cnfnj\" (UID: \"96a86291-ce6c-4ab0-afc5-a87fbbc0fa78\") " pod="openshift-marketplace/community-operators-cnfnj" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.014910 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a86291-ce6c-4ab0-afc5-a87fbbc0fa78-utilities\") pod \"community-operators-cnfnj\" (UID: \"96a86291-ce6c-4ab0-afc5-a87fbbc0fa78\") " pod="openshift-marketplace/community-operators-cnfnj" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.015101 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a86291-ce6c-4ab0-afc5-a87fbbc0fa78-catalog-content\") pod \"community-operators-cnfnj\" (UID: \"96a86291-ce6c-4ab0-afc5-a87fbbc0fa78\") " pod="openshift-marketplace/community-operators-cnfnj" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.037465 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwf2k\" (UniqueName: \"kubernetes.io/projected/96a86291-ce6c-4ab0-afc5-a87fbbc0fa78-kube-api-access-nwf2k\") pod \"community-operators-cnfnj\" (UID: \"96a86291-ce6c-4ab0-afc5-a87fbbc0fa78\") " pod="openshift-marketplace/community-operators-cnfnj" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.104066 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cnfnj" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.115810 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5799e2a0-e208-4e0b-b757-d65fe1f2f859-utilities\") pod \"redhat-marketplace-d74r5\" (UID: \"5799e2a0-e208-4e0b-b757-d65fe1f2f859\") " pod="openshift-marketplace/redhat-marketplace-d74r5" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.115927 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6tr6\" (UniqueName: \"kubernetes.io/projected/5799e2a0-e208-4e0b-b757-d65fe1f2f859-kube-api-access-f6tr6\") pod \"redhat-marketplace-d74r5\" (UID: \"5799e2a0-e208-4e0b-b757-d65fe1f2f859\") " pod="openshift-marketplace/redhat-marketplace-d74r5" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.115956 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5799e2a0-e208-4e0b-b757-d65fe1f2f859-catalog-content\") pod \"redhat-marketplace-d74r5\" (UID: \"5799e2a0-e208-4e0b-b757-d65fe1f2f859\") " pod="openshift-marketplace/redhat-marketplace-d74r5" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.217624 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5799e2a0-e208-4e0b-b757-d65fe1f2f859-utilities\") pod \"redhat-marketplace-d74r5\" (UID: \"5799e2a0-e208-4e0b-b757-d65fe1f2f859\") " pod="openshift-marketplace/redhat-marketplace-d74r5" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.217694 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6tr6\" (UniqueName: \"kubernetes.io/projected/5799e2a0-e208-4e0b-b757-d65fe1f2f859-kube-api-access-f6tr6\") pod \"redhat-marketplace-d74r5\" (UID: \"5799e2a0-e208-4e0b-b757-d65fe1f2f859\") " pod="openshift-marketplace/redhat-marketplace-d74r5" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.217718 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5799e2a0-e208-4e0b-b757-d65fe1f2f859-catalog-content\") pod \"redhat-marketplace-d74r5\" (UID: \"5799e2a0-e208-4e0b-b757-d65fe1f2f859\") " pod="openshift-marketplace/redhat-marketplace-d74r5" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.218105 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5799e2a0-e208-4e0b-b757-d65fe1f2f859-catalog-content\") pod \"redhat-marketplace-d74r5\" (UID: \"5799e2a0-e208-4e0b-b757-d65fe1f2f859\") " pod="openshift-marketplace/redhat-marketplace-d74r5" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.218226 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5799e2a0-e208-4e0b-b757-d65fe1f2f859-utilities\") pod \"redhat-marketplace-d74r5\" (UID: \"5799e2a0-e208-4e0b-b757-d65fe1f2f859\") " pod="openshift-marketplace/redhat-marketplace-d74r5" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.242003 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6tr6\" (UniqueName: \"kubernetes.io/projected/5799e2a0-e208-4e0b-b757-d65fe1f2f859-kube-api-access-f6tr6\") pod 
\"redhat-marketplace-d74r5\" (UID: \"5799e2a0-e208-4e0b-b757-d65fe1f2f859\") " pod="openshift-marketplace/redhat-marketplace-d74r5" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.296428 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d74r5" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.478200 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f9lnr"] Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.479984 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.484428 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f9lnr"] Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.535485 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cnfnj"] Dec 05 20:48:00 crc kubenswrapper[4747]: W1205 20:48:00.537005 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a86291_ce6c_4ab0_afc5_a87fbbc0fa78.slice/crio-22693b678e98626f91188a0c45252159da7d2f12d94bb582c0d5fa063e0751d2 WatchSource:0}: Error finding container 22693b678e98626f91188a0c45252159da7d2f12d94bb582c0d5fa063e0751d2: Status 404 returned error can't find the container with id 22693b678e98626f91188a0c45252159da7d2f12d94bb582c0d5fa063e0751d2 Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.623011 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c810228-9627-49a8-87e6-516175280d1b-trusted-ca\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.623060 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c810228-9627-49a8-87e6-516175280d1b-bound-sa-token\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.623100 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.623123 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4c810228-9627-49a8-87e6-516175280d1b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.623282 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4bdt\" (UniqueName: 
\"kubernetes.io/projected/4c810228-9627-49a8-87e6-516175280d1b-kube-api-access-v4bdt\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.623327 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4c810228-9627-49a8-87e6-516175280d1b-registry-certificates\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.623368 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c810228-9627-49a8-87e6-516175280d1b-registry-tls\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.624534 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4c810228-9627-49a8-87e6-516175280d1b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.650737 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.711073 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnfnj" event={"ID":"96a86291-ce6c-4ab0-afc5-a87fbbc0fa78","Type":"ContainerStarted","Data":"22693b678e98626f91188a0c45252159da7d2f12d94bb582c0d5fa063e0751d2"} Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.725807 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c810228-9627-49a8-87e6-516175280d1b-trusted-ca\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.725851 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c810228-9627-49a8-87e6-516175280d1b-bound-sa-token\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.725872 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4c810228-9627-49a8-87e6-516175280d1b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" 
Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.725900 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4bdt\" (UniqueName: \"kubernetes.io/projected/4c810228-9627-49a8-87e6-516175280d1b-kube-api-access-v4bdt\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr"
Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.725914 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4c810228-9627-49a8-87e6-516175280d1b-registry-certificates\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr"
Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.725934 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c810228-9627-49a8-87e6-516175280d1b-registry-tls\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr"
Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.725953 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4c810228-9627-49a8-87e6-516175280d1b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr"
Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.726844 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4c810228-9627-49a8-87e6-516175280d1b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr"
Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.728382 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c810228-9627-49a8-87e6-516175280d1b-trusted-ca\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr"
Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.728769 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4c810228-9627-49a8-87e6-516175280d1b-registry-certificates\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr"
Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.731611 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c810228-9627-49a8-87e6-516175280d1b-registry-tls\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr"
Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.731706 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4c810228-9627-49a8-87e6-516175280d1b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr"
Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.742485 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c810228-9627-49a8-87e6-516175280d1b-bound-sa-token\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr"
Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.743084 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4bdt\" (UniqueName: \"kubernetes.io/projected/4c810228-9627-49a8-87e6-516175280d1b-kube-api-access-v4bdt\") pod \"image-registry-66df7c8f76-f9lnr\" (UID: \"4c810228-9627-49a8-87e6-516175280d1b\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr"
Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.770616 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d74r5"]
Dec 05 20:48:00 crc kubenswrapper[4747]: W1205 20:48:00.778394 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5799e2a0_e208_4e0b_b757_d65fe1f2f859.slice/crio-9c96982edc1985e4bceff4a216ff7325a23505d53c4f7356be49b66c26e26c0c WatchSource:0}: Error finding container 9c96982edc1985e4bceff4a216ff7325a23505d53c4f7356be49b66c26e26c0c: Status 404 returned error can't find the container with id 9c96982edc1985e4bceff4a216ff7325a23505d53c4f7356be49b66c26e26c0c
Dec 05 20:48:00 crc kubenswrapper[4747]: I1205 20:48:00.800937 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr"
Dec 05 20:48:01 crc kubenswrapper[4747]: I1205 20:48:01.219306 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f9lnr"]
Dec 05 20:48:01 crc kubenswrapper[4747]: W1205 20:48:01.229869 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c810228_9627_49a8_87e6_516175280d1b.slice/crio-d700efc28fdc7002265f74ced253e131bd72261af17bf8ffbe8a5240a983e0d2 WatchSource:0}: Error finding container d700efc28fdc7002265f74ced253e131bd72261af17bf8ffbe8a5240a983e0d2: Status 404 returned error can't find the container with id d700efc28fdc7002265f74ced253e131bd72261af17bf8ffbe8a5240a983e0d2
Dec 05 20:48:01 crc kubenswrapper[4747]: I1205 20:48:01.719672 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" event={"ID":"4c810228-9627-49a8-87e6-516175280d1b","Type":"ContainerStarted","Data":"b900d0220ce8ded6c41d8ca06eaa227ce20f2dd300f725dfc5ec9c11a84e60a5"}
Dec 05 20:48:01 crc kubenswrapper[4747]: I1205 20:48:01.720117 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr"
Dec 05 20:48:01 crc kubenswrapper[4747]: I1205 20:48:01.720178 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" event={"ID":"4c810228-9627-49a8-87e6-516175280d1b","Type":"ContainerStarted","Data":"d700efc28fdc7002265f74ced253e131bd72261af17bf8ffbe8a5240a983e0d2"}
Dec 05 20:48:01 crc kubenswrapper[4747]: I1205 20:48:01.722777 4747 generic.go:334] "Generic (PLEG): container finished" podID="5799e2a0-e208-4e0b-b757-d65fe1f2f859" containerID="18a9d877af44ee7c68e0c5d72487a0c6250dca76f04fcf7b54a45f6faf31ed0c" exitCode=0
Dec 05 20:48:01 crc kubenswrapper[4747]: I1205 20:48:01.722915 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d74r5" event={"ID":"5799e2a0-e208-4e0b-b757-d65fe1f2f859","Type":"ContainerDied","Data":"18a9d877af44ee7c68e0c5d72487a0c6250dca76f04fcf7b54a45f6faf31ed0c"}
Dec 05 20:48:01 crc kubenswrapper[4747]: I1205 20:48:01.722957 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d74r5" event={"ID":"5799e2a0-e208-4e0b-b757-d65fe1f2f859","Type":"ContainerStarted","Data":"9c96982edc1985e4bceff4a216ff7325a23505d53c4f7356be49b66c26e26c0c"}
Dec 05 20:48:01 crc kubenswrapper[4747]: I1205 20:48:01.726434 4747 generic.go:334] "Generic (PLEG): container finished" podID="96a86291-ce6c-4ab0-afc5-a87fbbc0fa78" containerID="0234d6b57e4c2540863970fa7ed027b753b52d5f1eefdaf0f44d2f46879caeee" exitCode=0
Dec 05 20:48:01 crc kubenswrapper[4747]: I1205 20:48:01.726497 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnfnj" event={"ID":"96a86291-ce6c-4ab0-afc5-a87fbbc0fa78","Type":"ContainerDied","Data":"0234d6b57e4c2540863970fa7ed027b753b52d5f1eefdaf0f44d2f46879caeee"}
Dec 05 20:48:01 crc kubenswrapper[4747]: I1205 20:48:01.753766 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" podStartSLOduration=1.753738494 podStartE2EDuration="1.753738494s" podCreationTimestamp="2025-12-05 20:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:48:01.751272631 +0000 UTC m=+352.218580119" watchObservedRunningTime="2025-12-05 20:48:01.753738494 +0000 UTC m=+352.221046012"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.386663 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mxk7k"]
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.391957 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mxk7k"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.394530 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mxk7k"]
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.396506 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.552117 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdqfm\" (UniqueName: \"kubernetes.io/projected/b295f21e-14da-4faa-97a2-6fa2d1f9702a-kube-api-access-qdqfm\") pod \"redhat-operators-mxk7k\" (UID: \"b295f21e-14da-4faa-97a2-6fa2d1f9702a\") " pod="openshift-marketplace/redhat-operators-mxk7k"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.552398 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b295f21e-14da-4faa-97a2-6fa2d1f9702a-catalog-content\") pod \"redhat-operators-mxk7k\" (UID: \"b295f21e-14da-4faa-97a2-6fa2d1f9702a\") " pod="openshift-marketplace/redhat-operators-mxk7k"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.552477 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b295f21e-14da-4faa-97a2-6fa2d1f9702a-utilities\") pod \"redhat-operators-mxk7k\" (UID: \"b295f21e-14da-4faa-97a2-6fa2d1f9702a\") " pod="openshift-marketplace/redhat-operators-mxk7k"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.587752 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-djp6w"]
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.588833 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-djp6w"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.591313 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.597486 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-djp6w"]
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.653330 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b295f21e-14da-4faa-97a2-6fa2d1f9702a-catalog-content\") pod \"redhat-operators-mxk7k\" (UID: \"b295f21e-14da-4faa-97a2-6fa2d1f9702a\") " pod="openshift-marketplace/redhat-operators-mxk7k"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.653401 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b295f21e-14da-4faa-97a2-6fa2d1f9702a-utilities\") pod \"redhat-operators-mxk7k\" (UID: \"b295f21e-14da-4faa-97a2-6fa2d1f9702a\") " pod="openshift-marketplace/redhat-operators-mxk7k"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.653441 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdqfm\" (UniqueName: \"kubernetes.io/projected/b295f21e-14da-4faa-97a2-6fa2d1f9702a-kube-api-access-qdqfm\") pod \"redhat-operators-mxk7k\" (UID: \"b295f21e-14da-4faa-97a2-6fa2d1f9702a\") " pod="openshift-marketplace/redhat-operators-mxk7k"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.653462 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10893b71-c0c0-4955-8d6f-eecbd1e69d68-catalog-content\") pod \"certified-operators-djp6w\" (UID: \"10893b71-c0c0-4955-8d6f-eecbd1e69d68\") " pod="openshift-marketplace/certified-operators-djp6w"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.653502 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10893b71-c0c0-4955-8d6f-eecbd1e69d68-utilities\") pod \"certified-operators-djp6w\" (UID: \"10893b71-c0c0-4955-8d6f-eecbd1e69d68\") " pod="openshift-marketplace/certified-operators-djp6w"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.653530 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwshw\" (UniqueName: \"kubernetes.io/projected/10893b71-c0c0-4955-8d6f-eecbd1e69d68-kube-api-access-hwshw\") pod \"certified-operators-djp6w\" (UID: \"10893b71-c0c0-4955-8d6f-eecbd1e69d68\") " pod="openshift-marketplace/certified-operators-djp6w"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.653956 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b295f21e-14da-4faa-97a2-6fa2d1f9702a-catalog-content\") pod \"redhat-operators-mxk7k\" (UID: \"b295f21e-14da-4faa-97a2-6fa2d1f9702a\") " pod="openshift-marketplace/redhat-operators-mxk7k"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.653978 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b295f21e-14da-4faa-97a2-6fa2d1f9702a-utilities\") pod \"redhat-operators-mxk7k\" (UID: \"b295f21e-14da-4faa-97a2-6fa2d1f9702a\") " pod="openshift-marketplace/redhat-operators-mxk7k"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.670960 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdqfm\" (UniqueName: \"kubernetes.io/projected/b295f21e-14da-4faa-97a2-6fa2d1f9702a-kube-api-access-qdqfm\") pod \"redhat-operators-mxk7k\" (UID: \"b295f21e-14da-4faa-97a2-6fa2d1f9702a\") " pod="openshift-marketplace/redhat-operators-mxk7k"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.737516 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mxk7k"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.740071 4747 generic.go:334] "Generic (PLEG): container finished" podID="5799e2a0-e208-4e0b-b757-d65fe1f2f859" containerID="6fcd146539cec104c084ae6d9df0c95e0ad0a897ff156b0cd9e34a3f890ae02b" exitCode=0
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.740191 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d74r5" event={"ID":"5799e2a0-e208-4e0b-b757-d65fe1f2f859","Type":"ContainerDied","Data":"6fcd146539cec104c084ae6d9df0c95e0ad0a897ff156b0cd9e34a3f890ae02b"}
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.743879 4747 generic.go:334] "Generic (PLEG): container finished" podID="96a86291-ce6c-4ab0-afc5-a87fbbc0fa78" containerID="02049b26bdd47094d1ebb082ce71cc3fb8625551a10e0348cc5762f1f9b476bf" exitCode=0
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.743931 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnfnj" event={"ID":"96a86291-ce6c-4ab0-afc5-a87fbbc0fa78","Type":"ContainerDied","Data":"02049b26bdd47094d1ebb082ce71cc3fb8625551a10e0348cc5762f1f9b476bf"}
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.756051 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10893b71-c0c0-4955-8d6f-eecbd1e69d68-catalog-content\") pod \"certified-operators-djp6w\" (UID: \"10893b71-c0c0-4955-8d6f-eecbd1e69d68\") " pod="openshift-marketplace/certified-operators-djp6w"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.756204 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10893b71-c0c0-4955-8d6f-eecbd1e69d68-utilities\") pod \"certified-operators-djp6w\" (UID: \"10893b71-c0c0-4955-8d6f-eecbd1e69d68\") " pod="openshift-marketplace/certified-operators-djp6w"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.756286 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwshw\" (UniqueName: \"kubernetes.io/projected/10893b71-c0c0-4955-8d6f-eecbd1e69d68-kube-api-access-hwshw\") pod \"certified-operators-djp6w\" (UID: \"10893b71-c0c0-4955-8d6f-eecbd1e69d68\") " pod="openshift-marketplace/certified-operators-djp6w"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.757548 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10893b71-c0c0-4955-8d6f-eecbd1e69d68-catalog-content\") pod \"certified-operators-djp6w\" (UID: \"10893b71-c0c0-4955-8d6f-eecbd1e69d68\") " pod="openshift-marketplace/certified-operators-djp6w"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.758047 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10893b71-c0c0-4955-8d6f-eecbd1e69d68-utilities\") pod \"certified-operators-djp6w\" (UID: \"10893b71-c0c0-4955-8d6f-eecbd1e69d68\") " pod="openshift-marketplace/certified-operators-djp6w"
Dec 05 20:48:02 crc kubenswrapper[4747]: I1205 20:48:02.786450 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwshw\" (UniqueName: \"kubernetes.io/projected/10893b71-c0c0-4955-8d6f-eecbd1e69d68-kube-api-access-hwshw\") pod \"certified-operators-djp6w\" (UID: \"10893b71-c0c0-4955-8d6f-eecbd1e69d68\") " pod="openshift-marketplace/certified-operators-djp6w"
Dec 05 20:48:03 crc kubenswrapper[4747]: I1205 20:48:03.046559 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-djp6w"
Dec 05 20:48:03 crc kubenswrapper[4747]: I1205 20:48:03.190168 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mxk7k"]
Dec 05 20:48:03 crc kubenswrapper[4747]: I1205 20:48:03.545208 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-djp6w"]
Dec 05 20:48:03 crc kubenswrapper[4747]: I1205 20:48:03.751459 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d74r5" event={"ID":"5799e2a0-e208-4e0b-b757-d65fe1f2f859","Type":"ContainerStarted","Data":"ec2b88183d71d147d24891195149ced631e30a902acf976fee8497316a65bee4"}
Dec 05 20:48:03 crc kubenswrapper[4747]: I1205 20:48:03.754105 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnfnj" event={"ID":"96a86291-ce6c-4ab0-afc5-a87fbbc0fa78","Type":"ContainerStarted","Data":"1ce5d9b58ed9a9588c94e8a4d8d5c7ab4372bd0f827af0ff9c99dada5f5c7900"}
Dec 05 20:48:03 crc kubenswrapper[4747]: I1205 20:48:03.756154 4747 generic.go:334] "Generic (PLEG): container finished" podID="10893b71-c0c0-4955-8d6f-eecbd1e69d68" containerID="f40478d28cd90a9d411a621b26d829d8c1d2ede4e24a0a8d6cd16f220ab17aa9" exitCode=0
Dec 05 20:48:03 crc kubenswrapper[4747]: I1205 20:48:03.756195 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djp6w" event={"ID":"10893b71-c0c0-4955-8d6f-eecbd1e69d68","Type":"ContainerDied","Data":"f40478d28cd90a9d411a621b26d829d8c1d2ede4e24a0a8d6cd16f220ab17aa9"}
Dec 05 20:48:03 crc kubenswrapper[4747]: I1205 20:48:03.756209 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djp6w" event={"ID":"10893b71-c0c0-4955-8d6f-eecbd1e69d68","Type":"ContainerStarted","Data":"c1e770906e8cf15f84cff8603638bc0442670c3af6676599977024709cc5716c"}
Dec 05 20:48:03 crc kubenswrapper[4747]: I1205 20:48:03.758831 4747 generic.go:334] "Generic (PLEG): container finished" podID="b295f21e-14da-4faa-97a2-6fa2d1f9702a" containerID="57f3009989cc2d324d1a704b38e25f41628c34cc495374cb3d12039fa4ef8670" exitCode=0
Dec 05 20:48:03 crc kubenswrapper[4747]: I1205 20:48:03.758856 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxk7k" event={"ID":"b295f21e-14da-4faa-97a2-6fa2d1f9702a","Type":"ContainerDied","Data":"57f3009989cc2d324d1a704b38e25f41628c34cc495374cb3d12039fa4ef8670"}
Dec 05 20:48:03 crc kubenswrapper[4747]: I1205 20:48:03.758872 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxk7k"
event={"ID":"b295f21e-14da-4faa-97a2-6fa2d1f9702a","Type":"ContainerStarted","Data":"3f666cf2f5854a85df7e402dd85b2ac0cbf1bb88aa7ddcfe3cfc24857dedffbb"} Dec 05 20:48:03 crc kubenswrapper[4747]: I1205 20:48:03.778279 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d74r5" podStartSLOduration=3.333553838 podStartE2EDuration="4.778256882s" podCreationTimestamp="2025-12-05 20:47:59 +0000 UTC" firstStartedPulling="2025-12-05 20:48:01.726416846 +0000 UTC m=+352.193724364" lastFinishedPulling="2025-12-05 20:48:03.17111988 +0000 UTC m=+353.638427408" observedRunningTime="2025-12-05 20:48:03.771983211 +0000 UTC m=+354.239290719" watchObservedRunningTime="2025-12-05 20:48:03.778256882 +0000 UTC m=+354.245564400" Dec 05 20:48:04 crc kubenswrapper[4747]: I1205 20:48:04.767521 4747 generic.go:334] "Generic (PLEG): container finished" podID="10893b71-c0c0-4955-8d6f-eecbd1e69d68" containerID="968084436ca097e784c13edba3381bf24e79923697a8baf73485012b8f200f99" exitCode=0 Dec 05 20:48:04 crc kubenswrapper[4747]: I1205 20:48:04.767918 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djp6w" event={"ID":"10893b71-c0c0-4955-8d6f-eecbd1e69d68","Type":"ContainerDied","Data":"968084436ca097e784c13edba3381bf24e79923697a8baf73485012b8f200f99"} Dec 05 20:48:04 crc kubenswrapper[4747]: I1205 20:48:04.770543 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxk7k" event={"ID":"b295f21e-14da-4faa-97a2-6fa2d1f9702a","Type":"ContainerStarted","Data":"26d7d1795955489abf28d51259ac024f7b691b0eb6e611d86334219d44a470e2"} Dec 05 20:48:04 crc kubenswrapper[4747]: I1205 20:48:04.789696 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cnfnj" podStartSLOduration=4.302906059 podStartE2EDuration="5.789678049s" podCreationTimestamp="2025-12-05 20:47:59 +0000 UTC" firstStartedPulling="2025-12-05 20:48:01.730773317 +0000 UTC m=+352.198080825" lastFinishedPulling="2025-12-05 20:48:03.217545327 +0000 UTC m=+353.684852815" observedRunningTime="2025-12-05 20:48:03.840063242 +0000 UTC m=+354.307370730" watchObservedRunningTime="2025-12-05 20:48:04.789678049 +0000 UTC m=+355.256985537" Dec 05 20:48:05 crc kubenswrapper[4747]: I1205 20:48:05.780326 4747 generic.go:334] "Generic (PLEG): container finished" podID="b295f21e-14da-4faa-97a2-6fa2d1f9702a" containerID="26d7d1795955489abf28d51259ac024f7b691b0eb6e611d86334219d44a470e2" exitCode=0 Dec 05 20:48:05 crc kubenswrapper[4747]: I1205 20:48:05.780439 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxk7k" event={"ID":"b295f21e-14da-4faa-97a2-6fa2d1f9702a","Type":"ContainerDied","Data":"26d7d1795955489abf28d51259ac024f7b691b0eb6e611d86334219d44a470e2"} Dec 05 20:48:05 crc kubenswrapper[4747]: I1205 20:48:05.784724 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djp6w" event={"ID":"10893b71-c0c0-4955-8d6f-eecbd1e69d68","Type":"ContainerStarted","Data":"ab4c352be695a3ee61b55af9e80c1bacf8b0a28099bc875ace8f2e80304b5090"} Dec 05 20:48:05 crc kubenswrapper[4747]: I1205 20:48:05.829126 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-djp6w" podStartSLOduration=2.434031616 podStartE2EDuration="3.829105771s" podCreationTimestamp="2025-12-05 20:48:02 +0000 UTC" firstStartedPulling="2025-12-05 
20:48:03.757264925 +0000 UTC m=+354.224572413" lastFinishedPulling="2025-12-05 20:48:05.15233904 +0000 UTC m=+355.619646568" observedRunningTime="2025-12-05 20:48:05.826185417 +0000 UTC m=+356.293492925" watchObservedRunningTime="2025-12-05 20:48:05.829105771 +0000 UTC m=+356.296413270" Dec 05 20:48:06 crc kubenswrapper[4747]: I1205 20:48:06.221837 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:48:06 crc kubenswrapper[4747]: I1205 20:48:06.221939 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:48:06 crc kubenswrapper[4747]: I1205 20:48:06.792464 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxk7k" event={"ID":"b295f21e-14da-4faa-97a2-6fa2d1f9702a","Type":"ContainerStarted","Data":"6d59885b0c40e4512cb829d54f2ee16df078085350df458501880749514d3780"} Dec 05 20:48:06 crc kubenswrapper[4747]: I1205 20:48:06.810300 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mxk7k" podStartSLOduration=2.116575512 podStartE2EDuration="4.810281906s" podCreationTimestamp="2025-12-05 20:48:02 +0000 UTC" firstStartedPulling="2025-12-05 20:48:03.759913333 +0000 UTC m=+354.227220821" lastFinishedPulling="2025-12-05 20:48:06.453619707 +0000 UTC m=+356.920927215" observedRunningTime="2025-12-05 20:48:06.809024973 +0000 UTC m=+357.276332461" watchObservedRunningTime="2025-12-05 20:48:06.810281906 +0000 UTC m=+357.277589404" Dec 05 20:48:10 crc kubenswrapper[4747]: I1205 20:48:10.104782 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cnfnj" Dec 05 20:48:10 crc kubenswrapper[4747]: I1205 20:48:10.105768 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cnfnj" Dec 05 20:48:10 crc kubenswrapper[4747]: I1205 20:48:10.143696 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cnfnj" Dec 05 20:48:10 crc kubenswrapper[4747]: I1205 20:48:10.297064 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d74r5" Dec 05 20:48:10 crc kubenswrapper[4747]: I1205 20:48:10.297124 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d74r5" Dec 05 20:48:10 crc kubenswrapper[4747]: I1205 20:48:10.342549 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d74r5" Dec 05 20:48:10 crc kubenswrapper[4747]: I1205 20:48:10.870510 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cnfnj" Dec 05 20:48:10 crc kubenswrapper[4747]: I1205 20:48:10.888000 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d74r5" Dec 05 20:48:12 crc kubenswrapper[4747]: I1205 
20:48:12.738439 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mxk7k" Dec 05 20:48:12 crc kubenswrapper[4747]: I1205 20:48:12.738919 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mxk7k" Dec 05 20:48:12 crc kubenswrapper[4747]: I1205 20:48:12.795362 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mxk7k" Dec 05 20:48:12 crc kubenswrapper[4747]: I1205 20:48:12.877018 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mxk7k" Dec 05 20:48:13 crc kubenswrapper[4747]: I1205 20:48:13.047174 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-djp6w" Dec 05 20:48:13 crc kubenswrapper[4747]: I1205 20:48:13.047268 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-djp6w" Dec 05 20:48:13 crc kubenswrapper[4747]: I1205 20:48:13.093860 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-djp6w" Dec 05 20:48:13 crc kubenswrapper[4747]: I1205 20:48:13.905675 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-djp6w" Dec 05 20:48:20 crc kubenswrapper[4747]: I1205 20:48:20.808325 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-f9lnr" Dec 05 20:48:20 crc kubenswrapper[4747]: I1205 20:48:20.881072 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xxxxf"] Dec 05 20:48:36 crc kubenswrapper[4747]: I1205 20:48:36.222890 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:48:36 crc kubenswrapper[4747]: I1205 20:48:36.223752 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:48:45 crc kubenswrapper[4747]: I1205 20:48:45.921350 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" podUID="da814908-e69b-476a-a5e8-7f128bb627b2" containerName="registry" containerID="cri-o://59b51efd897a34121c8b300b761364472982eff6e19f3fc7781c79b9c725953b" gracePeriod=30 Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.049313 4747 generic.go:334] "Generic (PLEG): container finished" podID="da814908-e69b-476a-a5e8-7f128bb627b2" containerID="59b51efd897a34121c8b300b761364472982eff6e19f3fc7781c79b9c725953b" exitCode=0 Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.049374 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" event={"ID":"da814908-e69b-476a-a5e8-7f128bb627b2","Type":"ContainerDied","Data":"59b51efd897a34121c8b300b761364472982eff6e19f3fc7781c79b9c725953b"} 
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.324983 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.431809 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da814908-e69b-476a-a5e8-7f128bb627b2-installation-pull-secrets\") pod \"da814908-e69b-476a-a5e8-7f128bb627b2\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") "
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.431858 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da814908-e69b-476a-a5e8-7f128bb627b2-ca-trust-extracted\") pod \"da814908-e69b-476a-a5e8-7f128bb627b2\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") "
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.431881 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-registry-tls\") pod \"da814908-e69b-476a-a5e8-7f128bb627b2\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") "
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.431921 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-bound-sa-token\") pod \"da814908-e69b-476a-a5e8-7f128bb627b2\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") "
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.431962 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tbt7\" (UniqueName: \"kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-kube-api-access-8tbt7\") pod \"da814908-e69b-476a-a5e8-7f128bb627b2\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") "
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.432012 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da814908-e69b-476a-a5e8-7f128bb627b2-registry-certificates\") pod \"da814908-e69b-476a-a5e8-7f128bb627b2\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") "
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.432212 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"da814908-e69b-476a-a5e8-7f128bb627b2\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") "
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.432241 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da814908-e69b-476a-a5e8-7f128bb627b2-trusted-ca\") pod \"da814908-e69b-476a-a5e8-7f128bb627b2\" (UID: \"da814908-e69b-476a-a5e8-7f128bb627b2\") "
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.432993 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da814908-e69b-476a-a5e8-7f128bb627b2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "da814908-e69b-476a-a5e8-7f128bb627b2" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.434349 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da814908-e69b-476a-a5e8-7f128bb627b2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "da814908-e69b-476a-a5e8-7f128bb627b2" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.438810 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da814908-e69b-476a-a5e8-7f128bb627b2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "da814908-e69b-476a-a5e8-7f128bb627b2" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.442843 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-kube-api-access-8tbt7" (OuterVolumeSpecName: "kube-api-access-8tbt7") pod "da814908-e69b-476a-a5e8-7f128bb627b2" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2"). InnerVolumeSpecName "kube-api-access-8tbt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.442881 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "da814908-e69b-476a-a5e8-7f128bb627b2" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.447646 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "da814908-e69b-476a-a5e8-7f128bb627b2" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.454481 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "da814908-e69b-476a-a5e8-7f128bb627b2" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.457261 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da814908-e69b-476a-a5e8-7f128bb627b2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "da814908-e69b-476a-a5e8-7f128bb627b2" (UID: "da814908-e69b-476a-a5e8-7f128bb627b2"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.533459 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tbt7\" (UniqueName: \"kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-kube-api-access-8tbt7\") on node \"crc\" DevicePath \"\""
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.533493 4747 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da814908-e69b-476a-a5e8-7f128bb627b2-registry-certificates\") on node \"crc\" DevicePath \"\""
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.533514 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da814908-e69b-476a-a5e8-7f128bb627b2-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.533524 4747 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da814908-e69b-476a-a5e8-7f128bb627b2-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.533533 4747 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da814908-e69b-476a-a5e8-7f128bb627b2-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.533544 4747 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-registry-tls\") on node \"crc\" DevicePath \"\""
Dec 05 20:48:46 crc kubenswrapper[4747]: I1205 20:48:46.533552 4747 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da814908-e69b-476a-a5e8-7f128bb627b2-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 05 20:48:47 crc kubenswrapper[4747]: I1205 20:48:47.059063 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf" event={"ID":"da814908-e69b-476a-a5e8-7f128bb627b2","Type":"ContainerDied","Data":"17d4c0557801b85f1d501daf24e7735a83ece84e7e8ee2134b62fe0bb7cf02da"}
Dec 05 20:48:47 crc kubenswrapper[4747]: I1205 20:48:47.059142 4747 scope.go:117] "RemoveContainer" containerID="59b51efd897a34121c8b300b761364472982eff6e19f3fc7781c79b9c725953b"
Dec 05 20:48:47 crc kubenswrapper[4747]: I1205 20:48:47.059328 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xxxxf"
Dec 05 20:48:47 crc kubenswrapper[4747]: I1205 20:48:47.102111 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xxxxf"]
Dec 05 20:48:47 crc kubenswrapper[4747]: I1205 20:48:47.109150 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xxxxf"]
Dec 05 20:48:47 crc kubenswrapper[4747]: I1205 20:48:47.847397 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da814908-e69b-476a-a5e8-7f128bb627b2" path="/var/lib/kubelet/pods/da814908-e69b-476a-a5e8-7f128bb627b2/volumes"
Dec 05 20:49:06 crc kubenswrapper[4747]: I1205 20:49:06.221835 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:49:06 crc kubenswrapper[4747]: I1205 20:49:06.222438 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:49:06 crc kubenswrapper[4747]: I1205 20:49:06.222502 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw"
Dec 05 20:49:06 crc kubenswrapper[4747]: I1205 20:49:06.223237 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12e50c23a537be5927e47622f92414124530418f175ea6c0995459f9b65026ea"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 20:49:06 crc kubenswrapper[4747]: I1205 20:49:06.223358 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://12e50c23a537be5927e47622f92414124530418f175ea6c0995459f9b65026ea" gracePeriod=600
Dec 05 20:49:07 crc kubenswrapper[4747]: I1205 20:49:07.178510 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="12e50c23a537be5927e47622f92414124530418f175ea6c0995459f9b65026ea" exitCode=0
Dec 05 20:49:07 crc kubenswrapper[4747]: I1205 20:49:07.178642 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"12e50c23a537be5927e47622f92414124530418f175ea6c0995459f9b65026ea"}
Dec 05 20:49:07 crc kubenswrapper[4747]: I1205 20:49:07.179104 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"370b9a3316ca0b85e9ac26caebee714f137ba55465083c839b97ac8f29dd1f9b"}
Dec 05 20:49:07 crc kubenswrapper[4747]: I1205 20:49:07.179148 4747 scope.go:117] "RemoveContainer" containerID="b17ad3ad601d2232033f200d35c4af1697122e6b85ffee0fbafad902c702cfcd"
Dec 05 20:51:06 crc kubenswrapper[4747]: I1205 20:51:06.222288 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:51:06 crc kubenswrapper[4747]: I1205 20:51:06.223194 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:51:36 crc kubenswrapper[4747]: I1205 20:51:36.222356 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:51:36 crc kubenswrapper[4747]: I1205 20:51:36.223212 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:52:06 crc kubenswrapper[4747]: I1205 20:52:06.222071 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:52:06 crc kubenswrapper[4747]: I1205 20:52:06.222761 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:52:06 crc kubenswrapper[4747]: I1205 20:52:06.222898 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw"
Dec 05 20:52:06 crc kubenswrapper[4747]: I1205 20:52:06.223846 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"370b9a3316ca0b85e9ac26caebee714f137ba55465083c839b97ac8f29dd1f9b"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 20:52:06 crc kubenswrapper[4747]: I1205 20:52:06.223968 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://370b9a3316ca0b85e9ac26caebee714f137ba55465083c839b97ac8f29dd1f9b" gracePeriod=600
Dec 05 20:52:07 crc kubenswrapper[4747]: I1205 20:52:07.334133 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="370b9a3316ca0b85e9ac26caebee714f137ba55465083c839b97ac8f29dd1f9b" exitCode=0
Dec 05 20:52:07 crc kubenswrapper[4747]: I1205 20:52:07.334217 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"370b9a3316ca0b85e9ac26caebee714f137ba55465083c839b97ac8f29dd1f9b"}
Dec 05 20:52:07 crc kubenswrapper[4747]: I1205 20:52:07.335071 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"ceb16b96a12723254ee4deb822f6ff1371e5e51859e0f470b05fcb9c79ec859d"}
Dec 05 20:52:07 crc kubenswrapper[4747]: I1205 20:52:07.335108 4747 scope.go:117] "RemoveContainer" containerID="12e50c23a537be5927e47622f92414124530418f175ea6c0995459f9b65026ea"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.595135 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kf4wd"]
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.596155 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovn-controller" containerID="cri-o://83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb" gracePeriod=30
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.596218 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="sbdb" containerID="cri-o://9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027" gracePeriod=30
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.596301 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="nbdb" containerID="cri-o://485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317" gracePeriod=30
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.596279 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="kube-rbac-proxy-node" containerID="cri-o://513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf" gracePeriod=30
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.596354 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="northd" containerID="cri-o://e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198" gracePeriod=30
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.596233 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece" gracePeriod=30
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.596284 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovn-acl-logging" containerID="cri-o://effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0" gracePeriod=30
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.640504 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller" containerID="cri-o://200fa1a5e1618c9bcb5bd2cc0fd00f7b6fe607d20dc0c625cd3747514d0c0dab" gracePeriod=30
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.833355 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nm4fd_53f1e522-a732-4821-b7b0-6f1b6670c1d4/kube-multus/2.log"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.834319 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nm4fd_53f1e522-a732-4821-b7b0-6f1b6670c1d4/kube-multus/1.log"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.834364 4747 generic.go:334] "Generic (PLEG): container finished" podID="53f1e522-a732-4821-b7b0-6f1b6670c1d4" containerID="68d5e1b3e9e29f79534bdc6c95a06fd1691c3e8318cb8a47d6898f2b2d4a9355" exitCode=2
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.834429 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nm4fd" event={"ID":"53f1e522-a732-4821-b7b0-6f1b6670c1d4","Type":"ContainerDied","Data":"68d5e1b3e9e29f79534bdc6c95a06fd1691c3e8318cb8a47d6898f2b2d4a9355"}
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.834473 4747 scope.go:117] "RemoveContainer" containerID="a87e5fe95941f4d6b97fdfa08ba6fcbfb2caca21d727ec4777f0e288c9797cd5"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.835019 4747 scope.go:117] "RemoveContainer" containerID="68d5e1b3e9e29f79534bdc6c95a06fd1691c3e8318cb8a47d6898f2b2d4a9355"
Dec 05 20:53:22 crc kubenswrapper[4747]: E1205 20:53:22.835246 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-nm4fd_openshift-multus(53f1e522-a732-4821-b7b0-6f1b6670c1d4)\"" pod="openshift-multus/multus-nm4fd" podUID="53f1e522-a732-4821-b7b0-6f1b6670c1d4"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.839319 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovnkube-controller/3.log"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.841996 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovn-acl-logging/0.log"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.842455 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovn-controller/0.log"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.843718 4747 generic.go:334] "Generic (PLEG): container finished" podID="4881e707-c00c-4e49-8e30-a17719e80915" containerID="200fa1a5e1618c9bcb5bd2cc0fd00f7b6fe607d20dc0c625cd3747514d0c0dab" exitCode=0
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.843754 4747 generic.go:334] "Generic (PLEG): container finished" podID="4881e707-c00c-4e49-8e30-a17719e80915" containerID="9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027" exitCode=0
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.843763 4747 generic.go:334] "Generic (PLEG): container finished" podID="4881e707-c00c-4e49-8e30-a17719e80915" containerID="485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317" exitCode=0
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.843773 4747 generic.go:334] "Generic (PLEG): container finished" podID="4881e707-c00c-4e49-8e30-a17719e80915" containerID="9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece" exitCode=0
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.843782 4747 generic.go:334] "Generic (PLEG): container finished" podID="4881e707-c00c-4e49-8e30-a17719e80915" containerID="513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf" exitCode=0
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.843790 4747 generic.go:334] "Generic (PLEG): container finished" podID="4881e707-c00c-4e49-8e30-a17719e80915" containerID="effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0" exitCode=143
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.843799 4747 generic.go:334] "Generic (PLEG): container finished" podID="4881e707-c00c-4e49-8e30-a17719e80915" containerID="83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb" exitCode=143
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.843832 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerDied","Data":"200fa1a5e1618c9bcb5bd2cc0fd00f7b6fe607d20dc0c625cd3747514d0c0dab"}
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.843915 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerDied","Data":"9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027"}
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.843937 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerDied","Data":"485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317"}
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.843989 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerDied","Data":"9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece"}
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.844006 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerDied","Data":"513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf"}
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.844029 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerDied","Data":"effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0"}
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.844048 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerDied","Data":"83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb"}
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.895466 4747 scope.go:117] "RemoveContainer" containerID="dd7d05384fe0bfefedb945df13f483fc5367a892cfff74488f7eb9ffc7116a7b"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.932908 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovn-acl-logging/0.log"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.933355 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovn-controller/0.log"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.933849 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.992313 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wjvxp"]
Dec 05 20:53:22 crc kubenswrapper[4747]: E1205 20:53:22.992617 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.992632 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: E1205 20:53:22.992641 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="kubecfg-setup"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.992648 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="kubecfg-setup"
Dec 05 20:53:22 crc kubenswrapper[4747]: E1205 20:53:22.992655 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="northd"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.992662 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="northd"
Dec 05 20:53:22 crc kubenswrapper[4747]: E1205 20:53:22.992677 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="sbdb"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.992683 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="sbdb"
Dec 05 20:53:22 crc kubenswrapper[4747]: E1205 20:53:22.992711 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="kube-rbac-proxy-ovn-metrics"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.992718 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="kube-rbac-proxy-ovn-metrics"
Dec 05 20:53:22 crc kubenswrapper[4747]: E1205 20:53:22.992730 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.992737 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: E1205 20:53:22.992744 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da814908-e69b-476a-a5e8-7f128bb627b2" containerName="registry"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.992750 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="da814908-e69b-476a-a5e8-7f128bb627b2" containerName="registry"
Dec 05 20:53:22 crc kubenswrapper[4747]: E1205 20:53:22.992759 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovn-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.992765 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovn-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: E1205 20:53:22.992793 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.992800 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: E1205 20:53:22.992808 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovn-acl-logging"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.992815 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovn-acl-logging"
Dec 05 20:53:22 crc kubenswrapper[4747]: E1205 20:53:22.992824 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="kube-rbac-proxy-node"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.992832 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="kube-rbac-proxy-node"
Dec 05 20:53:22 crc kubenswrapper[4747]: E1205 20:53:22.992843 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="nbdb"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.992864 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="nbdb"
Dec 05 20:53:22 crc kubenswrapper[4747]: E1205 20:53:22.992870 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.992877 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.992998 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="sbdb"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.993029 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="da814908-e69b-476a-a5e8-7f128bb627b2" containerName="registry"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.993038 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="nbdb"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.993048 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.993056 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovn-acl-logging"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.993065 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="kube-rbac-proxy-node"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.993073 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.993080 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="kube-rbac-proxy-ovn-metrics"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.993103 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="northd"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.993111 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovn-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.993119 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: E1205 20:53:22.993230 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.993238 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.994026 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.994038 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4881e707-c00c-4e49-8e30-a17719e80915" containerName="ovnkube-controller"
Dec 05 20:53:22 crc kubenswrapper[4747]: I1205 20:53:22.996434 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp"
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.056969 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-cni-netd\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057022 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4881e707-c00c-4e49-8e30-a17719e80915-ovn-node-metrics-cert\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057058 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-run-netns\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057081 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-node-log\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057114 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-ovn\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057150 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-var-lib-cni-networks-ovn-kubernetes\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057170 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-openvswitch\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057199 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-ovnkube-config\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057220 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-systemd-units\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057244 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-var-lib-openvswitch\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057276 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-ovnkube-script-lib\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057120 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057321 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-kubelet\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057360 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-systemd\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057359 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-node-log" (OuterVolumeSpecName: "node-log") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057386 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057416 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057429 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057445 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-log-socket" (OuterVolumeSpecName: "log-socket") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057452 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057467 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057400 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-log-socket\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057679 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-env-overrides\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057709 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd6rn\" (UniqueName: \"kubernetes.io/projected/4881e707-c00c-4e49-8e30-a17719e80915-kube-api-access-vd6rn\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057862 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-slash\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") "
Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057477 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.058022 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-slash" (OuterVolumeSpecName: "host-slash") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057497 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057895 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057994 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.057883 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.058070 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-run-ovn-kubernetes\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.058098 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.058422 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-cni-bin\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.058463 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-etc-openvswitch\") pod \"4881e707-c00c-4e49-8e30-a17719e80915\" (UID: \"4881e707-c00c-4e49-8e30-a17719e80915\") " Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.058495 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.058629 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.058660 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-node-log\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.058697 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-etc-openvswitch\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.058715 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ed902baa-d8a4-4748-bc31-da18a15e44f1-ovnkube-script-lib\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.058734 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-run-openvswitch\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.058825 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-log-socket\") pod \"ovnkube-node-wjvxp\" 
(UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.058874 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ed902baa-d8a4-4748-bc31-da18a15e44f1-ovn-node-metrics-cert\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.058943 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ed902baa-d8a4-4748-bc31-da18a15e44f1-ovnkube-config\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.058992 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-var-lib-openvswitch\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059052 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-slash\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059086 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ed902baa-d8a4-4748-bc31-da18a15e44f1-env-overrides\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059113 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-systemd-units\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059147 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059175 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-run-netns\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059195 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-cni-netd\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059226 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-kubelet\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059251 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-run-systemd\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059275 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-run-ovn-kubernetes\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059312 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-cni-bin\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059339 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-run-ovn\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059360 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trsrq\" (UniqueName: \"kubernetes.io/projected/ed902baa-d8a4-4748-bc31-da18a15e44f1-kube-api-access-trsrq\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059424 4747 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059437 4747 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059450 4747 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059463 4747 reconciler_common.go:293] 
"Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059475 4747 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-node-log\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059489 4747 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059503 4747 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059515 4747 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059528 4747 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059538 4747 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059550 4747 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059562 4747 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.059574 4747 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.062722 4747 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-log-socket\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.062815 4747 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4881e707-c00c-4e49-8e30-a17719e80915-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.062856 4747 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-slash\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.062879 4747 reconciler_common.go:293] "Volume detached for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.064119 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4881e707-c00c-4e49-8e30-a17719e80915-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.066230 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4881e707-c00c-4e49-8e30-a17719e80915-kube-api-access-vd6rn" (OuterVolumeSpecName: "kube-api-access-vd6rn") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "kube-api-access-vd6rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.074811 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "4881e707-c00c-4e49-8e30-a17719e80915" (UID: "4881e707-c00c-4e49-8e30-a17719e80915"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177565 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-run-systemd\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177631 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-run-ovn-kubernetes\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177663 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-cni-bin\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177685 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-run-ovn\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177705 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trsrq\" (UniqueName: \"kubernetes.io/projected/ed902baa-d8a4-4748-bc31-da18a15e44f1-kube-api-access-trsrq\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177735 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-node-log\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177760 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-etc-openvswitch\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177780 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ed902baa-d8a4-4748-bc31-da18a15e44f1-ovnkube-script-lib\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177801 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-run-openvswitch\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177822 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-log-socket\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177847 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ed902baa-d8a4-4748-bc31-da18a15e44f1-ovn-node-metrics-cert\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177884 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ed902baa-d8a4-4748-bc31-da18a15e44f1-ovnkube-config\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177910 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-var-lib-openvswitch\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177940 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-slash\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177965 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ed902baa-d8a4-4748-bc31-da18a15e44f1-env-overrides\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.177988 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-systemd-units\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.178011 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.178033 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-run-netns\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.178051 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-cni-netd\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.178076 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-kubelet\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.178118 4747 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4881e707-c00c-4e49-8e30-a17719e80915-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.178132 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd6rn\" (UniqueName: \"kubernetes.io/projected/4881e707-c00c-4e49-8e30-a17719e80915-kube-api-access-vd6rn\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.178144 4747 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4881e707-c00c-4e49-8e30-a17719e80915-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.178186 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-kubelet\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.178228 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-run-systemd\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.178255 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-run-ovn-kubernetes\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.178281 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-cni-bin\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.178308 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-run-ovn\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.178687 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-node-log\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.178730 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-etc-openvswitch\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.179796 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ed902baa-d8a4-4748-bc31-da18a15e44f1-ovnkube-script-lib\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.179849 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-run-openvswitch\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.179883 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-log-socket\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.181000 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ed902baa-d8a4-4748-bc31-da18a15e44f1-env-overrides\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.181449 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ed902baa-d8a4-4748-bc31-da18a15e44f1-ovnkube-config\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.181809 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-var-lib-openvswitch\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.181844 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-slash\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.181875 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.181907 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-systemd-units\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.181933 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-run-netns\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.181954 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ed902baa-d8a4-4748-bc31-da18a15e44f1-host-cni-netd\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.189304 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ed902baa-d8a4-4748-bc31-da18a15e44f1-ovn-node-metrics-cert\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.207354 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trsrq\" (UniqueName: \"kubernetes.io/projected/ed902baa-d8a4-4748-bc31-da18a15e44f1-kube-api-access-trsrq\") pod \"ovnkube-node-wjvxp\" (UID: \"ed902baa-d8a4-4748-bc31-da18a15e44f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: 
I1205 20:53:23.309244 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.856280 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovn-acl-logging/0.log" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.856870 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf4wd_4881e707-c00c-4e49-8e30-a17719e80915/ovn-controller/0.log" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.857498 4747 generic.go:334] "Generic (PLEG): container finished" podID="4881e707-c00c-4e49-8e30-a17719e80915" containerID="e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198" exitCode=0 Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.858084 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.860394 4747 generic.go:334] "Generic (PLEG): container finished" podID="ed902baa-d8a4-4748-bc31-da18a15e44f1" containerID="cb365caaacb09bb066b2f5ea9d35979d96031944fe0ad4ed2d513b7bd037651d" exitCode=0 Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.860618 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerDied","Data":"e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198"} Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.860778 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf4wd" event={"ID":"4881e707-c00c-4e49-8e30-a17719e80915","Type":"ContainerDied","Data":"7322703cc3c0fca5940d7da14e76bfd403fef39c1a8163aa44cfb4499972ec41"} Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.862035 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" event={"ID":"ed902baa-d8a4-4748-bc31-da18a15e44f1","Type":"ContainerDied","Data":"cb365caaacb09bb066b2f5ea9d35979d96031944fe0ad4ed2d513b7bd037651d"} Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.862215 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" event={"ID":"ed902baa-d8a4-4748-bc31-da18a15e44f1","Type":"ContainerStarted","Data":"1339607fd896f4b42d46b1db7a521dec84c11b4bf91f9707d1964db3bec4ecd8"} Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.860826 4747 scope.go:117] "RemoveContainer" containerID="200fa1a5e1618c9bcb5bd2cc0fd00f7b6fe607d20dc0c625cd3747514d0c0dab" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.865273 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nm4fd_53f1e522-a732-4821-b7b0-6f1b6670c1d4/kube-multus/2.log" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.891038 4747 scope.go:117] "RemoveContainer" containerID="9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.911870 4747 scope.go:117] "RemoveContainer" containerID="485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.923716 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kf4wd"] Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 
20:53:23.927317 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kf4wd"] Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.945470 4747 scope.go:117] "RemoveContainer" containerID="e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198" Dec 05 20:53:23 crc kubenswrapper[4747]: I1205 20:53:23.983994 4747 scope.go:117] "RemoveContainer" containerID="9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.001643 4747 scope.go:117] "RemoveContainer" containerID="513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.013952 4747 scope.go:117] "RemoveContainer" containerID="effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.031191 4747 scope.go:117] "RemoveContainer" containerID="83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.049084 4747 scope.go:117] "RemoveContainer" containerID="496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.079274 4747 scope.go:117] "RemoveContainer" containerID="200fa1a5e1618c9bcb5bd2cc0fd00f7b6fe607d20dc0c625cd3747514d0c0dab" Dec 05 20:53:24 crc kubenswrapper[4747]: E1205 20:53:24.079661 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"200fa1a5e1618c9bcb5bd2cc0fd00f7b6fe607d20dc0c625cd3747514d0c0dab\": container with ID starting with 200fa1a5e1618c9bcb5bd2cc0fd00f7b6fe607d20dc0c625cd3747514d0c0dab not found: ID does not exist" containerID="200fa1a5e1618c9bcb5bd2cc0fd00f7b6fe607d20dc0c625cd3747514d0c0dab" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.079736 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"200fa1a5e1618c9bcb5bd2cc0fd00f7b6fe607d20dc0c625cd3747514d0c0dab"} err="failed to get container status \"200fa1a5e1618c9bcb5bd2cc0fd00f7b6fe607d20dc0c625cd3747514d0c0dab\": rpc error: code = NotFound desc = could not find container \"200fa1a5e1618c9bcb5bd2cc0fd00f7b6fe607d20dc0c625cd3747514d0c0dab\": container with ID starting with 200fa1a5e1618c9bcb5bd2cc0fd00f7b6fe607d20dc0c625cd3747514d0c0dab not found: ID does not exist" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.079784 4747 scope.go:117] "RemoveContainer" containerID="9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027" Dec 05 20:53:24 crc kubenswrapper[4747]: E1205 20:53:24.080127 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\": container with ID starting with 9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027 not found: ID does not exist" containerID="9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.080168 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027"} err="failed to get container status \"9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\": rpc error: code = NotFound desc = could not find container \"9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027\": container 
with ID starting with 9af8fb83803c2bcba617ea9ac3820163b335e3dc870262ef9efc46e5669c9027 not found: ID does not exist" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.080196 4747 scope.go:117] "RemoveContainer" containerID="485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317" Dec 05 20:53:24 crc kubenswrapper[4747]: E1205 20:53:24.080469 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\": container with ID starting with 485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317 not found: ID does not exist" containerID="485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.080498 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317"} err="failed to get container status \"485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\": rpc error: code = NotFound desc = could not find container \"485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317\": container with ID starting with 485a444010b5765dadcb4698b515eeaa653f37d60485a8ab71ead79d7a164317 not found: ID does not exist" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.080517 4747 scope.go:117] "RemoveContainer" containerID="e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198" Dec 05 20:53:24 crc kubenswrapper[4747]: E1205 20:53:24.080934 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\": container with ID starting with e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198 not found: ID does not exist" containerID="e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.080969 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198"} err="failed to get container status \"e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\": rpc error: code = NotFound desc = could not find container \"e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198\": container with ID starting with e6f8fd7c07e056d6ec2d05f4496230f3543c437855a15960ac19fe4f738b3198 not found: ID does not exist" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.080991 4747 scope.go:117] "RemoveContainer" containerID="9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece" Dec 05 20:53:24 crc kubenswrapper[4747]: E1205 20:53:24.081336 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\": container with ID starting with 9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece not found: ID does not exist" containerID="9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.081362 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece"} err="failed to get container status 
\"9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\": rpc error: code = NotFound desc = could not find container \"9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece\": container with ID starting with 9231c33b5891cf2aee0b1abed04e76f3cf794dbff5bb7b7d3c2d19fab7f3eece not found: ID does not exist" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.081376 4747 scope.go:117] "RemoveContainer" containerID="513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf" Dec 05 20:53:24 crc kubenswrapper[4747]: E1205 20:53:24.081613 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\": container with ID starting with 513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf not found: ID does not exist" containerID="513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.081633 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf"} err="failed to get container status \"513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\": rpc error: code = NotFound desc = could not find container \"513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf\": container with ID starting with 513a60af231ca4fe6939fec6b83b1bf96dab15e6a3ff6a838a064f58c4c27fcf not found: ID does not exist" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.081644 4747 scope.go:117] "RemoveContainer" containerID="effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0" Dec 05 20:53:24 crc kubenswrapper[4747]: E1205 20:53:24.081845 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\": container with ID starting with effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0 not found: ID does not exist" containerID="effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.081897 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0"} err="failed to get container status \"effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\": rpc error: code = NotFound desc = could not find container \"effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0\": container with ID starting with effaf3b1aa41fa0b3fdfd7b005aa9ed3c00ae283a65ed3c9ca0eef7e815859c0 not found: ID does not exist" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.081926 4747 scope.go:117] "RemoveContainer" containerID="83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb" Dec 05 20:53:24 crc kubenswrapper[4747]: E1205 20:53:24.082169 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\": container with ID starting with 83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb not found: ID does not exist" containerID="83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.082200 4747 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb"} err="failed to get container status \"83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\": rpc error: code = NotFound desc = could not find container \"83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb\": container with ID starting with 83cadaafe255ba57a88fa6f8d9c3f12155bb7f89e1bb5dec7b2cc2d99bc27bbb not found: ID does not exist" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.082219 4747 scope.go:117] "RemoveContainer" containerID="496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8" Dec 05 20:53:24 crc kubenswrapper[4747]: E1205 20:53:24.082404 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\": container with ID starting with 496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8 not found: ID does not exist" containerID="496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.082430 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8"} err="failed to get container status \"496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\": rpc error: code = NotFound desc = could not find container \"496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8\": container with ID starting with 496324112a682e061c2d197d03994e9c7e4c08ded6252419e3d5b8c3a41352f8 not found: ID does not exist" Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.881177 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" event={"ID":"ed902baa-d8a4-4748-bc31-da18a15e44f1","Type":"ContainerStarted","Data":"0b21e2e1eb8c2e193c457150b55ed41db4d1072bcd4df1b976aa2e1387dcc702"} Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.881871 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" event={"ID":"ed902baa-d8a4-4748-bc31-da18a15e44f1","Type":"ContainerStarted","Data":"fef16d27455c17d7ab395f1d551f498d7f474e2ccebfb2d924d163dda1f53650"} Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.881888 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" event={"ID":"ed902baa-d8a4-4748-bc31-da18a15e44f1","Type":"ContainerStarted","Data":"068172e6c6f11d402867800672c8224749a828f99fbfdcbb1dd99ad5a3974287"} Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.881902 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" event={"ID":"ed902baa-d8a4-4748-bc31-da18a15e44f1","Type":"ContainerStarted","Data":"db68ee88cfaaf31e646bad2bfcfea3522ff1dc57f074ad807f7a5dac009d68b7"} Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.881918 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" event={"ID":"ed902baa-d8a4-4748-bc31-da18a15e44f1","Type":"ContainerStarted","Data":"066f14ca54bc4e8bbfb2fcdc563a57c7ccb842980881ad5c04c8ed7c5006aef1"} Dec 05 20:53:24 crc kubenswrapper[4747]: I1205 20:53:24.881932 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" 
event={"ID":"ed902baa-d8a4-4748-bc31-da18a15e44f1","Type":"ContainerStarted","Data":"2428508069f9b128c1daf84b7ed51d2f5638bff943b89d3e8cab83e5a1053232"} Dec 05 20:53:25 crc kubenswrapper[4747]: I1205 20:53:25.934230 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4881e707-c00c-4e49-8e30-a17719e80915" path="/var/lib/kubelet/pods/4881e707-c00c-4e49-8e30-a17719e80915/volumes" Dec 05 20:53:26 crc kubenswrapper[4747]: I1205 20:53:26.925720 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" event={"ID":"ed902baa-d8a4-4748-bc31-da18a15e44f1","Type":"ContainerStarted","Data":"f96aa82f7dbbd2ef43fea0cc3213ac38b115be7c976af4ae5ce28d669779ded8"} Dec 05 20:53:28 crc kubenswrapper[4747]: I1205 20:53:28.748251 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-jl7s4"] Dec 05 20:53:28 crc kubenswrapper[4747]: I1205 20:53:28.749191 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:28 crc kubenswrapper[4747]: I1205 20:53:28.752204 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 05 20:53:28 crc kubenswrapper[4747]: I1205 20:53:28.753357 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 05 20:53:28 crc kubenswrapper[4747]: I1205 20:53:28.753375 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 05 20:53:28 crc kubenswrapper[4747]: I1205 20:53:28.753512 4747 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-9vgs7" Dec 05 20:53:28 crc kubenswrapper[4747]: I1205 20:53:28.840759 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgkg5\" (UniqueName: \"kubernetes.io/projected/bd295402-3065-434f-893c-cea3109f6bad-kube-api-access-vgkg5\") pod \"crc-storage-crc-jl7s4\" (UID: \"bd295402-3065-434f-893c-cea3109f6bad\") " pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:28 crc kubenswrapper[4747]: I1205 20:53:28.840955 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/bd295402-3065-434f-893c-cea3109f6bad-node-mnt\") pod \"crc-storage-crc-jl7s4\" (UID: \"bd295402-3065-434f-893c-cea3109f6bad\") " pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:28 crc kubenswrapper[4747]: I1205 20:53:28.841068 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/bd295402-3065-434f-893c-cea3109f6bad-crc-storage\") pod \"crc-storage-crc-jl7s4\" (UID: \"bd295402-3065-434f-893c-cea3109f6bad\") " pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:28 crc kubenswrapper[4747]: I1205 20:53:28.942179 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgkg5\" (UniqueName: \"kubernetes.io/projected/bd295402-3065-434f-893c-cea3109f6bad-kube-api-access-vgkg5\") pod \"crc-storage-crc-jl7s4\" (UID: \"bd295402-3065-434f-893c-cea3109f6bad\") " pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:28 crc kubenswrapper[4747]: I1205 20:53:28.942252 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/bd295402-3065-434f-893c-cea3109f6bad-node-mnt\") 
pod \"crc-storage-crc-jl7s4\" (UID: \"bd295402-3065-434f-893c-cea3109f6bad\") " pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:28 crc kubenswrapper[4747]: I1205 20:53:28.942302 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/bd295402-3065-434f-893c-cea3109f6bad-crc-storage\") pod \"crc-storage-crc-jl7s4\" (UID: \"bd295402-3065-434f-893c-cea3109f6bad\") " pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:28 crc kubenswrapper[4747]: I1205 20:53:28.942641 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/bd295402-3065-434f-893c-cea3109f6bad-node-mnt\") pod \"crc-storage-crc-jl7s4\" (UID: \"bd295402-3065-434f-893c-cea3109f6bad\") " pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:28 crc kubenswrapper[4747]: I1205 20:53:28.944507 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/bd295402-3065-434f-893c-cea3109f6bad-crc-storage\") pod \"crc-storage-crc-jl7s4\" (UID: \"bd295402-3065-434f-893c-cea3109f6bad\") " pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:28 crc kubenswrapper[4747]: I1205 20:53:28.969459 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgkg5\" (UniqueName: \"kubernetes.io/projected/bd295402-3065-434f-893c-cea3109f6bad-kube-api-access-vgkg5\") pod \"crc-storage-crc-jl7s4\" (UID: \"bd295402-3065-434f-893c-cea3109f6bad\") " pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:29 crc kubenswrapper[4747]: I1205 20:53:29.077971 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:29 crc kubenswrapper[4747]: E1205 20:53:29.102575 4747 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jl7s4_crc-storage_bd295402-3065-434f-893c-cea3109f6bad_0(a8c6b764f1c05c10ea0e73451b82c2dee57cc3429976749d1579a2aec86b8961): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 20:53:29 crc kubenswrapper[4747]: E1205 20:53:29.102700 4747 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jl7s4_crc-storage_bd295402-3065-434f-893c-cea3109f6bad_0(a8c6b764f1c05c10ea0e73451b82c2dee57cc3429976749d1579a2aec86b8961): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:29 crc kubenswrapper[4747]: E1205 20:53:29.102746 4747 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jl7s4_crc-storage_bd295402-3065-434f-893c-cea3109f6bad_0(a8c6b764f1c05c10ea0e73451b82c2dee57cc3429976749d1579a2aec86b8961): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:29 crc kubenswrapper[4747]: E1205 20:53:29.102830 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-jl7s4_crc-storage(bd295402-3065-434f-893c-cea3109f6bad)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-jl7s4_crc-storage(bd295402-3065-434f-893c-cea3109f6bad)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jl7s4_crc-storage_bd295402-3065-434f-893c-cea3109f6bad_0(a8c6b764f1c05c10ea0e73451b82c2dee57cc3429976749d1579a2aec86b8961): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-jl7s4" podUID="bd295402-3065-434f-893c-cea3109f6bad" Dec 05 20:53:29 crc kubenswrapper[4747]: I1205 20:53:29.871119 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jl7s4"] Dec 05 20:53:29 crc kubenswrapper[4747]: I1205 20:53:29.951013 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:29 crc kubenswrapper[4747]: I1205 20:53:29.951018 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" event={"ID":"ed902baa-d8a4-4748-bc31-da18a15e44f1","Type":"ContainerStarted","Data":"4e682dffa731ad1263ad48f9daeac86cd6acb7e88fe4719a954d738c2cf6b3fa"} Dec 05 20:53:29 crc kubenswrapper[4747]: I1205 20:53:29.951528 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:29 crc kubenswrapper[4747]: I1205 20:53:29.951622 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:29 crc kubenswrapper[4747]: I1205 20:53:29.951687 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:29 crc kubenswrapper[4747]: I1205 20:53:29.952353 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:29 crc kubenswrapper[4747]: E1205 20:53:29.982868 4747 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jl7s4_crc-storage_bd295402-3065-434f-893c-cea3109f6bad_0(bdf04b13c6f42392021b2b5e2d355565191d4e0dd3c54e18df943f9e1393fa85): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 20:53:29 crc kubenswrapper[4747]: E1205 20:53:29.982948 4747 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jl7s4_crc-storage_bd295402-3065-434f-893c-cea3109f6bad_0(bdf04b13c6f42392021b2b5e2d355565191d4e0dd3c54e18df943f9e1393fa85): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:29 crc kubenswrapper[4747]: E1205 20:53:29.982982 4747 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jl7s4_crc-storage_bd295402-3065-434f-893c-cea3109f6bad_0(bdf04b13c6f42392021b2b5e2d355565191d4e0dd3c54e18df943f9e1393fa85): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:29 crc kubenswrapper[4747]: E1205 20:53:29.983042 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-jl7s4_crc-storage(bd295402-3065-434f-893c-cea3109f6bad)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-jl7s4_crc-storage(bd295402-3065-434f-893c-cea3109f6bad)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jl7s4_crc-storage_bd295402-3065-434f-893c-cea3109f6bad_0(bdf04b13c6f42392021b2b5e2d355565191d4e0dd3c54e18df943f9e1393fa85): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-jl7s4" podUID="bd295402-3065-434f-893c-cea3109f6bad" Dec 05 20:53:29 crc kubenswrapper[4747]: I1205 20:53:29.987951 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:29 crc kubenswrapper[4747]: I1205 20:53:29.989459 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:30 crc kubenswrapper[4747]: I1205 20:53:30.021401 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" podStartSLOduration=8.021382841 podStartE2EDuration="8.021382841s" podCreationTimestamp="2025-12-05 20:53:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:53:29.98066779 +0000 UTC m=+680.447975328" watchObservedRunningTime="2025-12-05 20:53:30.021382841 +0000 UTC m=+680.488690319" Dec 05 20:53:36 crc kubenswrapper[4747]: I1205 20:53:36.840140 4747 scope.go:117] "RemoveContainer" containerID="68d5e1b3e9e29f79534bdc6c95a06fd1691c3e8318cb8a47d6898f2b2d4a9355" Dec 05 20:53:36 crc kubenswrapper[4747]: E1205 20:53:36.841235 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-nm4fd_openshift-multus(53f1e522-a732-4821-b7b0-6f1b6670c1d4)\"" pod="openshift-multus/multus-nm4fd" podUID="53f1e522-a732-4821-b7b0-6f1b6670c1d4" Dec 05 20:53:40 crc kubenswrapper[4747]: I1205 20:53:40.838994 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:40 crc kubenswrapper[4747]: I1205 20:53:40.840021 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:40 crc kubenswrapper[4747]: E1205 20:53:40.884349 4747 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jl7s4_crc-storage_bd295402-3065-434f-893c-cea3109f6bad_0(2e9e3914276dd82bd59a43e54ca94f8acca9591f76fca17b1d8bfe57b70adbe1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 05 20:53:40 crc kubenswrapper[4747]: E1205 20:53:40.884443 4747 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jl7s4_crc-storage_bd295402-3065-434f-893c-cea3109f6bad_0(2e9e3914276dd82bd59a43e54ca94f8acca9591f76fca17b1d8bfe57b70adbe1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:40 crc kubenswrapper[4747]: E1205 20:53:40.884482 4747 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jl7s4_crc-storage_bd295402-3065-434f-893c-cea3109f6bad_0(2e9e3914276dd82bd59a43e54ca94f8acca9591f76fca17b1d8bfe57b70adbe1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:40 crc kubenswrapper[4747]: E1205 20:53:40.884558 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-jl7s4_crc-storage(bd295402-3065-434f-893c-cea3109f6bad)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-jl7s4_crc-storage(bd295402-3065-434f-893c-cea3109f6bad)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jl7s4_crc-storage_bd295402-3065-434f-893c-cea3109f6bad_0(2e9e3914276dd82bd59a43e54ca94f8acca9591f76fca17b1d8bfe57b70adbe1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-jl7s4" podUID="bd295402-3065-434f-893c-cea3109f6bad" Dec 05 20:53:47 crc kubenswrapper[4747]: I1205 20:53:47.839944 4747 scope.go:117] "RemoveContainer" containerID="68d5e1b3e9e29f79534bdc6c95a06fd1691c3e8318cb8a47d6898f2b2d4a9355" Dec 05 20:53:48 crc kubenswrapper[4747]: I1205 20:53:48.090228 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nm4fd_53f1e522-a732-4821-b7b0-6f1b6670c1d4/kube-multus/2.log" Dec 05 20:53:48 crc kubenswrapper[4747]: I1205 20:53:48.090520 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nm4fd" event={"ID":"53f1e522-a732-4821-b7b0-6f1b6670c1d4","Type":"ContainerStarted","Data":"85008d0e63cf35213b8c218be9124622840f4c9e15e2b62e400fc58ce6f9d235"} Dec 05 20:53:53 crc kubenswrapper[4747]: I1205 20:53:53.347992 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wjvxp" Dec 05 20:53:55 crc kubenswrapper[4747]: I1205 20:53:55.838887 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:55 crc kubenswrapper[4747]: I1205 20:53:55.839652 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:56 crc kubenswrapper[4747]: I1205 20:53:56.104840 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jl7s4"] Dec 05 20:53:56 crc kubenswrapper[4747]: I1205 20:53:56.114034 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:53:56 crc kubenswrapper[4747]: I1205 20:53:56.140987 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jl7s4" event={"ID":"bd295402-3065-434f-893c-cea3109f6bad","Type":"ContainerStarted","Data":"c376d77f5eabe880f1d2dbf5d791a49995543a98556ec8933024d7de30e45dc1"} Dec 05 20:53:58 crc kubenswrapper[4747]: I1205 20:53:58.154880 4747 generic.go:334] "Generic (PLEG): container finished" podID="bd295402-3065-434f-893c-cea3109f6bad" containerID="74c4da903deb0ea2982269fd4168103a3fe73acf94a8c74cffffde739a964b9b" exitCode=0 Dec 05 20:53:58 crc kubenswrapper[4747]: I1205 20:53:58.155232 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jl7s4" event={"ID":"bd295402-3065-434f-893c-cea3109f6bad","Type":"ContainerDied","Data":"74c4da903deb0ea2982269fd4168103a3fe73acf94a8c74cffffde739a964b9b"} Dec 05 20:53:59 crc kubenswrapper[4747]: I1205 20:53:59.398544 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:53:59 crc kubenswrapper[4747]: I1205 20:53:59.426550 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/bd295402-3065-434f-893c-cea3109f6bad-crc-storage\") pod \"bd295402-3065-434f-893c-cea3109f6bad\" (UID: \"bd295402-3065-434f-893c-cea3109f6bad\") " Dec 05 20:53:59 crc kubenswrapper[4747]: I1205 20:53:59.426637 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/bd295402-3065-434f-893c-cea3109f6bad-node-mnt\") pod \"bd295402-3065-434f-893c-cea3109f6bad\" (UID: \"bd295402-3065-434f-893c-cea3109f6bad\") " Dec 05 20:53:59 crc kubenswrapper[4747]: I1205 20:53:59.426672 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgkg5\" (UniqueName: \"kubernetes.io/projected/bd295402-3065-434f-893c-cea3109f6bad-kube-api-access-vgkg5\") pod \"bd295402-3065-434f-893c-cea3109f6bad\" (UID: \"bd295402-3065-434f-893c-cea3109f6bad\") " Dec 05 20:53:59 crc kubenswrapper[4747]: I1205 20:53:59.426837 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd295402-3065-434f-893c-cea3109f6bad-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "bd295402-3065-434f-893c-cea3109f6bad" (UID: "bd295402-3065-434f-893c-cea3109f6bad"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:53:59 crc kubenswrapper[4747]: I1205 20:53:59.428027 4747 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/bd295402-3065-434f-893c-cea3109f6bad-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:59 crc kubenswrapper[4747]: I1205 20:53:59.431454 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd295402-3065-434f-893c-cea3109f6bad-kube-api-access-vgkg5" (OuterVolumeSpecName: "kube-api-access-vgkg5") pod "bd295402-3065-434f-893c-cea3109f6bad" (UID: "bd295402-3065-434f-893c-cea3109f6bad"). 
InnerVolumeSpecName "kube-api-access-vgkg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:53:59 crc kubenswrapper[4747]: I1205 20:53:59.440131 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd295402-3065-434f-893c-cea3109f6bad-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "bd295402-3065-434f-893c-cea3109f6bad" (UID: "bd295402-3065-434f-893c-cea3109f6bad"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:53:59 crc kubenswrapper[4747]: I1205 20:53:59.530684 4747 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/bd295402-3065-434f-893c-cea3109f6bad-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 05 20:53:59 crc kubenswrapper[4747]: I1205 20:53:59.530734 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgkg5\" (UniqueName: \"kubernetes.io/projected/bd295402-3065-434f-893c-cea3109f6bad-kube-api-access-vgkg5\") on node \"crc\" DevicePath \"\"" Dec 05 20:54:00 crc kubenswrapper[4747]: I1205 20:54:00.168749 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jl7s4" event={"ID":"bd295402-3065-434f-893c-cea3109f6bad","Type":"ContainerDied","Data":"c376d77f5eabe880f1d2dbf5d791a49995543a98556ec8933024d7de30e45dc1"} Dec 05 20:54:00 crc kubenswrapper[4747]: I1205 20:54:00.168810 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c376d77f5eabe880f1d2dbf5d791a49995543a98556ec8933024d7de30e45dc1" Dec 05 20:54:00 crc kubenswrapper[4747]: I1205 20:54:00.168834 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jl7s4" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.221780 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.222369 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.244939 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6"] Dec 05 20:54:06 crc kubenswrapper[4747]: E1205 20:54:06.245290 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd295402-3065-434f-893c-cea3109f6bad" containerName="storage" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.245318 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd295402-3065-434f-893c-cea3109f6bad" containerName="storage" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.245461 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd295402-3065-434f-893c-cea3109f6bad" containerName="storage" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.246509 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.248638 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.258071 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6"] Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.328620 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjl4t\" (UniqueName: \"kubernetes.io/projected/4d8159d7-7e1b-4c53-bd04-d806c4165588-kube-api-access-xjl4t\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6\" (UID: \"4d8159d7-7e1b-4c53-bd04-d806c4165588\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.328684 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d8159d7-7e1b-4c53-bd04-d806c4165588-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6\" (UID: \"4d8159d7-7e1b-4c53-bd04-d806c4165588\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.328785 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d8159d7-7e1b-4c53-bd04-d806c4165588-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6\" (UID: \"4d8159d7-7e1b-4c53-bd04-d806c4165588\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.430110 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d8159d7-7e1b-4c53-bd04-d806c4165588-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6\" (UID: \"4d8159d7-7e1b-4c53-bd04-d806c4165588\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.430182 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjl4t\" (UniqueName: \"kubernetes.io/projected/4d8159d7-7e1b-4c53-bd04-d806c4165588-kube-api-access-xjl4t\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6\" (UID: \"4d8159d7-7e1b-4c53-bd04-d806c4165588\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.430205 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d8159d7-7e1b-4c53-bd04-d806c4165588-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6\" (UID: \"4d8159d7-7e1b-4c53-bd04-d806c4165588\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.431078 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4d8159d7-7e1b-4c53-bd04-d806c4165588-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6\" (UID: \"4d8159d7-7e1b-4c53-bd04-d806c4165588\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.431671 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d8159d7-7e1b-4c53-bd04-d806c4165588-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6\" (UID: \"4d8159d7-7e1b-4c53-bd04-d806c4165588\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.452975 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjl4t\" (UniqueName: \"kubernetes.io/projected/4d8159d7-7e1b-4c53-bd04-d806c4165588-kube-api-access-xjl4t\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6\" (UID: \"4d8159d7-7e1b-4c53-bd04-d806c4165588\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.621007 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" Dec 05 20:54:06 crc kubenswrapper[4747]: I1205 20:54:06.819112 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6"] Dec 05 20:54:07 crc kubenswrapper[4747]: I1205 20:54:07.214466 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" event={"ID":"4d8159d7-7e1b-4c53-bd04-d806c4165588","Type":"ContainerStarted","Data":"35c7f232ff6caf7e674b8cfa002eb86025daf06ed8665df1d1570b1b84e61dbf"} Dec 05 20:54:07 crc kubenswrapper[4747]: I1205 20:54:07.214519 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" event={"ID":"4d8159d7-7e1b-4c53-bd04-d806c4165588","Type":"ContainerStarted","Data":"f5699586034719351a927a1c10bd22df28b40e39ed3915fa0c850cc92c2b796c"} Dec 05 20:54:08 crc kubenswrapper[4747]: I1205 20:54:08.221937 4747 generic.go:334] "Generic (PLEG): container finished" podID="4d8159d7-7e1b-4c53-bd04-d806c4165588" containerID="35c7f232ff6caf7e674b8cfa002eb86025daf06ed8665df1d1570b1b84e61dbf" exitCode=0 Dec 05 20:54:08 crc kubenswrapper[4747]: I1205 20:54:08.221998 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" event={"ID":"4d8159d7-7e1b-4c53-bd04-d806c4165588","Type":"ContainerDied","Data":"35c7f232ff6caf7e674b8cfa002eb86025daf06ed8665df1d1570b1b84e61dbf"} Dec 05 20:54:10 crc kubenswrapper[4747]: I1205 20:54:10.234697 4747 generic.go:334] "Generic (PLEG): container finished" podID="4d8159d7-7e1b-4c53-bd04-d806c4165588" containerID="bf37b2ebe058efa5d216e3f22af23ac78988b6f2dcedbe94cfa87246ab289eea" exitCode=0 Dec 05 20:54:10 crc kubenswrapper[4747]: I1205 20:54:10.234739 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" 
event={"ID":"4d8159d7-7e1b-4c53-bd04-d806c4165588","Type":"ContainerDied","Data":"bf37b2ebe058efa5d216e3f22af23ac78988b6f2dcedbe94cfa87246ab289eea"} Dec 05 20:54:11 crc kubenswrapper[4747]: I1205 20:54:11.243544 4747 generic.go:334] "Generic (PLEG): container finished" podID="4d8159d7-7e1b-4c53-bd04-d806c4165588" containerID="f3dcf3bea3083a2198efe04784b3c8cc32dbdb328a48685574e6cbec21f60c25" exitCode=0 Dec 05 20:54:11 crc kubenswrapper[4747]: I1205 20:54:11.243613 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" event={"ID":"4d8159d7-7e1b-4c53-bd04-d806c4165588","Type":"ContainerDied","Data":"f3dcf3bea3083a2198efe04784b3c8cc32dbdb328a48685574e6cbec21f60c25"} Dec 05 20:54:12 crc kubenswrapper[4747]: I1205 20:54:12.471222 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" Dec 05 20:54:12 crc kubenswrapper[4747]: I1205 20:54:12.606689 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d8159d7-7e1b-4c53-bd04-d806c4165588-util\") pod \"4d8159d7-7e1b-4c53-bd04-d806c4165588\" (UID: \"4d8159d7-7e1b-4c53-bd04-d806c4165588\") " Dec 05 20:54:12 crc kubenswrapper[4747]: I1205 20:54:12.607237 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjl4t\" (UniqueName: \"kubernetes.io/projected/4d8159d7-7e1b-4c53-bd04-d806c4165588-kube-api-access-xjl4t\") pod \"4d8159d7-7e1b-4c53-bd04-d806c4165588\" (UID: \"4d8159d7-7e1b-4c53-bd04-d806c4165588\") " Dec 05 20:54:12 crc kubenswrapper[4747]: I1205 20:54:12.607516 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d8159d7-7e1b-4c53-bd04-d806c4165588-bundle\") pod \"4d8159d7-7e1b-4c53-bd04-d806c4165588\" (UID: \"4d8159d7-7e1b-4c53-bd04-d806c4165588\") " Dec 05 20:54:12 crc kubenswrapper[4747]: I1205 20:54:12.608337 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d8159d7-7e1b-4c53-bd04-d806c4165588-bundle" (OuterVolumeSpecName: "bundle") pod "4d8159d7-7e1b-4c53-bd04-d806c4165588" (UID: "4d8159d7-7e1b-4c53-bd04-d806c4165588"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:54:12 crc kubenswrapper[4747]: I1205 20:54:12.618394 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d8159d7-7e1b-4c53-bd04-d806c4165588-util" (OuterVolumeSpecName: "util") pod "4d8159d7-7e1b-4c53-bd04-d806c4165588" (UID: "4d8159d7-7e1b-4c53-bd04-d806c4165588"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:54:12 crc kubenswrapper[4747]: I1205 20:54:12.620782 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d8159d7-7e1b-4c53-bd04-d806c4165588-kube-api-access-xjl4t" (OuterVolumeSpecName: "kube-api-access-xjl4t") pod "4d8159d7-7e1b-4c53-bd04-d806c4165588" (UID: "4d8159d7-7e1b-4c53-bd04-d806c4165588"). InnerVolumeSpecName "kube-api-access-xjl4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:54:12 crc kubenswrapper[4747]: I1205 20:54:12.709198 4747 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d8159d7-7e1b-4c53-bd04-d806c4165588-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:54:12 crc kubenswrapper[4747]: I1205 20:54:12.709237 4747 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d8159d7-7e1b-4c53-bd04-d806c4165588-util\") on node \"crc\" DevicePath \"\"" Dec 05 20:54:12 crc kubenswrapper[4747]: I1205 20:54:12.709248 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjl4t\" (UniqueName: \"kubernetes.io/projected/4d8159d7-7e1b-4c53-bd04-d806c4165588-kube-api-access-xjl4t\") on node \"crc\" DevicePath \"\"" Dec 05 20:54:13 crc kubenswrapper[4747]: I1205 20:54:13.261170 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" event={"ID":"4d8159d7-7e1b-4c53-bd04-d806c4165588","Type":"ContainerDied","Data":"f5699586034719351a927a1c10bd22df28b40e39ed3915fa0c850cc92c2b796c"} Dec 05 20:54:13 crc kubenswrapper[4747]: I1205 20:54:13.262251 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5699586034719351a927a1c10bd22df28b40e39ed3915fa0c850cc92c2b796c" Dec 05 20:54:13 crc kubenswrapper[4747]: I1205 20:54:13.261323 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6" Dec 05 20:54:17 crc kubenswrapper[4747]: I1205 20:54:17.820420 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-nt4x7"] Dec 05 20:54:17 crc kubenswrapper[4747]: E1205 20:54:17.821487 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8159d7-7e1b-4c53-bd04-d806c4165588" containerName="util" Dec 05 20:54:17 crc kubenswrapper[4747]: I1205 20:54:17.821507 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8159d7-7e1b-4c53-bd04-d806c4165588" containerName="util" Dec 05 20:54:17 crc kubenswrapper[4747]: E1205 20:54:17.821523 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8159d7-7e1b-4c53-bd04-d806c4165588" containerName="extract" Dec 05 20:54:17 crc kubenswrapper[4747]: I1205 20:54:17.821531 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8159d7-7e1b-4c53-bd04-d806c4165588" containerName="extract" Dec 05 20:54:17 crc kubenswrapper[4747]: E1205 20:54:17.821550 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8159d7-7e1b-4c53-bd04-d806c4165588" containerName="pull" Dec 05 20:54:17 crc kubenswrapper[4747]: I1205 20:54:17.821558 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8159d7-7e1b-4c53-bd04-d806c4165588" containerName="pull" Dec 05 20:54:17 crc kubenswrapper[4747]: I1205 20:54:17.821705 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d8159d7-7e1b-4c53-bd04-d806c4165588" containerName="extract" Dec 05 20:54:17 crc kubenswrapper[4747]: I1205 20:54:17.822331 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nt4x7" Dec 05 20:54:17 crc kubenswrapper[4747]: I1205 20:54:17.835838 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 05 20:54:17 crc kubenswrapper[4747]: I1205 20:54:17.836059 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 05 20:54:17 crc kubenswrapper[4747]: I1205 20:54:17.847044 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-9wtrr" Dec 05 20:54:17 crc kubenswrapper[4747]: I1205 20:54:17.855741 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-nt4x7"] Dec 05 20:54:17 crc kubenswrapper[4747]: I1205 20:54:17.879213 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjkx2\" (UniqueName: \"kubernetes.io/projected/831590ed-85ad-424e-a4b7-83d97c56265c-kube-api-access-bjkx2\") pod \"nmstate-operator-5b5b58f5c8-nt4x7\" (UID: \"831590ed-85ad-424e-a4b7-83d97c56265c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nt4x7" Dec 05 20:54:17 crc kubenswrapper[4747]: I1205 20:54:17.981604 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjkx2\" (UniqueName: \"kubernetes.io/projected/831590ed-85ad-424e-a4b7-83d97c56265c-kube-api-access-bjkx2\") pod \"nmstate-operator-5b5b58f5c8-nt4x7\" (UID: \"831590ed-85ad-424e-a4b7-83d97c56265c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nt4x7" Dec 05 20:54:18 crc kubenswrapper[4747]: I1205 20:54:18.004366 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjkx2\" (UniqueName: \"kubernetes.io/projected/831590ed-85ad-424e-a4b7-83d97c56265c-kube-api-access-bjkx2\") pod \"nmstate-operator-5b5b58f5c8-nt4x7\" (UID: \"831590ed-85ad-424e-a4b7-83d97c56265c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nt4x7" Dec 05 20:54:18 crc kubenswrapper[4747]: I1205 20:54:18.155735 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nt4x7" Dec 05 20:54:18 crc kubenswrapper[4747]: I1205 20:54:18.591541 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-nt4x7"] Dec 05 20:54:19 crc kubenswrapper[4747]: I1205 20:54:19.298179 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nt4x7" event={"ID":"831590ed-85ad-424e-a4b7-83d97c56265c","Type":"ContainerStarted","Data":"dd27d894ecbbc7ebe0e6ca681ae62120e524bdbe2bd352a84cbdda6ce52bb9e2"} Dec 05 20:54:21 crc kubenswrapper[4747]: I1205 20:54:21.312146 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nt4x7" event={"ID":"831590ed-85ad-424e-a4b7-83d97c56265c","Type":"ContainerStarted","Data":"0a772a927b892bc7df572652a6308f0222e0c097adf0a147b9d6e8b3c9c2a4d4"} Dec 05 20:54:21 crc kubenswrapper[4747]: I1205 20:54:21.331735 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nt4x7" podStartSLOduration=2.253714343 podStartE2EDuration="4.331711349s" podCreationTimestamp="2025-12-05 20:54:17 +0000 UTC" firstStartedPulling="2025-12-05 20:54:18.603415122 +0000 UTC m=+729.070722610" lastFinishedPulling="2025-12-05 20:54:20.681412118 +0000 UTC m=+731.148719616" observedRunningTime="2025-12-05 20:54:21.326537883 +0000 UTC m=+731.793845421" watchObservedRunningTime="2025-12-05 20:54:21.331711349 +0000 UTC m=+731.799018847" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.614417 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-8n9rp"] Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.618363 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8n9rp" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.628562 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-kff77" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.639002 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw"] Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.639808 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.641236 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.644657 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-8n9rp"] Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.651035 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw"] Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.655615 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-7tm7k"] Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.656446 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.745323 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbk7q\" (UniqueName: \"kubernetes.io/projected/bfa081f3-bb53-463e-8dc4-af51189a16c8-kube-api-access-fbk7q\") pod \"nmstate-webhook-5f6d4c5ccb-mvgbw\" (UID: \"bfa081f3-bb53-463e-8dc4-af51189a16c8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.745383 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fed6a17f-26e6-4ad2-aa77-63e697e05f1f-nmstate-lock\") pod \"nmstate-handler-7tm7k\" (UID: \"fed6a17f-26e6-4ad2-aa77-63e697e05f1f\") " pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.745473 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxmjj\" (UniqueName: \"kubernetes.io/projected/fed6a17f-26e6-4ad2-aa77-63e697e05f1f-kube-api-access-vxmjj\") pod \"nmstate-handler-7tm7k\" (UID: \"fed6a17f-26e6-4ad2-aa77-63e697e05f1f\") " pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.745513 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bfa081f3-bb53-463e-8dc4-af51189a16c8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-mvgbw\" (UID: \"bfa081f3-bb53-463e-8dc4-af51189a16c8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.745533 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fed6a17f-26e6-4ad2-aa77-63e697e05f1f-ovs-socket\") pod \"nmstate-handler-7tm7k\" (UID: \"fed6a17f-26e6-4ad2-aa77-63e697e05f1f\") " pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.745553 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lcvz\" (UniqueName: \"kubernetes.io/projected/a6577b22-4d4b-49a1-9e24-ff749998eee5-kube-api-access-8lcvz\") pod \"nmstate-metrics-7f946cbc9-8n9rp\" (UID: \"a6577b22-4d4b-49a1-9e24-ff749998eee5\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8n9rp" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.745738 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fed6a17f-26e6-4ad2-aa77-63e697e05f1f-dbus-socket\") pod \"nmstate-handler-7tm7k\" (UID: \"fed6a17f-26e6-4ad2-aa77-63e697e05f1f\") " pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.780868 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh"] Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.781476 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.782877 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-cl9rg" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.783079 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.783248 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.829468 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh"] Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.847206 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbk7q\" (UniqueName: \"kubernetes.io/projected/bfa081f3-bb53-463e-8dc4-af51189a16c8-kube-api-access-fbk7q\") pod \"nmstate-webhook-5f6d4c5ccb-mvgbw\" (UID: \"bfa081f3-bb53-463e-8dc4-af51189a16c8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.847244 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fed6a17f-26e6-4ad2-aa77-63e697e05f1f-nmstate-lock\") pod \"nmstate-handler-7tm7k\" (UID: \"fed6a17f-26e6-4ad2-aa77-63e697e05f1f\") " pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.847279 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl87c\" (UniqueName: \"kubernetes.io/projected/4548c537-2784-461e-b042-2a7efed9ae3a-kube-api-access-jl87c\") pod \"nmstate-console-plugin-7fbb5f6569-hn2jh\" (UID: \"4548c537-2784-461e-b042-2a7efed9ae3a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.847303 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4548c537-2784-461e-b042-2a7efed9ae3a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-hn2jh\" (UID: \"4548c537-2784-461e-b042-2a7efed9ae3a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.847324 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4548c537-2784-461e-b042-2a7efed9ae3a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-hn2jh\" (UID: \"4548c537-2784-461e-b042-2a7efed9ae3a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.847341 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxmjj\" (UniqueName: \"kubernetes.io/projected/fed6a17f-26e6-4ad2-aa77-63e697e05f1f-kube-api-access-vxmjj\") pod \"nmstate-handler-7tm7k\" (UID: \"fed6a17f-26e6-4ad2-aa77-63e697e05f1f\") " pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.847572 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/bfa081f3-bb53-463e-8dc4-af51189a16c8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-mvgbw\" (UID: \"bfa081f3-bb53-463e-8dc4-af51189a16c8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.847640 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fed6a17f-26e6-4ad2-aa77-63e697e05f1f-ovs-socket\") pod \"nmstate-handler-7tm7k\" (UID: \"fed6a17f-26e6-4ad2-aa77-63e697e05f1f\") " pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.847667 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lcvz\" (UniqueName: \"kubernetes.io/projected/a6577b22-4d4b-49a1-9e24-ff749998eee5-kube-api-access-8lcvz\") pod \"nmstate-metrics-7f946cbc9-8n9rp\" (UID: \"a6577b22-4d4b-49a1-9e24-ff749998eee5\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8n9rp" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.847730 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fed6a17f-26e6-4ad2-aa77-63e697e05f1f-dbus-socket\") pod \"nmstate-handler-7tm7k\" (UID: \"fed6a17f-26e6-4ad2-aa77-63e697e05f1f\") " pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.848093 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fed6a17f-26e6-4ad2-aa77-63e697e05f1f-dbus-socket\") pod \"nmstate-handler-7tm7k\" (UID: \"fed6a17f-26e6-4ad2-aa77-63e697e05f1f\") " pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.848142 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fed6a17f-26e6-4ad2-aa77-63e697e05f1f-ovs-socket\") pod \"nmstate-handler-7tm7k\" (UID: \"fed6a17f-26e6-4ad2-aa77-63e697e05f1f\") " pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.848897 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fed6a17f-26e6-4ad2-aa77-63e697e05f1f-nmstate-lock\") pod \"nmstate-handler-7tm7k\" (UID: \"fed6a17f-26e6-4ad2-aa77-63e697e05f1f\") " pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.859751 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bfa081f3-bb53-463e-8dc4-af51189a16c8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-mvgbw\" (UID: \"bfa081f3-bb53-463e-8dc4-af51189a16c8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.862418 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxmjj\" (UniqueName: \"kubernetes.io/projected/fed6a17f-26e6-4ad2-aa77-63e697e05f1f-kube-api-access-vxmjj\") pod \"nmstate-handler-7tm7k\" (UID: \"fed6a17f-26e6-4ad2-aa77-63e697e05f1f\") " pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.865009 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbk7q\" (UniqueName: \"kubernetes.io/projected/bfa081f3-bb53-463e-8dc4-af51189a16c8-kube-api-access-fbk7q\") pod 
\"nmstate-webhook-5f6d4c5ccb-mvgbw\" (UID: \"bfa081f3-bb53-463e-8dc4-af51189a16c8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.868225 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lcvz\" (UniqueName: \"kubernetes.io/projected/a6577b22-4d4b-49a1-9e24-ff749998eee5-kube-api-access-8lcvz\") pod \"nmstate-metrics-7f946cbc9-8n9rp\" (UID: \"a6577b22-4d4b-49a1-9e24-ff749998eee5\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8n9rp" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.949020 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl87c\" (UniqueName: \"kubernetes.io/projected/4548c537-2784-461e-b042-2a7efed9ae3a-kube-api-access-jl87c\") pod \"nmstate-console-plugin-7fbb5f6569-hn2jh\" (UID: \"4548c537-2784-461e-b042-2a7efed9ae3a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.949077 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4548c537-2784-461e-b042-2a7efed9ae3a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-hn2jh\" (UID: \"4548c537-2784-461e-b042-2a7efed9ae3a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.949102 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4548c537-2784-461e-b042-2a7efed9ae3a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-hn2jh\" (UID: \"4548c537-2784-461e-b042-2a7efed9ae3a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.950499 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4548c537-2784-461e-b042-2a7efed9ae3a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-hn2jh\" (UID: \"4548c537-2784-461e-b042-2a7efed9ae3a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.953275 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4548c537-2784-461e-b042-2a7efed9ae3a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-hn2jh\" (UID: \"4548c537-2784-461e-b042-2a7efed9ae3a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.954971 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8n9rp" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.962361 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.964573 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-cd5957c45-vwz54"] Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.965391 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.973105 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cd5957c45-vwz54"] Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.973241 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:26 crc kubenswrapper[4747]: I1205 20:54:26.981008 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl87c\" (UniqueName: \"kubernetes.io/projected/4548c537-2784-461e-b042-2a7efed9ae3a-kube-api-access-jl87c\") pod \"nmstate-console-plugin-7fbb5f6569-hn2jh\" (UID: \"4548c537-2784-461e-b042-2a7efed9ae3a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.051171 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c1480c2-769c-4a8b-8ab0-45c47be4732c-console-oauth-config\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.051491 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c1480c2-769c-4a8b-8ab0-45c47be4732c-oauth-serving-cert\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.051535 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c1480c2-769c-4a8b-8ab0-45c47be4732c-console-config\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.051559 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c1480c2-769c-4a8b-8ab0-45c47be4732c-console-serving-cert\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.051618 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c1480c2-769c-4a8b-8ab0-45c47be4732c-service-ca\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.051643 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhr9n\" (UniqueName: \"kubernetes.io/projected/9c1480c2-769c-4a8b-8ab0-45c47be4732c-kube-api-access-lhr9n\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.051665 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9c1480c2-769c-4a8b-8ab0-45c47be4732c-trusted-ca-bundle\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.096596 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.153632 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c1480c2-769c-4a8b-8ab0-45c47be4732c-oauth-serving-cert\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.153733 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c1480c2-769c-4a8b-8ab0-45c47be4732c-console-config\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.153757 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c1480c2-769c-4a8b-8ab0-45c47be4732c-console-serving-cert\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.153791 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c1480c2-769c-4a8b-8ab0-45c47be4732c-service-ca\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.153832 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhr9n\" (UniqueName: \"kubernetes.io/projected/9c1480c2-769c-4a8b-8ab0-45c47be4732c-kube-api-access-lhr9n\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.153851 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c1480c2-769c-4a8b-8ab0-45c47be4732c-trusted-ca-bundle\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.153899 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c1480c2-769c-4a8b-8ab0-45c47be4732c-console-oauth-config\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.155760 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c1480c2-769c-4a8b-8ab0-45c47be4732c-oauth-serving-cert\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 
05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.156181 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c1480c2-769c-4a8b-8ab0-45c47be4732c-console-config\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.156557 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c1480c2-769c-4a8b-8ab0-45c47be4732c-service-ca\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.158048 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c1480c2-769c-4a8b-8ab0-45c47be4732c-trusted-ca-bundle\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.161217 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c1480c2-769c-4a8b-8ab0-45c47be4732c-console-serving-cert\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.161523 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c1480c2-769c-4a8b-8ab0-45c47be4732c-console-oauth-config\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.174486 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhr9n\" (UniqueName: \"kubernetes.io/projected/9c1480c2-769c-4a8b-8ab0-45c47be4732c-kube-api-access-lhr9n\") pod \"console-cd5957c45-vwz54\" (UID: \"9c1480c2-769c-4a8b-8ab0-45c47be4732c\") " pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.286371 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh"] Dec 05 20:54:27 crc kubenswrapper[4747]: W1205 20:54:27.292672 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4548c537_2784_461e_b042_2a7efed9ae3a.slice/crio-88110a1c5a3c9d8ebe8890f782dd10caf9e72ae6b726886df5a2fdc3b543f0e8 WatchSource:0}: Error finding container 88110a1c5a3c9d8ebe8890f782dd10caf9e72ae6b726886df5a2fdc3b543f0e8: Status 404 returned error can't find the container with id 88110a1c5a3c9d8ebe8890f782dd10caf9e72ae6b726886df5a2fdc3b543f0e8 Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.331468 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.349633 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7tm7k" event={"ID":"fed6a17f-26e6-4ad2-aa77-63e697e05f1f","Type":"ContainerStarted","Data":"43094397468c1ba6d47b1b29a3cf5b0835f319dbda7b6a38a7f793d722a93949"} Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.351267 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh" event={"ID":"4548c537-2784-461e-b042-2a7efed9ae3a","Type":"ContainerStarted","Data":"88110a1c5a3c9d8ebe8890f782dd10caf9e72ae6b726886df5a2fdc3b543f0e8"} Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.382756 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-8n9rp"] Dec 05 20:54:27 crc kubenswrapper[4747]: W1205 20:54:27.387302 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6577b22_4d4b_49a1_9e24_ff749998eee5.slice/crio-02d125a784824816e683b775bf189201bba3011e615d91b8c8b4935b4c2cff28 WatchSource:0}: Error finding container 02d125a784824816e683b775bf189201bba3011e615d91b8c8b4935b4c2cff28: Status 404 returned error can't find the container with id 02d125a784824816e683b775bf189201bba3011e615d91b8c8b4935b4c2cff28 Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.436061 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw"] Dec 05 20:54:27 crc kubenswrapper[4747]: W1205 20:54:27.444999 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfa081f3_bb53_463e_8dc4_af51189a16c8.slice/crio-2610307a8c1946a6fe1ee0ed16832c5f343c749ebda543e20e1dbf15ad63f00c WatchSource:0}: Error finding container 2610307a8c1946a6fe1ee0ed16832c5f343c749ebda543e20e1dbf15ad63f00c: Status 404 returned error can't find the container with id 2610307a8c1946a6fe1ee0ed16832c5f343c749ebda543e20e1dbf15ad63f00c Dec 05 20:54:27 crc kubenswrapper[4747]: I1205 20:54:27.767147 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cd5957c45-vwz54"] Dec 05 20:54:27 crc kubenswrapper[4747]: W1205 20:54:27.774252 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c1480c2_769c_4a8b_8ab0_45c47be4732c.slice/crio-dc126a68a5442b845f30ce1d5ac2fbb8f00a95c1357cf52de2b72dcbbc61746e WatchSource:0}: Error finding container dc126a68a5442b845f30ce1d5ac2fbb8f00a95c1357cf52de2b72dcbbc61746e: Status 404 returned error can't find the container with id dc126a68a5442b845f30ce1d5ac2fbb8f00a95c1357cf52de2b72dcbbc61746e Dec 05 20:54:28 crc kubenswrapper[4747]: I1205 20:54:28.357908 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8n9rp" event={"ID":"a6577b22-4d4b-49a1-9e24-ff749998eee5","Type":"ContainerStarted","Data":"02d125a784824816e683b775bf189201bba3011e615d91b8c8b4935b4c2cff28"} Dec 05 20:54:28 crc kubenswrapper[4747]: I1205 20:54:28.358928 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw" event={"ID":"bfa081f3-bb53-463e-8dc4-af51189a16c8","Type":"ContainerStarted","Data":"2610307a8c1946a6fe1ee0ed16832c5f343c749ebda543e20e1dbf15ad63f00c"} Dec 05 20:54:28 crc kubenswrapper[4747]: 
I1205 20:54:28.360994 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cd5957c45-vwz54" event={"ID":"9c1480c2-769c-4a8b-8ab0-45c47be4732c","Type":"ContainerStarted","Data":"619848a60acf01922610bb2df546bb0ba992ed1b6dc794a1f58a604bd1133db5"} Dec 05 20:54:28 crc kubenswrapper[4747]: I1205 20:54:28.361020 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cd5957c45-vwz54" event={"ID":"9c1480c2-769c-4a8b-8ab0-45c47be4732c","Type":"ContainerStarted","Data":"dc126a68a5442b845f30ce1d5ac2fbb8f00a95c1357cf52de2b72dcbbc61746e"} Dec 05 20:54:28 crc kubenswrapper[4747]: I1205 20:54:28.382383 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cd5957c45-vwz54" podStartSLOduration=2.382360917 podStartE2EDuration="2.382360917s" podCreationTimestamp="2025-12-05 20:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:54:28.376795672 +0000 UTC m=+738.844103160" watchObservedRunningTime="2025-12-05 20:54:28.382360917 +0000 UTC m=+738.849668405" Dec 05 20:54:30 crc kubenswrapper[4747]: I1205 20:54:30.373573 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8n9rp" event={"ID":"a6577b22-4d4b-49a1-9e24-ff749998eee5","Type":"ContainerStarted","Data":"2151f6217fc5ef5caa7045c77ec8fc06bcac72dbf8daf7625e6cd535cbb44640"} Dec 05 20:54:30 crc kubenswrapper[4747]: I1205 20:54:30.375810 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw" event={"ID":"bfa081f3-bb53-463e-8dc4-af51189a16c8","Type":"ContainerStarted","Data":"deed42f8ac4bd3042f3a2b4eba7eca706a4067a48ecd89ea51a9a1c9105a7e28"} Dec 05 20:54:30 crc kubenswrapper[4747]: I1205 20:54:30.375985 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw" Dec 05 20:54:30 crc kubenswrapper[4747]: I1205 20:54:30.377755 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7tm7k" event={"ID":"fed6a17f-26e6-4ad2-aa77-63e697e05f1f","Type":"ContainerStarted","Data":"99784013fee6d71de876293be9746d0fd6cc2c89774483282efae1e02b42cef3"} Dec 05 20:54:30 crc kubenswrapper[4747]: I1205 20:54:30.378153 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:30 crc kubenswrapper[4747]: I1205 20:54:30.381760 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh" event={"ID":"4548c537-2784-461e-b042-2a7efed9ae3a","Type":"ContainerStarted","Data":"fa78f6d09321365b0fa024f17be8240f9225d46f27fcb4485426651fc41be45b"} Dec 05 20:54:30 crc kubenswrapper[4747]: I1205 20:54:30.406757 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw" podStartSLOduration=2.078539822 podStartE2EDuration="4.406730409s" podCreationTimestamp="2025-12-05 20:54:26 +0000 UTC" firstStartedPulling="2025-12-05 20:54:27.447269014 +0000 UTC m=+737.914576502" lastFinishedPulling="2025-12-05 20:54:29.775459601 +0000 UTC m=+740.242767089" observedRunningTime="2025-12-05 20:54:30.399299138 +0000 UTC m=+740.866606626" watchObservedRunningTime="2025-12-05 20:54:30.406730409 +0000 UTC m=+740.874037917" Dec 05 20:54:30 crc kubenswrapper[4747]: I1205 20:54:30.428798 4747 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-hn2jh" podStartSLOduration=1.9650440489999998 podStartE2EDuration="4.428774705s" podCreationTimestamp="2025-12-05 20:54:26 +0000 UTC" firstStartedPulling="2025-12-05 20:54:27.294572767 +0000 UTC m=+737.761880255" lastFinishedPulling="2025-12-05 20:54:29.758303423 +0000 UTC m=+740.225610911" observedRunningTime="2025-12-05 20:54:30.428771795 +0000 UTC m=+740.896079293" watchObservedRunningTime="2025-12-05 20:54:30.428774705 +0000 UTC m=+740.896082203" Dec 05 20:54:30 crc kubenswrapper[4747]: I1205 20:54:30.443388 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-7tm7k" podStartSLOduration=1.690132027 podStartE2EDuration="4.443373391s" podCreationTimestamp="2025-12-05 20:54:26 +0000 UTC" firstStartedPulling="2025-12-05 20:54:27.022264008 +0000 UTC m=+737.489571496" lastFinishedPulling="2025-12-05 20:54:29.775505372 +0000 UTC m=+740.242812860" observedRunningTime="2025-12-05 20:54:30.443313949 +0000 UTC m=+740.910621447" watchObservedRunningTime="2025-12-05 20:54:30.443373391 +0000 UTC m=+740.910680869" Dec 05 20:54:33 crc kubenswrapper[4747]: I1205 20:54:33.406774 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8n9rp" event={"ID":"a6577b22-4d4b-49a1-9e24-ff749998eee5","Type":"ContainerStarted","Data":"1cf9d7f97cb6bfef5a419c0adcd74b1d25f1a8445cac25fb927ae3cdaf24bea5"} Dec 05 20:54:33 crc kubenswrapper[4747]: I1205 20:54:33.425413 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8n9rp" podStartSLOduration=2.264032368 podStartE2EDuration="7.425389574s" podCreationTimestamp="2025-12-05 20:54:26 +0000 UTC" firstStartedPulling="2025-12-05 20:54:27.389282693 +0000 UTC m=+737.856590181" lastFinishedPulling="2025-12-05 20:54:32.550639879 +0000 UTC m=+743.017947387" observedRunningTime="2025-12-05 20:54:33.41989592 +0000 UTC m=+743.887203408" watchObservedRunningTime="2025-12-05 20:54:33.425389574 +0000 UTC m=+743.892697072" Dec 05 20:54:36 crc kubenswrapper[4747]: I1205 20:54:36.222700 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:54:36 crc kubenswrapper[4747]: I1205 20:54:36.223099 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:54:37 crc kubenswrapper[4747]: I1205 20:54:37.006305 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-7tm7k" Dec 05 20:54:37 crc kubenswrapper[4747]: I1205 20:54:37.332015 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:37 crc kubenswrapper[4747]: I1205 20:54:37.332099 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:37 crc kubenswrapper[4747]: I1205 20:54:37.341872 4747 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:37 crc kubenswrapper[4747]: I1205 20:54:37.442417 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cd5957c45-vwz54" Dec 05 20:54:37 crc kubenswrapper[4747]: I1205 20:54:37.517242 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rb4b9"] Dec 05 20:54:45 crc kubenswrapper[4747]: I1205 20:54:45.623872 4747 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 20:54:46 crc kubenswrapper[4747]: I1205 20:54:46.973049 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mvgbw" Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.389730 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt"] Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.392409 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.394675 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.401882 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt"] Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.489358 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05f0028b-0044-41f5-84a5-0116ec549d2f-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt\" (UID: \"05f0028b-0044-41f5-84a5-0116ec549d2f\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.489719 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05f0028b-0044-41f5-84a5-0116ec549d2f-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt\" (UID: \"05f0028b-0044-41f5-84a5-0116ec549d2f\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.489744 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfb74\" (UniqueName: \"kubernetes.io/projected/05f0028b-0044-41f5-84a5-0116ec549d2f-kube-api-access-bfb74\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt\" (UID: \"05f0028b-0044-41f5-84a5-0116ec549d2f\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.573360 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-rb4b9" podUID="dab4218f-b06c-473f-8882-5f207a79f403" containerName="console" containerID="cri-o://59ec24865736b6d8fd6e2e2c4a3afcaef18bf1b988a1a839b10fe4dde9c8c776" gracePeriod=15 Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.591296 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05f0028b-0044-41f5-84a5-0116ec549d2f-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt\" (UID: \"05f0028b-0044-41f5-84a5-0116ec549d2f\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.591439 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05f0028b-0044-41f5-84a5-0116ec549d2f-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt\" (UID: \"05f0028b-0044-41f5-84a5-0116ec549d2f\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.591505 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfb74\" (UniqueName: \"kubernetes.io/projected/05f0028b-0044-41f5-84a5-0116ec549d2f-kube-api-access-bfb74\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt\" (UID: \"05f0028b-0044-41f5-84a5-0116ec549d2f\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.591971 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05f0028b-0044-41f5-84a5-0116ec549d2f-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt\" (UID: \"05f0028b-0044-41f5-84a5-0116ec549d2f\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.592083 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05f0028b-0044-41f5-84a5-0116ec549d2f-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt\" (UID: \"05f0028b-0044-41f5-84a5-0116ec549d2f\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.623089 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfb74\" (UniqueName: \"kubernetes.io/projected/05f0028b-0044-41f5-84a5-0116ec549d2f-kube-api-access-bfb74\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt\" (UID: \"05f0028b-0044-41f5-84a5-0116ec549d2f\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.713219 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" Dec 05 20:55:02 crc kubenswrapper[4747]: I1205 20:55:02.945141 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt"] Dec 05 20:55:03 crc kubenswrapper[4747]: I1205 20:55:03.611976 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rb4b9_dab4218f-b06c-473f-8882-5f207a79f403/console/0.log" Dec 05 20:55:03 crc kubenswrapper[4747]: I1205 20:55:03.612024 4747 generic.go:334] "Generic (PLEG): container finished" podID="dab4218f-b06c-473f-8882-5f207a79f403" containerID="59ec24865736b6d8fd6e2e2c4a3afcaef18bf1b988a1a839b10fe4dde9c8c776" exitCode=2 Dec 05 20:55:03 crc kubenswrapper[4747]: I1205 20:55:03.612104 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rb4b9" event={"ID":"dab4218f-b06c-473f-8882-5f207a79f403","Type":"ContainerDied","Data":"59ec24865736b6d8fd6e2e2c4a3afcaef18bf1b988a1a839b10fe4dde9c8c776"} Dec 05 20:55:03 crc kubenswrapper[4747]: I1205 20:55:03.613409 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" event={"ID":"05f0028b-0044-41f5-84a5-0116ec549d2f","Type":"ContainerStarted","Data":"46c99c0636440788612ebd73822e5ecc9d877cf987fbc863c7369d39425a4155"} Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.406373 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rb4b9_dab4218f-b06c-473f-8882-5f207a79f403/console/0.log" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.406440 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.426048 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66dbz\" (UniqueName: \"kubernetes.io/projected/dab4218f-b06c-473f-8882-5f207a79f403-kube-api-access-66dbz\") pod \"dab4218f-b06c-473f-8882-5f207a79f403\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.426136 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-oauth-serving-cert\") pod \"dab4218f-b06c-473f-8882-5f207a79f403\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.426177 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dab4218f-b06c-473f-8882-5f207a79f403-console-serving-cert\") pod \"dab4218f-b06c-473f-8882-5f207a79f403\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.426225 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dab4218f-b06c-473f-8882-5f207a79f403-console-oauth-config\") pod \"dab4218f-b06c-473f-8882-5f207a79f403\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.426253 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-console-config\") pod \"dab4218f-b06c-473f-8882-5f207a79f403\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.426324 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-service-ca\") pod \"dab4218f-b06c-473f-8882-5f207a79f403\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.426440 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-trusted-ca-bundle\") pod \"dab4218f-b06c-473f-8882-5f207a79f403\" (UID: \"dab4218f-b06c-473f-8882-5f207a79f403\") " Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.428174 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dab4218f-b06c-473f-8882-5f207a79f403" (UID: "dab4218f-b06c-473f-8882-5f207a79f403"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.428197 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-service-ca" (OuterVolumeSpecName: "service-ca") pod "dab4218f-b06c-473f-8882-5f207a79f403" (UID: "dab4218f-b06c-473f-8882-5f207a79f403"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.428613 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dab4218f-b06c-473f-8882-5f207a79f403" (UID: "dab4218f-b06c-473f-8882-5f207a79f403"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.430294 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-console-config" (OuterVolumeSpecName: "console-config") pod "dab4218f-b06c-473f-8882-5f207a79f403" (UID: "dab4218f-b06c-473f-8882-5f207a79f403"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.435101 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab4218f-b06c-473f-8882-5f207a79f403-kube-api-access-66dbz" (OuterVolumeSpecName: "kube-api-access-66dbz") pod "dab4218f-b06c-473f-8882-5f207a79f403" (UID: "dab4218f-b06c-473f-8882-5f207a79f403"). InnerVolumeSpecName "kube-api-access-66dbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.436761 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab4218f-b06c-473f-8882-5f207a79f403-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dab4218f-b06c-473f-8882-5f207a79f403" (UID: "dab4218f-b06c-473f-8882-5f207a79f403"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.437995 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab4218f-b06c-473f-8882-5f207a79f403-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dab4218f-b06c-473f-8882-5f207a79f403" (UID: "dab4218f-b06c-473f-8882-5f207a79f403"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.528012 4747 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dab4218f-b06c-473f-8882-5f207a79f403-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.528070 4747 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.528091 4747 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.528107 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.528126 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66dbz\" (UniqueName: \"kubernetes.io/projected/dab4218f-b06c-473f-8882-5f207a79f403-kube-api-access-66dbz\") on node \"crc\" DevicePath \"\"" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.528143 4747 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dab4218f-b06c-473f-8882-5f207a79f403-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.528159 4747 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dab4218f-b06c-473f-8882-5f207a79f403-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.621345 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rb4b9_dab4218f-b06c-473f-8882-5f207a79f403/console/0.log" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.621489 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rb4b9" event={"ID":"dab4218f-b06c-473f-8882-5f207a79f403","Type":"ContainerDied","Data":"83f53625aca63a19341cca8751bf495cbc9a60d65f84cb3a6d1eeffbf8214a04"} Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.621531 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rb4b9" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.621545 4747 scope.go:117] "RemoveContainer" containerID="59ec24865736b6d8fd6e2e2c4a3afcaef18bf1b988a1a839b10fe4dde9c8c776" Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.627365 4747 generic.go:334] "Generic (PLEG): container finished" podID="05f0028b-0044-41f5-84a5-0116ec549d2f" containerID="866e1654e4bc7120d01570f56e4c3d99b0e9534e704764b12723b5c326b74253" exitCode=0 Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.627436 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" event={"ID":"05f0028b-0044-41f5-84a5-0116ec549d2f","Type":"ContainerDied","Data":"866e1654e4bc7120d01570f56e4c3d99b0e9534e704764b12723b5c326b74253"} Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.687242 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rb4b9"] Dec 05 20:55:04 crc kubenswrapper[4747]: I1205 20:55:04.691389 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-rb4b9"] Dec 05 20:55:05 crc kubenswrapper[4747]: I1205 20:55:05.857329 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab4218f-b06c-473f-8882-5f207a79f403" path="/var/lib/kubelet/pods/dab4218f-b06c-473f-8882-5f207a79f403/volumes" Dec 05 20:55:05 crc kubenswrapper[4747]: I1205 20:55:05.947075 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-crtgv"] Dec 05 20:55:05 crc kubenswrapper[4747]: E1205 20:55:05.947335 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab4218f-b06c-473f-8882-5f207a79f403" containerName="console" Dec 05 20:55:05 crc kubenswrapper[4747]: I1205 20:55:05.947354 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab4218f-b06c-473f-8882-5f207a79f403" containerName="console" Dec 05 20:55:05 crc kubenswrapper[4747]: I1205 20:55:05.947482 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab4218f-b06c-473f-8882-5f207a79f403" containerName="console" Dec 05 20:55:05 crc kubenswrapper[4747]: I1205 20:55:05.948348 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:05 crc kubenswrapper[4747]: I1205 20:55:05.954945 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-crtgv"] Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.054360 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nwbd\" (UniqueName: \"kubernetes.io/projected/18b64226-8359-4191-bd14-3b5489307a67-kube-api-access-6nwbd\") pod \"redhat-operators-crtgv\" (UID: \"18b64226-8359-4191-bd14-3b5489307a67\") " pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.054452 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18b64226-8359-4191-bd14-3b5489307a67-utilities\") pod \"redhat-operators-crtgv\" (UID: \"18b64226-8359-4191-bd14-3b5489307a67\") " pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.054486 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18b64226-8359-4191-bd14-3b5489307a67-catalog-content\") pod \"redhat-operators-crtgv\" (UID: \"18b64226-8359-4191-bd14-3b5489307a67\") " pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.155375 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18b64226-8359-4191-bd14-3b5489307a67-catalog-content\") pod \"redhat-operators-crtgv\" (UID: \"18b64226-8359-4191-bd14-3b5489307a67\") " pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.155798 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nwbd\" (UniqueName: \"kubernetes.io/projected/18b64226-8359-4191-bd14-3b5489307a67-kube-api-access-6nwbd\") pod \"redhat-operators-crtgv\" (UID: \"18b64226-8359-4191-bd14-3b5489307a67\") " pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.155845 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18b64226-8359-4191-bd14-3b5489307a67-utilities\") pod \"redhat-operators-crtgv\" (UID: \"18b64226-8359-4191-bd14-3b5489307a67\") " pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.155922 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18b64226-8359-4191-bd14-3b5489307a67-catalog-content\") pod \"redhat-operators-crtgv\" (UID: \"18b64226-8359-4191-bd14-3b5489307a67\") " pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.156161 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18b64226-8359-4191-bd14-3b5489307a67-utilities\") pod \"redhat-operators-crtgv\" (UID: \"18b64226-8359-4191-bd14-3b5489307a67\") " pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.175610 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6nwbd\" (UniqueName: \"kubernetes.io/projected/18b64226-8359-4191-bd14-3b5489307a67-kube-api-access-6nwbd\") pod \"redhat-operators-crtgv\" (UID: \"18b64226-8359-4191-bd14-3b5489307a67\") " pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.221876 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.221939 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.221981 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.222649 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ceb16b96a12723254ee4deb822f6ff1371e5e51859e0f470b05fcb9c79ec859d"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.222704 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://ceb16b96a12723254ee4deb822f6ff1371e5e51859e0f470b05fcb9c79ec859d" gracePeriod=600 Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.332654 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.644697 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="ceb16b96a12723254ee4deb822f6ff1371e5e51859e0f470b05fcb9c79ec859d" exitCode=0 Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.644773 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"ceb16b96a12723254ee4deb822f6ff1371e5e51859e0f470b05fcb9c79ec859d"} Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.645137 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"7f4bd613004b53750fa0f6da7c8719b897b6bb46a34063c5d37309224cc6de70"} Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.645163 4747 scope.go:117] "RemoveContainer" containerID="370b9a3316ca0b85e9ac26caebee714f137ba55465083c839b97ac8f29dd1f9b" Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.647264 4747 generic.go:334] "Generic (PLEG): container finished" podID="05f0028b-0044-41f5-84a5-0116ec549d2f" containerID="739256d0bf89ea02784335bd063ecbab524d84116b9942f8cd1a7e9d1ab7f9ad" exitCode=0 Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.647311 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" event={"ID":"05f0028b-0044-41f5-84a5-0116ec549d2f","Type":"ContainerDied","Data":"739256d0bf89ea02784335bd063ecbab524d84116b9942f8cd1a7e9d1ab7f9ad"} Dec 05 20:55:06 crc kubenswrapper[4747]: I1205 20:55:06.758640 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-crtgv"] Dec 05 20:55:06 crc kubenswrapper[4747]: W1205 20:55:06.767744 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18b64226_8359_4191_bd14_3b5489307a67.slice/crio-f33302b5456a35520fe039bc4d10bfe95b43e2e1a511c3dcbb4f984c9b87f2ad WatchSource:0}: Error finding container f33302b5456a35520fe039bc4d10bfe95b43e2e1a511c3dcbb4f984c9b87f2ad: Status 404 returned error can't find the container with id f33302b5456a35520fe039bc4d10bfe95b43e2e1a511c3dcbb4f984c9b87f2ad Dec 05 20:55:07 crc kubenswrapper[4747]: I1205 20:55:07.663220 4747 generic.go:334] "Generic (PLEG): container finished" podID="18b64226-8359-4191-bd14-3b5489307a67" containerID="3602e038d7b65aa8c7f23ac197686abf7fb20a9e44ee4beed4e87aff4aaef916" exitCode=0 Dec 05 20:55:07 crc kubenswrapper[4747]: I1205 20:55:07.663322 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crtgv" event={"ID":"18b64226-8359-4191-bd14-3b5489307a67","Type":"ContainerDied","Data":"3602e038d7b65aa8c7f23ac197686abf7fb20a9e44ee4beed4e87aff4aaef916"} Dec 05 20:55:07 crc kubenswrapper[4747]: I1205 20:55:07.663653 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crtgv" event={"ID":"18b64226-8359-4191-bd14-3b5489307a67","Type":"ContainerStarted","Data":"f33302b5456a35520fe039bc4d10bfe95b43e2e1a511c3dcbb4f984c9b87f2ad"} Dec 05 20:55:07 crc kubenswrapper[4747]: I1205 20:55:07.669951 4747 generic.go:334] "Generic (PLEG): container finished" 
podID="05f0028b-0044-41f5-84a5-0116ec549d2f" containerID="b55f7461ea56c8bf6515613aba006ef6a782f689dda047f3d599fc3e48037c32" exitCode=0 Dec 05 20:55:07 crc kubenswrapper[4747]: I1205 20:55:07.669995 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" event={"ID":"05f0028b-0044-41f5-84a5-0116ec549d2f","Type":"ContainerDied","Data":"b55f7461ea56c8bf6515613aba006ef6a782f689dda047f3d599fc3e48037c32"} Dec 05 20:55:08 crc kubenswrapper[4747]: I1205 20:55:08.681158 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crtgv" event={"ID":"18b64226-8359-4191-bd14-3b5489307a67","Type":"ContainerStarted","Data":"de948e01fbe147fda5ac29fbb7cb361e30f8a61f22c90d61c02504d4c3d46af0"} Dec 05 20:55:08 crc kubenswrapper[4747]: I1205 20:55:08.954697 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" Dec 05 20:55:08 crc kubenswrapper[4747]: I1205 20:55:08.997966 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05f0028b-0044-41f5-84a5-0116ec549d2f-util\") pod \"05f0028b-0044-41f5-84a5-0116ec549d2f\" (UID: \"05f0028b-0044-41f5-84a5-0116ec549d2f\") " Dec 05 20:55:08 crc kubenswrapper[4747]: I1205 20:55:08.998030 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfb74\" (UniqueName: \"kubernetes.io/projected/05f0028b-0044-41f5-84a5-0116ec549d2f-kube-api-access-bfb74\") pod \"05f0028b-0044-41f5-84a5-0116ec549d2f\" (UID: \"05f0028b-0044-41f5-84a5-0116ec549d2f\") " Dec 05 20:55:08 crc kubenswrapper[4747]: I1205 20:55:08.998132 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05f0028b-0044-41f5-84a5-0116ec549d2f-bundle\") pod \"05f0028b-0044-41f5-84a5-0116ec549d2f\" (UID: \"05f0028b-0044-41f5-84a5-0116ec549d2f\") " Dec 05 20:55:08 crc kubenswrapper[4747]: I1205 20:55:08.999521 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05f0028b-0044-41f5-84a5-0116ec549d2f-bundle" (OuterVolumeSpecName: "bundle") pod "05f0028b-0044-41f5-84a5-0116ec549d2f" (UID: "05f0028b-0044-41f5-84a5-0116ec549d2f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:55:09 crc kubenswrapper[4747]: I1205 20:55:09.005154 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05f0028b-0044-41f5-84a5-0116ec549d2f-kube-api-access-bfb74" (OuterVolumeSpecName: "kube-api-access-bfb74") pod "05f0028b-0044-41f5-84a5-0116ec549d2f" (UID: "05f0028b-0044-41f5-84a5-0116ec549d2f"). InnerVolumeSpecName "kube-api-access-bfb74". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:55:09 crc kubenswrapper[4747]: I1205 20:55:09.014482 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05f0028b-0044-41f5-84a5-0116ec549d2f-util" (OuterVolumeSpecName: "util") pod "05f0028b-0044-41f5-84a5-0116ec549d2f" (UID: "05f0028b-0044-41f5-84a5-0116ec549d2f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:55:09 crc kubenswrapper[4747]: I1205 20:55:09.099976 4747 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05f0028b-0044-41f5-84a5-0116ec549d2f-util\") on node \"crc\" DevicePath \"\"" Dec 05 20:55:09 crc kubenswrapper[4747]: I1205 20:55:09.100015 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfb74\" (UniqueName: \"kubernetes.io/projected/05f0028b-0044-41f5-84a5-0116ec549d2f-kube-api-access-bfb74\") on node \"crc\" DevicePath \"\"" Dec 05 20:55:09 crc kubenswrapper[4747]: I1205 20:55:09.100031 4747 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05f0028b-0044-41f5-84a5-0116ec549d2f-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:55:09 crc kubenswrapper[4747]: I1205 20:55:09.691441 4747 generic.go:334] "Generic (PLEG): container finished" podID="18b64226-8359-4191-bd14-3b5489307a67" containerID="de948e01fbe147fda5ac29fbb7cb361e30f8a61f22c90d61c02504d4c3d46af0" exitCode=0 Dec 05 20:55:09 crc kubenswrapper[4747]: I1205 20:55:09.691625 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crtgv" event={"ID":"18b64226-8359-4191-bd14-3b5489307a67","Type":"ContainerDied","Data":"de948e01fbe147fda5ac29fbb7cb361e30f8a61f22c90d61c02504d4c3d46af0"} Dec 05 20:55:09 crc kubenswrapper[4747]: I1205 20:55:09.697887 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" event={"ID":"05f0028b-0044-41f5-84a5-0116ec549d2f","Type":"ContainerDied","Data":"46c99c0636440788612ebd73822e5ecc9d877cf987fbc863c7369d39425a4155"} Dec 05 20:55:09 crc kubenswrapper[4747]: I1205 20:55:09.698114 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt" Dec 05 20:55:09 crc kubenswrapper[4747]: I1205 20:55:09.698142 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46c99c0636440788612ebd73822e5ecc9d877cf987fbc863c7369d39425a4155" Dec 05 20:55:10 crc kubenswrapper[4747]: I1205 20:55:10.708715 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crtgv" event={"ID":"18b64226-8359-4191-bd14-3b5489307a67","Type":"ContainerStarted","Data":"f2b4c626987234955a3b505c1a8ee18ab1bc9af2e839f30f05cdf5d54cd09c5a"} Dec 05 20:55:16 crc kubenswrapper[4747]: I1205 20:55:16.333492 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:16 crc kubenswrapper[4747]: I1205 20:55:16.334122 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:16 crc kubenswrapper[4747]: I1205 20:55:16.388149 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:16 crc kubenswrapper[4747]: I1205 20:55:16.413750 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-crtgv" podStartSLOduration=8.983011601 podStartE2EDuration="11.413699552s" podCreationTimestamp="2025-12-05 20:55:05 +0000 UTC" firstStartedPulling="2025-12-05 20:55:07.66999801 +0000 UTC m=+778.137305538" lastFinishedPulling="2025-12-05 20:55:10.100685961 +0000 UTC m=+780.567993489" observedRunningTime="2025-12-05 20:55:10.740054173 +0000 UTC m=+781.207361701" watchObservedRunningTime="2025-12-05 20:55:16.413699552 +0000 UTC m=+786.881007040" Dec 05 20:55:16 crc kubenswrapper[4747]: I1205 20:55:16.788653 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:19 crc kubenswrapper[4747]: I1205 20:55:19.934682 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-crtgv"] Dec 05 20:55:19 crc kubenswrapper[4747]: I1205 20:55:19.935625 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-crtgv" podUID="18b64226-8359-4191-bd14-3b5489307a67" containerName="registry-server" containerID="cri-o://f2b4c626987234955a3b505c1a8ee18ab1bc9af2e839f30f05cdf5d54cd09c5a" gracePeriod=2 Dec 05 20:55:20 crc kubenswrapper[4747]: I1205 20:55:20.925604 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq"] Dec 05 20:55:20 crc kubenswrapper[4747]: E1205 20:55:20.925854 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f0028b-0044-41f5-84a5-0116ec549d2f" containerName="util" Dec 05 20:55:20 crc kubenswrapper[4747]: I1205 20:55:20.925883 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f0028b-0044-41f5-84a5-0116ec549d2f" containerName="util" Dec 05 20:55:20 crc kubenswrapper[4747]: E1205 20:55:20.925898 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f0028b-0044-41f5-84a5-0116ec549d2f" containerName="extract" Dec 05 20:55:20 crc kubenswrapper[4747]: I1205 20:55:20.925904 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f0028b-0044-41f5-84a5-0116ec549d2f" containerName="extract" Dec 05 20:55:20 crc 
kubenswrapper[4747]: E1205 20:55:20.925913 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f0028b-0044-41f5-84a5-0116ec549d2f" containerName="pull" Dec 05 20:55:20 crc kubenswrapper[4747]: I1205 20:55:20.925919 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f0028b-0044-41f5-84a5-0116ec549d2f" containerName="pull" Dec 05 20:55:20 crc kubenswrapper[4747]: I1205 20:55:20.926050 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="05f0028b-0044-41f5-84a5-0116ec549d2f" containerName="extract" Dec 05 20:55:20 crc kubenswrapper[4747]: I1205 20:55:20.926540 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" Dec 05 20:55:20 crc kubenswrapper[4747]: I1205 20:55:20.930426 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-zqj24" Dec 05 20:55:20 crc kubenswrapper[4747]: I1205 20:55:20.930449 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 05 20:55:20 crc kubenswrapper[4747]: I1205 20:55:20.930450 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 05 20:55:20 crc kubenswrapper[4747]: I1205 20:55:20.930852 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 05 20:55:20 crc kubenswrapper[4747]: I1205 20:55:20.931906 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 05 20:55:20 crc kubenswrapper[4747]: I1205 20:55:20.945171 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq"] Dec 05 20:55:20 crc kubenswrapper[4747]: I1205 20:55:20.960284 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqc2p\" (UniqueName: \"kubernetes.io/projected/a6646e53-609f-4dc3-b299-aae36ce927fc-kube-api-access-lqc2p\") pod \"metallb-operator-controller-manager-5c77dc54f8-9hjvq\" (UID: \"a6646e53-609f-4dc3-b299-aae36ce927fc\") " pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" Dec 05 20:55:20 crc kubenswrapper[4747]: I1205 20:55:20.960715 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a6646e53-609f-4dc3-b299-aae36ce927fc-webhook-cert\") pod \"metallb-operator-controller-manager-5c77dc54f8-9hjvq\" (UID: \"a6646e53-609f-4dc3-b299-aae36ce927fc\") " pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" Dec 05 20:55:20 crc kubenswrapper[4747]: I1205 20:55:20.960852 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a6646e53-609f-4dc3-b299-aae36ce927fc-apiservice-cert\") pod \"metallb-operator-controller-manager-5c77dc54f8-9hjvq\" (UID: \"a6646e53-609f-4dc3-b299-aae36ce927fc\") " pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.061602 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqc2p\" (UniqueName: \"kubernetes.io/projected/a6646e53-609f-4dc3-b299-aae36ce927fc-kube-api-access-lqc2p\") pod 
\"metallb-operator-controller-manager-5c77dc54f8-9hjvq\" (UID: \"a6646e53-609f-4dc3-b299-aae36ce927fc\") " pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.061669 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a6646e53-609f-4dc3-b299-aae36ce927fc-webhook-cert\") pod \"metallb-operator-controller-manager-5c77dc54f8-9hjvq\" (UID: \"a6646e53-609f-4dc3-b299-aae36ce927fc\") " pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.061741 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a6646e53-609f-4dc3-b299-aae36ce927fc-apiservice-cert\") pod \"metallb-operator-controller-manager-5c77dc54f8-9hjvq\" (UID: \"a6646e53-609f-4dc3-b299-aae36ce927fc\") " pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.068528 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a6646e53-609f-4dc3-b299-aae36ce927fc-apiservice-cert\") pod \"metallb-operator-controller-manager-5c77dc54f8-9hjvq\" (UID: \"a6646e53-609f-4dc3-b299-aae36ce927fc\") " pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.070293 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a6646e53-609f-4dc3-b299-aae36ce927fc-webhook-cert\") pod \"metallb-operator-controller-manager-5c77dc54f8-9hjvq\" (UID: \"a6646e53-609f-4dc3-b299-aae36ce927fc\") " pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.087529 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqc2p\" (UniqueName: \"kubernetes.io/projected/a6646e53-609f-4dc3-b299-aae36ce927fc-kube-api-access-lqc2p\") pod \"metallb-operator-controller-manager-5c77dc54f8-9hjvq\" (UID: \"a6646e53-609f-4dc3-b299-aae36ce927fc\") " pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.241739 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.367491 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f"] Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.368369 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.370568 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.370695 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-dfqx9" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.370774 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.381903 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f"] Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.466459 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a21b504-6988-45fc-88b9-4108efc34b06-apiservice-cert\") pod \"metallb-operator-webhook-server-7d5d4d9899-rfr7f\" (UID: \"2a21b504-6988-45fc-88b9-4108efc34b06\") " pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.466784 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8bvx\" (UniqueName: \"kubernetes.io/projected/2a21b504-6988-45fc-88b9-4108efc34b06-kube-api-access-h8bvx\") pod \"metallb-operator-webhook-server-7d5d4d9899-rfr7f\" (UID: \"2a21b504-6988-45fc-88b9-4108efc34b06\") " pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.466821 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a21b504-6988-45fc-88b9-4108efc34b06-webhook-cert\") pod \"metallb-operator-webhook-server-7d5d4d9899-rfr7f\" (UID: \"2a21b504-6988-45fc-88b9-4108efc34b06\") " pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.568411 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a21b504-6988-45fc-88b9-4108efc34b06-apiservice-cert\") pod \"metallb-operator-webhook-server-7d5d4d9899-rfr7f\" (UID: \"2a21b504-6988-45fc-88b9-4108efc34b06\") " pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.568488 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8bvx\" (UniqueName: \"kubernetes.io/projected/2a21b504-6988-45fc-88b9-4108efc34b06-kube-api-access-h8bvx\") pod \"metallb-operator-webhook-server-7d5d4d9899-rfr7f\" (UID: \"2a21b504-6988-45fc-88b9-4108efc34b06\") " pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.568547 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a21b504-6988-45fc-88b9-4108efc34b06-webhook-cert\") pod \"metallb-operator-webhook-server-7d5d4d9899-rfr7f\" (UID: \"2a21b504-6988-45fc-88b9-4108efc34b06\") " pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 
20:55:21.576522 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a21b504-6988-45fc-88b9-4108efc34b06-apiservice-cert\") pod \"metallb-operator-webhook-server-7d5d4d9899-rfr7f\" (UID: \"2a21b504-6988-45fc-88b9-4108efc34b06\") " pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.595306 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a21b504-6988-45fc-88b9-4108efc34b06-webhook-cert\") pod \"metallb-operator-webhook-server-7d5d4d9899-rfr7f\" (UID: \"2a21b504-6988-45fc-88b9-4108efc34b06\") " pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.608769 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8bvx\" (UniqueName: \"kubernetes.io/projected/2a21b504-6988-45fc-88b9-4108efc34b06-kube-api-access-h8bvx\") pod \"metallb-operator-webhook-server-7d5d4d9899-rfr7f\" (UID: \"2a21b504-6988-45fc-88b9-4108efc34b06\") " pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.659975 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.741394 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.770239 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nwbd\" (UniqueName: \"kubernetes.io/projected/18b64226-8359-4191-bd14-3b5489307a67-kube-api-access-6nwbd\") pod \"18b64226-8359-4191-bd14-3b5489307a67\" (UID: \"18b64226-8359-4191-bd14-3b5489307a67\") " Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.770310 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18b64226-8359-4191-bd14-3b5489307a67-utilities\") pod \"18b64226-8359-4191-bd14-3b5489307a67\" (UID: \"18b64226-8359-4191-bd14-3b5489307a67\") " Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.770348 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18b64226-8359-4191-bd14-3b5489307a67-catalog-content\") pod \"18b64226-8359-4191-bd14-3b5489307a67\" (UID: \"18b64226-8359-4191-bd14-3b5489307a67\") " Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.771690 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18b64226-8359-4191-bd14-3b5489307a67-utilities" (OuterVolumeSpecName: "utilities") pod "18b64226-8359-4191-bd14-3b5489307a67" (UID: "18b64226-8359-4191-bd14-3b5489307a67"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.778209 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq"] Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.786623 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b64226-8359-4191-bd14-3b5489307a67-kube-api-access-6nwbd" (OuterVolumeSpecName: "kube-api-access-6nwbd") pod "18b64226-8359-4191-bd14-3b5489307a67" (UID: "18b64226-8359-4191-bd14-3b5489307a67"). InnerVolumeSpecName "kube-api-access-6nwbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:55:21 crc kubenswrapper[4747]: W1205 20:55:21.803740 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6646e53_609f_4dc3_b299_aae36ce927fc.slice/crio-ccd824248f41dc4cfa7d4bac1c5ffdadfdc4f36339159917606e72271395288c WatchSource:0}: Error finding container ccd824248f41dc4cfa7d4bac1c5ffdadfdc4f36339159917606e72271395288c: Status 404 returned error can't find the container with id ccd824248f41dc4cfa7d4bac1c5ffdadfdc4f36339159917606e72271395288c Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.818633 4747 generic.go:334] "Generic (PLEG): container finished" podID="18b64226-8359-4191-bd14-3b5489307a67" containerID="f2b4c626987234955a3b505c1a8ee18ab1bc9af2e839f30f05cdf5d54cd09c5a" exitCode=0 Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.818700 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crtgv" event={"ID":"18b64226-8359-4191-bd14-3b5489307a67","Type":"ContainerDied","Data":"f2b4c626987234955a3b505c1a8ee18ab1bc9af2e839f30f05cdf5d54cd09c5a"} Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.818743 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crtgv" event={"ID":"18b64226-8359-4191-bd14-3b5489307a67","Type":"ContainerDied","Data":"f33302b5456a35520fe039bc4d10bfe95b43e2e1a511c3dcbb4f984c9b87f2ad"} Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.818767 4747 scope.go:117] "RemoveContainer" containerID="f2b4c626987234955a3b505c1a8ee18ab1bc9af2e839f30f05cdf5d54cd09c5a" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.818947 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-crtgv" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.872550 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nwbd\" (UniqueName: \"kubernetes.io/projected/18b64226-8359-4191-bd14-3b5489307a67-kube-api-access-6nwbd\") on node \"crc\" DevicePath \"\"" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.872597 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18b64226-8359-4191-bd14-3b5489307a67-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.882345 4747 scope.go:117] "RemoveContainer" containerID="de948e01fbe147fda5ac29fbb7cb361e30f8a61f22c90d61c02504d4c3d46af0" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.904667 4747 scope.go:117] "RemoveContainer" containerID="3602e038d7b65aa8c7f23ac197686abf7fb20a9e44ee4beed4e87aff4aaef916" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.940846 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18b64226-8359-4191-bd14-3b5489307a67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18b64226-8359-4191-bd14-3b5489307a67" (UID: "18b64226-8359-4191-bd14-3b5489307a67"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.971123 4747 scope.go:117] "RemoveContainer" containerID="f2b4c626987234955a3b505c1a8ee18ab1bc9af2e839f30f05cdf5d54cd09c5a" Dec 05 20:55:21 crc kubenswrapper[4747]: E1205 20:55:21.974689 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b4c626987234955a3b505c1a8ee18ab1bc9af2e839f30f05cdf5d54cd09c5a\": container with ID starting with f2b4c626987234955a3b505c1a8ee18ab1bc9af2e839f30f05cdf5d54cd09c5a not found: ID does not exist" containerID="f2b4c626987234955a3b505c1a8ee18ab1bc9af2e839f30f05cdf5d54cd09c5a" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.974720 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b4c626987234955a3b505c1a8ee18ab1bc9af2e839f30f05cdf5d54cd09c5a"} err="failed to get container status \"f2b4c626987234955a3b505c1a8ee18ab1bc9af2e839f30f05cdf5d54cd09c5a\": rpc error: code = NotFound desc = could not find container \"f2b4c626987234955a3b505c1a8ee18ab1bc9af2e839f30f05cdf5d54cd09c5a\": container with ID starting with f2b4c626987234955a3b505c1a8ee18ab1bc9af2e839f30f05cdf5d54cd09c5a not found: ID does not exist" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.974747 4747 scope.go:117] "RemoveContainer" containerID="de948e01fbe147fda5ac29fbb7cb361e30f8a61f22c90d61c02504d4c3d46af0" Dec 05 20:55:21 crc kubenswrapper[4747]: E1205 20:55:21.975063 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de948e01fbe147fda5ac29fbb7cb361e30f8a61f22c90d61c02504d4c3d46af0\": container with ID starting with de948e01fbe147fda5ac29fbb7cb361e30f8a61f22c90d61c02504d4c3d46af0 not found: ID does not exist" containerID="de948e01fbe147fda5ac29fbb7cb361e30f8a61f22c90d61c02504d4c3d46af0" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.975111 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de948e01fbe147fda5ac29fbb7cb361e30f8a61f22c90d61c02504d4c3d46af0"} err="failed to get 
container status \"de948e01fbe147fda5ac29fbb7cb361e30f8a61f22c90d61c02504d4c3d46af0\": rpc error: code = NotFound desc = could not find container \"de948e01fbe147fda5ac29fbb7cb361e30f8a61f22c90d61c02504d4c3d46af0\": container with ID starting with de948e01fbe147fda5ac29fbb7cb361e30f8a61f22c90d61c02504d4c3d46af0 not found: ID does not exist" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.975123 4747 scope.go:117] "RemoveContainer" containerID="3602e038d7b65aa8c7f23ac197686abf7fb20a9e44ee4beed4e87aff4aaef916" Dec 05 20:55:21 crc kubenswrapper[4747]: E1205 20:55:21.975440 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3602e038d7b65aa8c7f23ac197686abf7fb20a9e44ee4beed4e87aff4aaef916\": container with ID starting with 3602e038d7b65aa8c7f23ac197686abf7fb20a9e44ee4beed4e87aff4aaef916 not found: ID does not exist" containerID="3602e038d7b65aa8c7f23ac197686abf7fb20a9e44ee4beed4e87aff4aaef916" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.975455 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3602e038d7b65aa8c7f23ac197686abf7fb20a9e44ee4beed4e87aff4aaef916"} err="failed to get container status \"3602e038d7b65aa8c7f23ac197686abf7fb20a9e44ee4beed4e87aff4aaef916\": rpc error: code = NotFound desc = could not find container \"3602e038d7b65aa8c7f23ac197686abf7fb20a9e44ee4beed4e87aff4aaef916\": container with ID starting with 3602e038d7b65aa8c7f23ac197686abf7fb20a9e44ee4beed4e87aff4aaef916 not found: ID does not exist" Dec 05 20:55:21 crc kubenswrapper[4747]: I1205 20:55:21.976872 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18b64226-8359-4191-bd14-3b5489307a67-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:55:22 crc kubenswrapper[4747]: I1205 20:55:22.104151 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f"] Dec 05 20:55:22 crc kubenswrapper[4747]: W1205 20:55:22.109172 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a21b504_6988_45fc_88b9_4108efc34b06.slice/crio-6d849d610270c2955926143c20a1c758692ffa79f3308ec0a716a6e6eb2804b8 WatchSource:0}: Error finding container 6d849d610270c2955926143c20a1c758692ffa79f3308ec0a716a6e6eb2804b8: Status 404 returned error can't find the container with id 6d849d610270c2955926143c20a1c758692ffa79f3308ec0a716a6e6eb2804b8 Dec 05 20:55:22 crc kubenswrapper[4747]: I1205 20:55:22.150433 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-crtgv"] Dec 05 20:55:22 crc kubenswrapper[4747]: I1205 20:55:22.153858 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-crtgv"] Dec 05 20:55:22 crc kubenswrapper[4747]: I1205 20:55:22.826788 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" event={"ID":"2a21b504-6988-45fc-88b9-4108efc34b06","Type":"ContainerStarted","Data":"6d849d610270c2955926143c20a1c758692ffa79f3308ec0a716a6e6eb2804b8"} Dec 05 20:55:22 crc kubenswrapper[4747]: I1205 20:55:22.828426 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" 
event={"ID":"a6646e53-609f-4dc3-b299-aae36ce927fc","Type":"ContainerStarted","Data":"ccd824248f41dc4cfa7d4bac1c5ffdadfdc4f36339159917606e72271395288c"} Dec 05 20:55:23 crc kubenswrapper[4747]: I1205 20:55:23.847738 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18b64226-8359-4191-bd14-3b5489307a67" path="/var/lib/kubelet/pods/18b64226-8359-4191-bd14-3b5489307a67/volumes" Dec 05 20:55:27 crc kubenswrapper[4747]: I1205 20:55:27.860358 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" event={"ID":"a6646e53-609f-4dc3-b299-aae36ce927fc","Type":"ContainerStarted","Data":"5f253f8e9774e7f72983ea8ba11e6f3133f6883b3980adab069fcbf7ac4c5713"} Dec 05 20:55:27 crc kubenswrapper[4747]: I1205 20:55:27.860993 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" Dec 05 20:55:27 crc kubenswrapper[4747]: I1205 20:55:27.862764 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" event={"ID":"2a21b504-6988-45fc-88b9-4108efc34b06","Type":"ContainerStarted","Data":"b318750dece3e138bf4417028a57c82ab6fd1eec3693a843cabd44c2e8ce112a"} Dec 05 20:55:27 crc kubenswrapper[4747]: I1205 20:55:27.862922 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" Dec 05 20:55:27 crc kubenswrapper[4747]: I1205 20:55:27.882617 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" podStartSLOduration=2.925186983 podStartE2EDuration="7.882570563s" podCreationTimestamp="2025-12-05 20:55:20 +0000 UTC" firstStartedPulling="2025-12-05 20:55:21.81784182 +0000 UTC m=+792.285149308" lastFinishedPulling="2025-12-05 20:55:26.77522541 +0000 UTC m=+797.242532888" observedRunningTime="2025-12-05 20:55:27.880204733 +0000 UTC m=+798.347512251" watchObservedRunningTime="2025-12-05 20:55:27.882570563 +0000 UTC m=+798.349878071" Dec 05 20:55:27 crc kubenswrapper[4747]: I1205 20:55:27.901731 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" podStartSLOduration=2.237521883 podStartE2EDuration="6.901711171s" podCreationTimestamp="2025-12-05 20:55:21 +0000 UTC" firstStartedPulling="2025-12-05 20:55:22.114939911 +0000 UTC m=+792.582247409" lastFinishedPulling="2025-12-05 20:55:26.779129209 +0000 UTC m=+797.246436697" observedRunningTime="2025-12-05 20:55:27.900156461 +0000 UTC m=+798.367463959" watchObservedRunningTime="2025-12-05 20:55:27.901711171 +0000 UTC m=+798.369018659" Dec 05 20:55:41 crc kubenswrapper[4747]: I1205 20:55:41.746978 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7d5d4d9899-rfr7f" Dec 05 20:56:01 crc kubenswrapper[4747]: I1205 20:56:01.244749 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5c77dc54f8-9hjvq" Dec 05 20:56:01 crc kubenswrapper[4747]: I1205 20:56:01.976270 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd"] Dec 05 20:56:01 crc kubenswrapper[4747]: E1205 20:56:01.977150 4747 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="18b64226-8359-4191-bd14-3b5489307a67" containerName="extract-utilities" Dec 05 20:56:01 crc kubenswrapper[4747]: I1205 20:56:01.977172 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b64226-8359-4191-bd14-3b5489307a67" containerName="extract-utilities" Dec 05 20:56:01 crc kubenswrapper[4747]: E1205 20:56:01.977184 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b64226-8359-4191-bd14-3b5489307a67" containerName="extract-content" Dec 05 20:56:01 crc kubenswrapper[4747]: I1205 20:56:01.977192 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b64226-8359-4191-bd14-3b5489307a67" containerName="extract-content" Dec 05 20:56:01 crc kubenswrapper[4747]: E1205 20:56:01.977204 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b64226-8359-4191-bd14-3b5489307a67" containerName="registry-server" Dec 05 20:56:01 crc kubenswrapper[4747]: I1205 20:56:01.977211 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b64226-8359-4191-bd14-3b5489307a67" containerName="registry-server" Dec 05 20:56:01 crc kubenswrapper[4747]: I1205 20:56:01.977325 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b64226-8359-4191-bd14-3b5489307a67" containerName="registry-server" Dec 05 20:56:01 crc kubenswrapper[4747]: I1205 20:56:01.977870 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd" Dec 05 20:56:01 crc kubenswrapper[4747]: I1205 20:56:01.980350 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 05 20:56:01 crc kubenswrapper[4747]: I1205 20:56:01.981429 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-lmjgz" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.044229 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-2t7n7"] Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.047187 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.049475 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.049807 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.049924 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd"] Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.128919 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-8z2dk"] Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.130309 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-8z2dk" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.132993 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.133173 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.133197 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-cgmpc" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.133419 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.138083 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-qs6g9"] Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.139498 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-qs6g9" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.144127 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.146913 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-qs6g9"] Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.151162 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/393b222d-ddf3-41e6-92c8-7379f58219aa-reloader\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.151217 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctps8\" (UniqueName: \"kubernetes.io/projected/393b222d-ddf3-41e6-92c8-7379f58219aa-kube-api-access-ctps8\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.152380 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/393b222d-ddf3-41e6-92c8-7379f58219aa-metrics-certs\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.152453 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/393b222d-ddf3-41e6-92c8-7379f58219aa-frr-sockets\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.152525 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/393b222d-ddf3-41e6-92c8-7379f58219aa-frr-conf\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.152553 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/26a6d099-70ce-4ce9-b75e-b7696ffe6dea-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5blxd\" (UID: \"26a6d099-70ce-4ce9-b75e-b7696ffe6dea\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.152617 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/393b222d-ddf3-41e6-92c8-7379f58219aa-metrics\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.152643 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vczqk\" (UniqueName: \"kubernetes.io/projected/26a6d099-70ce-4ce9-b75e-b7696ffe6dea-kube-api-access-vczqk\") pod \"frr-k8s-webhook-server-7fcb986d4-5blxd\" (UID: \"26a6d099-70ce-4ce9-b75e-b7696ffe6dea\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.152673 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/393b222d-ddf3-41e6-92c8-7379f58219aa-frr-startup\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.254208 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wv4r\" (UniqueName: \"kubernetes.io/projected/1b95125e-a8ab-4566-bf3d-d901a5c4044b-kube-api-access-8wv4r\") pod \"speaker-8z2dk\" (UID: \"1b95125e-a8ab-4566-bf3d-d901a5c4044b\") " pod="metallb-system/speaker-8z2dk" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.254329 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1b95125e-a8ab-4566-bf3d-d901a5c4044b-metallb-excludel2\") pod \"speaker-8z2dk\" (UID: \"1b95125e-a8ab-4566-bf3d-d901a5c4044b\") " pod="metallb-system/speaker-8z2dk" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.254397 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/393b222d-ddf3-41e6-92c8-7379f58219aa-metrics-certs\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.255703 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/393b222d-ddf3-41e6-92c8-7379f58219aa-frr-sockets\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.255806 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ff67434-1b15-4bcd-a222-175dcaf8dbbb-metrics-certs\") pod \"controller-f8648f98b-qs6g9\" (UID: \"8ff67434-1b15-4bcd-a222-175dcaf8dbbb\") " pod="metallb-system/controller-f8648f98b-qs6g9" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.255839 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/8ff67434-1b15-4bcd-a222-175dcaf8dbbb-cert\") pod \"controller-f8648f98b-qs6g9\" (UID: \"8ff67434-1b15-4bcd-a222-175dcaf8dbbb\") " pod="metallb-system/controller-f8648f98b-qs6g9" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.255871 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/393b222d-ddf3-41e6-92c8-7379f58219aa-frr-conf\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.255896 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26a6d099-70ce-4ce9-b75e-b7696ffe6dea-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5blxd\" (UID: \"26a6d099-70ce-4ce9-b75e-b7696ffe6dea\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.255928 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/393b222d-ddf3-41e6-92c8-7379f58219aa-metrics\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.255950 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vczqk\" (UniqueName: \"kubernetes.io/projected/26a6d099-70ce-4ce9-b75e-b7696ffe6dea-kube-api-access-vczqk\") pod \"frr-k8s-webhook-server-7fcb986d4-5blxd\" (UID: \"26a6d099-70ce-4ce9-b75e-b7696ffe6dea\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.255981 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/393b222d-ddf3-41e6-92c8-7379f58219aa-frr-startup\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.256021 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b95125e-a8ab-4566-bf3d-d901a5c4044b-metrics-certs\") pod \"speaker-8z2dk\" (UID: \"1b95125e-a8ab-4566-bf3d-d901a5c4044b\") " pod="metallb-system/speaker-8z2dk" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.256044 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/393b222d-ddf3-41e6-92c8-7379f58219aa-reloader\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.256071 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctps8\" (UniqueName: \"kubernetes.io/projected/393b222d-ddf3-41e6-92c8-7379f58219aa-kube-api-access-ctps8\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.256091 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr2v8\" (UniqueName: \"kubernetes.io/projected/8ff67434-1b15-4bcd-a222-175dcaf8dbbb-kube-api-access-qr2v8\") pod \"controller-f8648f98b-qs6g9\" (UID: 
\"8ff67434-1b15-4bcd-a222-175dcaf8dbbb\") " pod="metallb-system/controller-f8648f98b-qs6g9" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.256116 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b95125e-a8ab-4566-bf3d-d901a5c4044b-memberlist\") pod \"speaker-8z2dk\" (UID: \"1b95125e-a8ab-4566-bf3d-d901a5c4044b\") " pod="metallb-system/speaker-8z2dk" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.256663 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/393b222d-ddf3-41e6-92c8-7379f58219aa-frr-conf\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.256739 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/393b222d-ddf3-41e6-92c8-7379f58219aa-frr-sockets\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.257192 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/393b222d-ddf3-41e6-92c8-7379f58219aa-metrics\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.257570 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/393b222d-ddf3-41e6-92c8-7379f58219aa-reloader\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.259408 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/393b222d-ddf3-41e6-92c8-7379f58219aa-frr-startup\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.263413 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/393b222d-ddf3-41e6-92c8-7379f58219aa-metrics-certs\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.264613 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26a6d099-70ce-4ce9-b75e-b7696ffe6dea-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5blxd\" (UID: \"26a6d099-70ce-4ce9-b75e-b7696ffe6dea\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.273890 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctps8\" (UniqueName: \"kubernetes.io/projected/393b222d-ddf3-41e6-92c8-7379f58219aa-kube-api-access-ctps8\") pod \"frr-k8s-2t7n7\" (UID: \"393b222d-ddf3-41e6-92c8-7379f58219aa\") " pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.274508 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vczqk\" (UniqueName: 
\"kubernetes.io/projected/26a6d099-70ce-4ce9-b75e-b7696ffe6dea-kube-api-access-vczqk\") pod \"frr-k8s-webhook-server-7fcb986d4-5blxd\" (UID: \"26a6d099-70ce-4ce9-b75e-b7696ffe6dea\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.349929 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.357514 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wv4r\" (UniqueName: \"kubernetes.io/projected/1b95125e-a8ab-4566-bf3d-d901a5c4044b-kube-api-access-8wv4r\") pod \"speaker-8z2dk\" (UID: \"1b95125e-a8ab-4566-bf3d-d901a5c4044b\") " pod="metallb-system/speaker-8z2dk" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.357571 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1b95125e-a8ab-4566-bf3d-d901a5c4044b-metallb-excludel2\") pod \"speaker-8z2dk\" (UID: \"1b95125e-a8ab-4566-bf3d-d901a5c4044b\") " pod="metallb-system/speaker-8z2dk" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.357674 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ff67434-1b15-4bcd-a222-175dcaf8dbbb-metrics-certs\") pod \"controller-f8648f98b-qs6g9\" (UID: \"8ff67434-1b15-4bcd-a222-175dcaf8dbbb\") " pod="metallb-system/controller-f8648f98b-qs6g9" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.357707 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff67434-1b15-4bcd-a222-175dcaf8dbbb-cert\") pod \"controller-f8648f98b-qs6g9\" (UID: \"8ff67434-1b15-4bcd-a222-175dcaf8dbbb\") " pod="metallb-system/controller-f8648f98b-qs6g9" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.357756 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b95125e-a8ab-4566-bf3d-d901a5c4044b-metrics-certs\") pod \"speaker-8z2dk\" (UID: \"1b95125e-a8ab-4566-bf3d-d901a5c4044b\") " pod="metallb-system/speaker-8z2dk" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.357789 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr2v8\" (UniqueName: \"kubernetes.io/projected/8ff67434-1b15-4bcd-a222-175dcaf8dbbb-kube-api-access-qr2v8\") pod \"controller-f8648f98b-qs6g9\" (UID: \"8ff67434-1b15-4bcd-a222-175dcaf8dbbb\") " pod="metallb-system/controller-f8648f98b-qs6g9" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.357814 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b95125e-a8ab-4566-bf3d-d901a5c4044b-memberlist\") pod \"speaker-8z2dk\" (UID: \"1b95125e-a8ab-4566-bf3d-d901a5c4044b\") " pod="metallb-system/speaker-8z2dk" Dec 05 20:56:02 crc kubenswrapper[4747]: E1205 20:56:02.357969 4747 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 20:56:02 crc kubenswrapper[4747]: E1205 20:56:02.358053 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b95125e-a8ab-4566-bf3d-d901a5c4044b-memberlist podName:1b95125e-a8ab-4566-bf3d-d901a5c4044b nodeName:}" failed. 
No retries permitted until 2025-12-05 20:56:02.858029225 +0000 UTC m=+833.325336703 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1b95125e-a8ab-4566-bf3d-d901a5c4044b-memberlist") pod "speaker-8z2dk" (UID: "1b95125e-a8ab-4566-bf3d-d901a5c4044b") : secret "metallb-memberlist" not found Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.360117 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1b95125e-a8ab-4566-bf3d-d901a5c4044b-metallb-excludel2\") pod \"speaker-8z2dk\" (UID: \"1b95125e-a8ab-4566-bf3d-d901a5c4044b\") " pod="metallb-system/speaker-8z2dk" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.365028 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff67434-1b15-4bcd-a222-175dcaf8dbbb-cert\") pod \"controller-f8648f98b-qs6g9\" (UID: \"8ff67434-1b15-4bcd-a222-175dcaf8dbbb\") " pod="metallb-system/controller-f8648f98b-qs6g9" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.365107 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ff67434-1b15-4bcd-a222-175dcaf8dbbb-metrics-certs\") pod \"controller-f8648f98b-qs6g9\" (UID: \"8ff67434-1b15-4bcd-a222-175dcaf8dbbb\") " pod="metallb-system/controller-f8648f98b-qs6g9" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.366411 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b95125e-a8ab-4566-bf3d-d901a5c4044b-metrics-certs\") pod \"speaker-8z2dk\" (UID: \"1b95125e-a8ab-4566-bf3d-d901a5c4044b\") " pod="metallb-system/speaker-8z2dk" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.383262 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wv4r\" (UniqueName: \"kubernetes.io/projected/1b95125e-a8ab-4566-bf3d-d901a5c4044b-kube-api-access-8wv4r\") pod \"speaker-8z2dk\" (UID: \"1b95125e-a8ab-4566-bf3d-d901a5c4044b\") " pod="metallb-system/speaker-8z2dk" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.384520 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr2v8\" (UniqueName: \"kubernetes.io/projected/8ff67434-1b15-4bcd-a222-175dcaf8dbbb-kube-api-access-qr2v8\") pod \"controller-f8648f98b-qs6g9\" (UID: \"8ff67434-1b15-4bcd-a222-175dcaf8dbbb\") " pod="metallb-system/controller-f8648f98b-qs6g9" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.409387 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-2t7n7"
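
The memberlist mount fails here because the speaker pod was scheduled before the metallb-memberlist secret existed; the kubelet retries with a doubling delay, visible as durationBeforeRetry 500ms above and 1s on the next attempt below, until the secret is published and the mount succeeds. A sketch of that backoff shape; the ceiling is an assumption for illustration, since this log only shows the first two steps:

    // Sketch of the doubling retry delay seen in the memberlist mount failures.
    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	delay := 500 * time.Millisecond
    	const maxDelay = 2 * time.Minute // assumed cap, not shown in this log
    	for attempt := 1; attempt <= 5; attempt++ {
    		fmt.Printf("attempt %d: secret \"metallb-memberlist\" not found; no retries permitted for %v\n", attempt, delay)
    		delay *= 2 // 500ms -> 1s -> 2s -> ...
    		if delay > maxDelay {
    			delay = maxDelay
    		}
    	}
    }

Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.457048 4747 util.go:30] "No sandbox for pod can be found. 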
Need to start a new one" pod="metallb-system/controller-f8648f98b-qs6g9" Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.760264 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd"] Dec 05 20:56:02 crc kubenswrapper[4747]: W1205 20:56:02.769500 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26a6d099_70ce_4ce9_b75e_b7696ffe6dea.slice/crio-ee9146ea9d23f340a0497ede45519c05dbdceb133d26d1e61060f7f4f96a4736 WatchSource:0}: Error finding container ee9146ea9d23f340a0497ede45519c05dbdceb133d26d1e61060f7f4f96a4736: Status 404 returned error can't find the container with id ee9146ea9d23f340a0497ede45519c05dbdceb133d26d1e61060f7f4f96a4736 Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.864627 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b95125e-a8ab-4566-bf3d-d901a5c4044b-memberlist\") pod \"speaker-8z2dk\" (UID: \"1b95125e-a8ab-4566-bf3d-d901a5c4044b\") " pod="metallb-system/speaker-8z2dk" Dec 05 20:56:02 crc kubenswrapper[4747]: E1205 20:56:02.864843 4747 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 20:56:02 crc kubenswrapper[4747]: E1205 20:56:02.864902 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b95125e-a8ab-4566-bf3d-d901a5c4044b-memberlist podName:1b95125e-a8ab-4566-bf3d-d901a5c4044b nodeName:}" failed. No retries permitted until 2025-12-05 20:56:03.864884725 +0000 UTC m=+834.332192213 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1b95125e-a8ab-4566-bf3d-d901a5c4044b-memberlist") pod "speaker-8z2dk" (UID: "1b95125e-a8ab-4566-bf3d-d901a5c4044b") : secret "metallb-memberlist" not found Dec 05 20:56:02 crc kubenswrapper[4747]: I1205 20:56:02.865895 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-qs6g9"] Dec 05 20:56:02 crc kubenswrapper[4747]: W1205 20:56:02.872566 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ff67434_1b15_4bcd_a222_175dcaf8dbbb.slice/crio-83dfd0d45c83565f506e1f7e1ceafbc8f5dedf1b26ea78bc87b9fa535b6bb571 WatchSource:0}: Error finding container 83dfd0d45c83565f506e1f7e1ceafbc8f5dedf1b26ea78bc87b9fa535b6bb571: Status 404 returned error can't find the container with id 83dfd0d45c83565f506e1f7e1ceafbc8f5dedf1b26ea78bc87b9fa535b6bb571 Dec 05 20:56:03 crc kubenswrapper[4747]: I1205 20:56:03.114486 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2t7n7" event={"ID":"393b222d-ddf3-41e6-92c8-7379f58219aa","Type":"ContainerStarted","Data":"fb6e300cf48795dc34802f6bda341054c798a9dde44dc62cd4dd92eda798e9fa"} Dec 05 20:56:03 crc kubenswrapper[4747]: I1205 20:56:03.115966 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd" event={"ID":"26a6d099-70ce-4ce9-b75e-b7696ffe6dea","Type":"ContainerStarted","Data":"ee9146ea9d23f340a0497ede45519c05dbdceb133d26d1e61060f7f4f96a4736"} Dec 05 20:56:03 crc kubenswrapper[4747]: I1205 20:56:03.118789 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-qs6g9" 
event={"ID":"8ff67434-1b15-4bcd-a222-175dcaf8dbbb","Type":"ContainerStarted","Data":"9157ac377cff6edddbc2496371d8c9671edfea4296624cd01a4c5d8df26de1bf"} Dec 05 20:56:03 crc kubenswrapper[4747]: I1205 20:56:03.118840 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-qs6g9" event={"ID":"8ff67434-1b15-4bcd-a222-175dcaf8dbbb","Type":"ContainerStarted","Data":"83dfd0d45c83565f506e1f7e1ceafbc8f5dedf1b26ea78bc87b9fa535b6bb571"} Dec 05 20:56:03 crc kubenswrapper[4747]: I1205 20:56:03.118940 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-qs6g9" Dec 05 20:56:03 crc kubenswrapper[4747]: I1205 20:56:03.140383 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-qs6g9" podStartSLOduration=1.140361906 podStartE2EDuration="1.140361906s" podCreationTimestamp="2025-12-05 20:56:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:56:03.134109807 +0000 UTC m=+833.601417305" watchObservedRunningTime="2025-12-05 20:56:03.140361906 +0000 UTC m=+833.607669404" Dec 05 20:56:03 crc kubenswrapper[4747]: I1205 20:56:03.877593 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b95125e-a8ab-4566-bf3d-d901a5c4044b-memberlist\") pod \"speaker-8z2dk\" (UID: \"1b95125e-a8ab-4566-bf3d-d901a5c4044b\") " pod="metallb-system/speaker-8z2dk" Dec 05 20:56:03 crc kubenswrapper[4747]: I1205 20:56:03.897368 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b95125e-a8ab-4566-bf3d-d901a5c4044b-memberlist\") pod \"speaker-8z2dk\" (UID: \"1b95125e-a8ab-4566-bf3d-d901a5c4044b\") " pod="metallb-system/speaker-8z2dk" Dec 05 20:56:03 crc kubenswrapper[4747]: I1205 20:56:03.944846 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-8z2dk" Dec 05 20:56:04 crc kubenswrapper[4747]: I1205 20:56:04.133503 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-qs6g9" event={"ID":"8ff67434-1b15-4bcd-a222-175dcaf8dbbb","Type":"ContainerStarted","Data":"5c803b7510ce0ec1188e317e714ebf7f9d02c20c3a2c7938e569d3171399c4f1"} Dec 05 20:56:04 crc kubenswrapper[4747]: I1205 20:56:04.137137 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8z2dk" event={"ID":"1b95125e-a8ab-4566-bf3d-d901a5c4044b","Type":"ContainerStarted","Data":"6f27d193d83858d5c61fe42ddf38ed671096a1ddbf10300a904b8b58b9221caa"} Dec 05 20:56:05 crc kubenswrapper[4747]: I1205 20:56:05.145695 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8z2dk" event={"ID":"1b95125e-a8ab-4566-bf3d-d901a5c4044b","Type":"ContainerStarted","Data":"4e75397739844fe76a6c49057f0441ea00df5117a7b22e15ba909a27f75a0932"} Dec 05 20:56:05 crc kubenswrapper[4747]: I1205 20:56:05.147263 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8z2dk" event={"ID":"1b95125e-a8ab-4566-bf3d-d901a5c4044b","Type":"ContainerStarted","Data":"2dbe100e8b3deb59c7ec5c13bbc6d922d8e112acfdda5fdb6c913b31a3ce1049"} Dec 05 20:56:05 crc kubenswrapper[4747]: I1205 20:56:05.147311 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-8z2dk" Dec 05 20:56:05 crc kubenswrapper[4747]: I1205 20:56:05.179137 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-8z2dk" podStartSLOduration=3.179115134 podStartE2EDuration="3.179115134s" podCreationTimestamp="2025-12-05 20:56:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:56:05.173780258 +0000 UTC m=+835.641087746" watchObservedRunningTime="2025-12-05 20:56:05.179115134 +0000 UTC m=+835.646422622" Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.007500 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n2kl2"] Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.027248 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.037885 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n2kl2"] Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.090155 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-catalog-content\") pod \"redhat-marketplace-n2kl2\" (UID: \"3865dffd-c7cb-4b09-ad4d-74ab1bf46713\") " pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.090284 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-utilities\") pod \"redhat-marketplace-n2kl2\" (UID: \"3865dffd-c7cb-4b09-ad4d-74ab1bf46713\") " pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.090356 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf4zv\" (UniqueName: \"kubernetes.io/projected/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-kube-api-access-hf4zv\") pod \"redhat-marketplace-n2kl2\" (UID: \"3865dffd-c7cb-4b09-ad4d-74ab1bf46713\") " pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.184199 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd" event={"ID":"26a6d099-70ce-4ce9-b75e-b7696ffe6dea","Type":"ContainerStarted","Data":"f6a0844ab09aa612486ec3c0708a48d60f0bfd3782b86beaf76b933ee927ca0d"} Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.184281 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd" Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.186745 4747 generic.go:334] "Generic (PLEG): container finished" podID="393b222d-ddf3-41e6-92c8-7379f58219aa" containerID="0bff9d4113b74e810c89b82dd701520518479a68dafaadcde8339aef8ed8e4eb" exitCode=0 Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.186808 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2t7n7" event={"ID":"393b222d-ddf3-41e6-92c8-7379f58219aa","Type":"ContainerDied","Data":"0bff9d4113b74e810c89b82dd701520518479a68dafaadcde8339aef8ed8e4eb"} Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.190898 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf4zv\" (UniqueName: \"kubernetes.io/projected/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-kube-api-access-hf4zv\") pod \"redhat-marketplace-n2kl2\" (UID: \"3865dffd-c7cb-4b09-ad4d-74ab1bf46713\") " pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.190952 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-catalog-content\") pod \"redhat-marketplace-n2kl2\" (UID: \"3865dffd-c7cb-4b09-ad4d-74ab1bf46713\") " pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.191029 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-utilities\") pod \"redhat-marketplace-n2kl2\" (UID: \"3865dffd-c7cb-4b09-ad4d-74ab1bf46713\") " pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.191533 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-catalog-content\") pod \"redhat-marketplace-n2kl2\" (UID: \"3865dffd-c7cb-4b09-ad4d-74ab1bf46713\") " pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.191659 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-utilities\") pod \"redhat-marketplace-n2kl2\" (UID: \"3865dffd-c7cb-4b09-ad4d-74ab1bf46713\") " pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.204961 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd" podStartSLOduration=2.482982106 podStartE2EDuration="9.204936206s" podCreationTimestamp="2025-12-05 20:56:01 +0000 UTC" firstStartedPulling="2025-12-05 20:56:02.772114014 +0000 UTC m=+833.239421522" lastFinishedPulling="2025-12-05 20:56:09.494068134 +0000 UTC m=+839.961375622" observedRunningTime="2025-12-05 20:56:10.203476559 +0000 UTC m=+840.670784047" watchObservedRunningTime="2025-12-05 20:56:10.204936206 +0000 UTC m=+840.672243694" Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.221633 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf4zv\" (UniqueName: \"kubernetes.io/projected/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-kube-api-access-hf4zv\") pod \"redhat-marketplace-n2kl2\" (UID: \"3865dffd-c7cb-4b09-ad4d-74ab1bf46713\") " pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.352369 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:10 crc kubenswrapper[4747]: E1205 20:56:10.479919 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod393b222d_ddf3_41e6_92c8_7379f58219aa.slice/crio-70b4786942af6cf65ba5f6f886474d382d1813cc985f3f55be6be5f5fc973412.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod393b222d_ddf3_41e6_92c8_7379f58219aa.slice/crio-conmon-70b4786942af6cf65ba5f6f886474d382d1813cc985f3f55be6be5f5fc973412.scope\": RecentStats: unable to find data in memory cache]" Dec 05 20:56:10 crc kubenswrapper[4747]: I1205 20:56:10.607252 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n2kl2"] Dec 05 20:56:11 crc kubenswrapper[4747]: I1205 20:56:11.193793 4747 generic.go:334] "Generic (PLEG): container finished" podID="393b222d-ddf3-41e6-92c8-7379f58219aa" containerID="70b4786942af6cf65ba5f6f886474d382d1813cc985f3f55be6be5f5fc973412" exitCode=0 Dec 05 20:56:11 crc kubenswrapper[4747]: I1205 20:56:11.193832 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2t7n7" event={"ID":"393b222d-ddf3-41e6-92c8-7379f58219aa","Type":"ContainerDied","Data":"70b4786942af6cf65ba5f6f886474d382d1813cc985f3f55be6be5f5fc973412"} Dec 05 20:56:11 crc kubenswrapper[4747]: I1205 20:56:11.195464 4747 generic.go:334] "Generic (PLEG): container finished" podID="3865dffd-c7cb-4b09-ad4d-74ab1bf46713" containerID="7ad5a13ce825d1d725e51ff56fdd0d9036ed6ce6f2ad471dc7ec754a0eb4d956" exitCode=0 Dec 05 20:56:11 crc kubenswrapper[4747]: I1205 20:56:11.195484 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2kl2" event={"ID":"3865dffd-c7cb-4b09-ad4d-74ab1bf46713","Type":"ContainerDied","Data":"7ad5a13ce825d1d725e51ff56fdd0d9036ed6ce6f2ad471dc7ec754a0eb4d956"} Dec 05 20:56:11 crc kubenswrapper[4747]: I1205 20:56:11.195509 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2kl2" event={"ID":"3865dffd-c7cb-4b09-ad4d-74ab1bf46713","Type":"ContainerStarted","Data":"bd4503ded270b13bcd3fec7998c9224046298f5fadb31bafed493558ce6f3701"} Dec 05 20:56:12 crc kubenswrapper[4747]: I1205 20:56:12.211902 4747 generic.go:334] "Generic (PLEG): container finished" podID="393b222d-ddf3-41e6-92c8-7379f58219aa" containerID="7c90e1b88a095e53f97fffd537deaaa5c078c8f38f02ec1b5439c479c6eaec53" exitCode=0 Dec 05 20:56:12 crc kubenswrapper[4747]: I1205 20:56:12.211973 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2t7n7" event={"ID":"393b222d-ddf3-41e6-92c8-7379f58219aa","Type":"ContainerDied","Data":"7c90e1b88a095e53f97fffd537deaaa5c078c8f38f02ec1b5439c479c6eaec53"} Dec 05 20:56:12 crc kubenswrapper[4747]: I1205 20:56:12.215819 4747 generic.go:334] "Generic (PLEG): container finished" podID="3865dffd-c7cb-4b09-ad4d-74ab1bf46713" containerID="d720296cf534bc8b64eac5f6ccc02e61df4b38b2a4768618125c275a25f0fa58" exitCode=0 Dec 05 20:56:12 crc kubenswrapper[4747]: I1205 20:56:12.215885 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2kl2" event={"ID":"3865dffd-c7cb-4b09-ad4d-74ab1bf46713","Type":"ContainerDied","Data":"d720296cf534bc8b64eac5f6ccc02e61df4b38b2a4768618125c275a25f0fa58"} Dec 05 20:56:12 crc kubenswrapper[4747]: I1205 
20:56:12.460906 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-qs6g9" Dec 05 20:56:13 crc kubenswrapper[4747]: I1205 20:56:13.228856 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2kl2" event={"ID":"3865dffd-c7cb-4b09-ad4d-74ab1bf46713","Type":"ContainerStarted","Data":"74b2151d35f5f5f444efb2d0669ea96f1705fe4227f14fc932af2a95dfeeabbf"} Dec 05 20:56:13 crc kubenswrapper[4747]: I1205 20:56:13.251296 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2t7n7" event={"ID":"393b222d-ddf3-41e6-92c8-7379f58219aa","Type":"ContainerStarted","Data":"21e5b72db90ca3781f9bf7b6a693b6013ca3cb0a4feee93efdcb5242229b0f91"} Dec 05 20:56:13 crc kubenswrapper[4747]: I1205 20:56:13.251362 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2t7n7" event={"ID":"393b222d-ddf3-41e6-92c8-7379f58219aa","Type":"ContainerStarted","Data":"afb2f7a057b7ddd5a550c82cd2548e5312f30efe3cb0096330f5b1a9f1683483"} Dec 05 20:56:13 crc kubenswrapper[4747]: I1205 20:56:13.251374 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2t7n7" event={"ID":"393b222d-ddf3-41e6-92c8-7379f58219aa","Type":"ContainerStarted","Data":"cb37a4ec7f942d91fabdc4d83d4481e26067619a8891f8a6e3cea15c00d1fa9d"} Dec 05 20:56:13 crc kubenswrapper[4747]: I1205 20:56:13.251387 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2t7n7" event={"ID":"393b222d-ddf3-41e6-92c8-7379f58219aa","Type":"ContainerStarted","Data":"b17a5d03f30174049791014f6428d7a58c2fa17f30425180a43a90e31b6cd368"} Dec 05 20:56:13 crc kubenswrapper[4747]: I1205 20:56:13.260239 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n2kl2" podStartSLOduration=2.389956767 podStartE2EDuration="4.260218706s" podCreationTimestamp="2025-12-05 20:56:09 +0000 UTC" firstStartedPulling="2025-12-05 20:56:11.196657477 +0000 UTC m=+841.663964965" lastFinishedPulling="2025-12-05 20:56:13.066919416 +0000 UTC m=+843.534226904" observedRunningTime="2025-12-05 20:56:13.256829529 +0000 UTC m=+843.724137027" watchObservedRunningTime="2025-12-05 20:56:13.260218706 +0000 UTC m=+843.727526194" Dec 05 20:56:14 crc kubenswrapper[4747]: I1205 20:56:14.262797 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2t7n7" event={"ID":"393b222d-ddf3-41e6-92c8-7379f58219aa","Type":"ContainerStarted","Data":"f5d999186ab58a739cd03444e5a325e12a23c45501a04030e88be5ce31734526"} Dec 05 20:56:14 crc kubenswrapper[4747]: I1205 20:56:14.263107 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2t7n7" event={"ID":"393b222d-ddf3-41e6-92c8-7379f58219aa","Type":"ContainerStarted","Data":"36f9ec9d1f0654c4dbffab4d6fdd624bd2e7dafb87550fd4ff34035924851620"} Dec 05 20:56:14 crc kubenswrapper[4747]: I1205 20:56:14.290257 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-2t7n7" podStartSLOduration=6.310099301 podStartE2EDuration="13.290239921s" podCreationTimestamp="2025-12-05 20:56:01 +0000 UTC" firstStartedPulling="2025-12-05 20:56:02.535169474 +0000 UTC m=+833.002476962" lastFinishedPulling="2025-12-05 20:56:09.515310084 +0000 UTC m=+839.982617582" observedRunningTime="2025-12-05 20:56:14.285832369 +0000 UTC m=+844.753139857" watchObservedRunningTime="2025-12-05 20:56:14.290239921 +0000 UTC m=+844.757547399" Dec 05 20:56:15 crc 
kubenswrapper[4747]: I1205 20:56:15.268617 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:17 crc kubenswrapper[4747]: I1205 20:56:17.410177 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:17 crc kubenswrapper[4747]: I1205 20:56:17.469085 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:20 crc kubenswrapper[4747]: I1205 20:56:20.352846 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:20 crc kubenswrapper[4747]: I1205 20:56:20.353191 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:20 crc kubenswrapper[4747]: I1205 20:56:20.394296 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:21 crc kubenswrapper[4747]: I1205 20:56:21.399714 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:21 crc kubenswrapper[4747]: I1205 20:56:21.455120 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n2kl2"] Dec 05 20:56:22 crc kubenswrapper[4747]: I1205 20:56:22.357446 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5blxd" Dec 05 20:56:22 crc kubenswrapper[4747]: I1205 20:56:22.414701 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-2t7n7" Dec 05 20:56:23 crc kubenswrapper[4747]: I1205 20:56:23.345169 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n2kl2" podUID="3865dffd-c7cb-4b09-ad4d-74ab1bf46713" containerName="registry-server" containerID="cri-o://74b2151d35f5f5f444efb2d0669ea96f1705fe4227f14fc932af2a95dfeeabbf" gracePeriod=2 Dec 05 20:56:23 crc kubenswrapper[4747]: I1205 20:56:23.950022 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-8z2dk" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.255617 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n2kl2"
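
The SyncLoop DELETE above is followed by "Killing container with a grace period ... gracePeriod=2": registry-server gets two seconds to exit before the runtime kills it. The log does not show whether the 2 s comes from the pod spec's terminationGracePeriodSeconds or from the delete request itself; as a sketch only, the same grace period could be requested client-side with client-go (namespace and pod name copied from the records above, everything else assumed):

package main

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Load kubeconfig from the default location; error handling kept minimal.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	grace := int64(2) // matches gracePeriod=2 in the kill record above
	if err := cs.CoreV1().Pods("openshift-marketplace").Delete(
		context.TODO(), "redhat-marketplace-n2kl2",
		metav1.DeleteOptions{GracePeriodSeconds: &grace},
	); err != nil {
		panic(err)
	}
}
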
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.302876 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-utilities\") pod \"3865dffd-c7cb-4b09-ad4d-74ab1bf46713\" (UID: \"3865dffd-c7cb-4b09-ad4d-74ab1bf46713\") " Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.302986 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf4zv\" (UniqueName: \"kubernetes.io/projected/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-kube-api-access-hf4zv\") pod \"3865dffd-c7cb-4b09-ad4d-74ab1bf46713\" (UID: \"3865dffd-c7cb-4b09-ad4d-74ab1bf46713\") " Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.303031 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-catalog-content\") pod \"3865dffd-c7cb-4b09-ad4d-74ab1bf46713\" (UID: \"3865dffd-c7cb-4b09-ad4d-74ab1bf46713\") " Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.305334 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-utilities" (OuterVolumeSpecName: "utilities") pod "3865dffd-c7cb-4b09-ad4d-74ab1bf46713" (UID: "3865dffd-c7cb-4b09-ad4d-74ab1bf46713"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.310557 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-kube-api-access-hf4zv" (OuterVolumeSpecName: "kube-api-access-hf4zv") pod "3865dffd-c7cb-4b09-ad4d-74ab1bf46713" (UID: "3865dffd-c7cb-4b09-ad4d-74ab1bf46713"). InnerVolumeSpecName "kube-api-access-hf4zv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.320410 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3865dffd-c7cb-4b09-ad4d-74ab1bf46713" (UID: "3865dffd-c7cb-4b09-ad4d-74ab1bf46713"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.353170 4747 generic.go:334] "Generic (PLEG): container finished" podID="3865dffd-c7cb-4b09-ad4d-74ab1bf46713" containerID="74b2151d35f5f5f444efb2d0669ea96f1705fe4227f14fc932af2a95dfeeabbf" exitCode=0 Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.353217 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2kl2" event={"ID":"3865dffd-c7cb-4b09-ad4d-74ab1bf46713","Type":"ContainerDied","Data":"74b2151d35f5f5f444efb2d0669ea96f1705fe4227f14fc932af2a95dfeeabbf"} Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.353252 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2kl2" event={"ID":"3865dffd-c7cb-4b09-ad4d-74ab1bf46713","Type":"ContainerDied","Data":"bd4503ded270b13bcd3fec7998c9224046298f5fadb31bafed493558ce6f3701"} Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.353274 4747 scope.go:117] "RemoveContainer" containerID="74b2151d35f5f5f444efb2d0669ea96f1705fe4227f14fc932af2a95dfeeabbf" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.353300 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n2kl2" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.402058 4747 scope.go:117] "RemoveContainer" containerID="d720296cf534bc8b64eac5f6ccc02e61df4b38b2a4768618125c275a25f0fa58" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.404750 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf4zv\" (UniqueName: \"kubernetes.io/projected/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-kube-api-access-hf4zv\") on node \"crc\" DevicePath \"\"" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.404786 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.404801 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3865dffd-c7cb-4b09-ad4d-74ab1bf46713-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.405297 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n2kl2"] Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.413314 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n2kl2"] Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.439468 4747 scope.go:117] "RemoveContainer" containerID="7ad5a13ce825d1d725e51ff56fdd0d9036ed6ce6f2ad471dc7ec754a0eb4d956" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.471399 4747 scope.go:117] "RemoveContainer" containerID="74b2151d35f5f5f444efb2d0669ea96f1705fe4227f14fc932af2a95dfeeabbf" Dec 05 20:56:24 crc kubenswrapper[4747]: E1205 20:56:24.472185 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74b2151d35f5f5f444efb2d0669ea96f1705fe4227f14fc932af2a95dfeeabbf\": container with ID starting with 74b2151d35f5f5f444efb2d0669ea96f1705fe4227f14fc932af2a95dfeeabbf not found: ID does not exist" containerID="74b2151d35f5f5f444efb2d0669ea96f1705fe4227f14fc932af2a95dfeeabbf" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.472223 4747 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74b2151d35f5f5f444efb2d0669ea96f1705fe4227f14fc932af2a95dfeeabbf"} err="failed to get container status \"74b2151d35f5f5f444efb2d0669ea96f1705fe4227f14fc932af2a95dfeeabbf\": rpc error: code = NotFound desc = could not find container \"74b2151d35f5f5f444efb2d0669ea96f1705fe4227f14fc932af2a95dfeeabbf\": container with ID starting with 74b2151d35f5f5f444efb2d0669ea96f1705fe4227f14fc932af2a95dfeeabbf not found: ID does not exist" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.472252 4747 scope.go:117] "RemoveContainer" containerID="d720296cf534bc8b64eac5f6ccc02e61df4b38b2a4768618125c275a25f0fa58" Dec 05 20:56:24 crc kubenswrapper[4747]: E1205 20:56:24.472568 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d720296cf534bc8b64eac5f6ccc02e61df4b38b2a4768618125c275a25f0fa58\": container with ID starting with d720296cf534bc8b64eac5f6ccc02e61df4b38b2a4768618125c275a25f0fa58 not found: ID does not exist" containerID="d720296cf534bc8b64eac5f6ccc02e61df4b38b2a4768618125c275a25f0fa58" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.472777 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d720296cf534bc8b64eac5f6ccc02e61df4b38b2a4768618125c275a25f0fa58"} err="failed to get container status \"d720296cf534bc8b64eac5f6ccc02e61df4b38b2a4768618125c275a25f0fa58\": rpc error: code = NotFound desc = could not find container \"d720296cf534bc8b64eac5f6ccc02e61df4b38b2a4768618125c275a25f0fa58\": container with ID starting with d720296cf534bc8b64eac5f6ccc02e61df4b38b2a4768618125c275a25f0fa58 not found: ID does not exist" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.472813 4747 scope.go:117] "RemoveContainer" containerID="7ad5a13ce825d1d725e51ff56fdd0d9036ed6ce6f2ad471dc7ec754a0eb4d956" Dec 05 20:56:24 crc kubenswrapper[4747]: E1205 20:56:24.473239 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ad5a13ce825d1d725e51ff56fdd0d9036ed6ce6f2ad471dc7ec754a0eb4d956\": container with ID starting with 7ad5a13ce825d1d725e51ff56fdd0d9036ed6ce6f2ad471dc7ec754a0eb4d956 not found: ID does not exist" containerID="7ad5a13ce825d1d725e51ff56fdd0d9036ed6ce6f2ad471dc7ec754a0eb4d956" Dec 05 20:56:24 crc kubenswrapper[4747]: I1205 20:56:24.473308 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ad5a13ce825d1d725e51ff56fdd0d9036ed6ce6f2ad471dc7ec754a0eb4d956"} err="failed to get container status \"7ad5a13ce825d1d725e51ff56fdd0d9036ed6ce6f2ad471dc7ec754a0eb4d956\": rpc error: code = NotFound desc = could not find container \"7ad5a13ce825d1d725e51ff56fdd0d9036ed6ce6f2ad471dc7ec754a0eb4d956\": container with ID starting with 7ad5a13ce825d1d725e51ff56fdd0d9036ed6ce6f2ad471dc7ec754a0eb4d956 not found: ID does not exist" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.149065 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt"] Dec 05 20:56:25 crc kubenswrapper[4747]: E1205 20:56:25.149455 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3865dffd-c7cb-4b09-ad4d-74ab1bf46713" containerName="extract-content" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.149468 4747 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3865dffd-c7cb-4b09-ad4d-74ab1bf46713" containerName="extract-content" Dec 05 20:56:25 crc kubenswrapper[4747]: E1205 20:56:25.149501 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3865dffd-c7cb-4b09-ad4d-74ab1bf46713" containerName="extract-utilities" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.149507 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="3865dffd-c7cb-4b09-ad4d-74ab1bf46713" containerName="extract-utilities" Dec 05 20:56:25 crc kubenswrapper[4747]: E1205 20:56:25.149516 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3865dffd-c7cb-4b09-ad4d-74ab1bf46713" containerName="registry-server" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.149522 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="3865dffd-c7cb-4b09-ad4d-74ab1bf46713" containerName="registry-server" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.149640 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="3865dffd-c7cb-4b09-ad4d-74ab1bf46713" containerName="registry-server" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.150499 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.153072 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.160733 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt"] Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.216409 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcnpn\" (UniqueName: \"kubernetes.io/projected/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-kube-api-access-bcnpn\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt\" (UID: \"5fe45ed8-9c96-4564-9e57-f4eaef897ef4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.216489 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt\" (UID: \"5fe45ed8-9c96-4564-9e57-f4eaef897ef4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.216557 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt\" (UID: \"5fe45ed8-9c96-4564-9e57-f4eaef897ef4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.318097 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt\" (UID: \"5fe45ed8-9c96-4564-9e57-f4eaef897ef4\") " 
pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.318194 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcnpn\" (UniqueName: \"kubernetes.io/projected/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-kube-api-access-bcnpn\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt\" (UID: \"5fe45ed8-9c96-4564-9e57-f4eaef897ef4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.318230 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt\" (UID: \"5fe45ed8-9c96-4564-9e57-f4eaef897ef4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.318761 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt\" (UID: \"5fe45ed8-9c96-4564-9e57-f4eaef897ef4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.318839 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt\" (UID: \"5fe45ed8-9c96-4564-9e57-f4eaef897ef4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.335260 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcnpn\" (UniqueName: \"kubernetes.io/projected/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-kube-api-access-bcnpn\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt\" (UID: \"5fe45ed8-9c96-4564-9e57-f4eaef897ef4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.508134 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.710738 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt"] Dec 05 20:56:25 crc kubenswrapper[4747]: I1205 20:56:25.846622 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3865dffd-c7cb-4b09-ad4d-74ab1bf46713" path="/var/lib/kubelet/pods/3865dffd-c7cb-4b09-ad4d-74ab1bf46713/volumes" Dec 05 20:56:26 crc kubenswrapper[4747]: I1205 20:56:26.367106 4747 generic.go:334] "Generic (PLEG): container finished" podID="5fe45ed8-9c96-4564-9e57-f4eaef897ef4" containerID="f05d51700a8f0ebb3d81350c95506b22337697b63b8a56c8562d5313742b2255" exitCode=0 Dec 05 20:56:26 crc kubenswrapper[4747]: I1205 20:56:26.367179 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" event={"ID":"5fe45ed8-9c96-4564-9e57-f4eaef897ef4","Type":"ContainerDied","Data":"f05d51700a8f0ebb3d81350c95506b22337697b63b8a56c8562d5313742b2255"} Dec 05 20:56:26 crc kubenswrapper[4747]: I1205 20:56:26.367418 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" event={"ID":"5fe45ed8-9c96-4564-9e57-f4eaef897ef4","Type":"ContainerStarted","Data":"90d9627b6db62cffe6fae60d137d39727d75fe4c4a7c29461d6c420d4815fd3a"} Dec 05 20:56:30 crc kubenswrapper[4747]: I1205 20:56:30.395191 4747 generic.go:334] "Generic (PLEG): container finished" podID="5fe45ed8-9c96-4564-9e57-f4eaef897ef4" containerID="5017d1f25000b0ee8cdd581120c4c93d953ab1f0465eab07163bc59ff8722edd" exitCode=0 Dec 05 20:56:30 crc kubenswrapper[4747]: I1205 20:56:30.395284 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" event={"ID":"5fe45ed8-9c96-4564-9e57-f4eaef897ef4","Type":"ContainerDied","Data":"5017d1f25000b0ee8cdd581120c4c93d953ab1f0465eab07163bc59ff8722edd"} Dec 05 20:56:31 crc kubenswrapper[4747]: I1205 20:56:31.409643 4747 generic.go:334] "Generic (PLEG): container finished" podID="5fe45ed8-9c96-4564-9e57-f4eaef897ef4" containerID="e6568b4b1fbc3aaaee572cd46278832d688e4f91c78f246fcfa6c5b47489ee82" exitCode=0 Dec 05 20:56:31 crc kubenswrapper[4747]: I1205 20:56:31.409804 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" event={"ID":"5fe45ed8-9c96-4564-9e57-f4eaef897ef4","Type":"ContainerDied","Data":"e6568b4b1fbc3aaaee572cd46278832d688e4f91c78f246fcfa6c5b47489ee82"} Dec 05 20:56:32 crc kubenswrapper[4747]: I1205 20:56:32.732382 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" Dec 05 20:56:32 crc kubenswrapper[4747]: I1205 20:56:32.834170 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcnpn\" (UniqueName: \"kubernetes.io/projected/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-kube-api-access-bcnpn\") pod \"5fe45ed8-9c96-4564-9e57-f4eaef897ef4\" (UID: \"5fe45ed8-9c96-4564-9e57-f4eaef897ef4\") " Dec 05 20:56:32 crc kubenswrapper[4747]: I1205 20:56:32.834672 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-bundle\") pod \"5fe45ed8-9c96-4564-9e57-f4eaef897ef4\" (UID: \"5fe45ed8-9c96-4564-9e57-f4eaef897ef4\") " Dec 05 20:56:32 crc kubenswrapper[4747]: I1205 20:56:32.834715 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-util\") pod \"5fe45ed8-9c96-4564-9e57-f4eaef897ef4\" (UID: \"5fe45ed8-9c96-4564-9e57-f4eaef897ef4\") " Dec 05 20:56:32 crc kubenswrapper[4747]: I1205 20:56:32.836016 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-bundle" (OuterVolumeSpecName: "bundle") pod "5fe45ed8-9c96-4564-9e57-f4eaef897ef4" (UID: "5fe45ed8-9c96-4564-9e57-f4eaef897ef4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:56:32 crc kubenswrapper[4747]: I1205 20:56:32.839434 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-kube-api-access-bcnpn" (OuterVolumeSpecName: "kube-api-access-bcnpn") pod "5fe45ed8-9c96-4564-9e57-f4eaef897ef4" (UID: "5fe45ed8-9c96-4564-9e57-f4eaef897ef4"). InnerVolumeSpecName "kube-api-access-bcnpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:56:32 crc kubenswrapper[4747]: I1205 20:56:32.856920 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-util" (OuterVolumeSpecName: "util") pod "5fe45ed8-9c96-4564-9e57-f4eaef897ef4" (UID: "5fe45ed8-9c96-4564-9e57-f4eaef897ef4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:56:32 crc kubenswrapper[4747]: I1205 20:56:32.936480 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcnpn\" (UniqueName: \"kubernetes.io/projected/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-kube-api-access-bcnpn\") on node \"crc\" DevicePath \"\"" Dec 05 20:56:32 crc kubenswrapper[4747]: I1205 20:56:32.936563 4747 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:56:32 crc kubenswrapper[4747]: I1205 20:56:32.936712 4747 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fe45ed8-9c96-4564-9e57-f4eaef897ef4-util\") on node \"crc\" DevicePath \"\"" Dec 05 20:56:33 crc kubenswrapper[4747]: I1205 20:56:33.427066 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" event={"ID":"5fe45ed8-9c96-4564-9e57-f4eaef897ef4","Type":"ContainerDied","Data":"90d9627b6db62cffe6fae60d137d39727d75fe4c4a7c29461d6c420d4815fd3a"} Dec 05 20:56:33 crc kubenswrapper[4747]: I1205 20:56:33.427109 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90d9627b6db62cffe6fae60d137d39727d75fe4c4a7c29461d6c420d4815fd3a" Dec 05 20:56:33 crc kubenswrapper[4747]: I1205 20:56:33.427184 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.098450 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g7ldk"] Dec 05 20:56:34 crc kubenswrapper[4747]: E1205 20:56:34.098849 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fe45ed8-9c96-4564-9e57-f4eaef897ef4" containerName="pull" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.098871 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe45ed8-9c96-4564-9e57-f4eaef897ef4" containerName="pull" Dec 05 20:56:34 crc kubenswrapper[4747]: E1205 20:56:34.098890 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fe45ed8-9c96-4564-9e57-f4eaef897ef4" containerName="util" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.098897 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe45ed8-9c96-4564-9e57-f4eaef897ef4" containerName="util" Dec 05 20:56:34 crc kubenswrapper[4747]: E1205 20:56:34.098917 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fe45ed8-9c96-4564-9e57-f4eaef897ef4" containerName="extract" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.098924 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe45ed8-9c96-4564-9e57-f4eaef897ef4" containerName="extract" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.099081 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fe45ed8-9c96-4564-9e57-f4eaef897ef4" containerName="extract" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.100127 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.116348 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g7ldk"] Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.153082 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44760008-f130-4688-8cd7-485d61491efb-catalog-content\") pod \"community-operators-g7ldk\" (UID: \"44760008-f130-4688-8cd7-485d61491efb\") " pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.153196 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh7kr\" (UniqueName: \"kubernetes.io/projected/44760008-f130-4688-8cd7-485d61491efb-kube-api-access-fh7kr\") pod \"community-operators-g7ldk\" (UID: \"44760008-f130-4688-8cd7-485d61491efb\") " pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.153237 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44760008-f130-4688-8cd7-485d61491efb-utilities\") pod \"community-operators-g7ldk\" (UID: \"44760008-f130-4688-8cd7-485d61491efb\") " pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.254102 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44760008-f130-4688-8cd7-485d61491efb-catalog-content\") pod \"community-operators-g7ldk\" (UID: \"44760008-f130-4688-8cd7-485d61491efb\") " pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.254199 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44760008-f130-4688-8cd7-485d61491efb-utilities\") pod \"community-operators-g7ldk\" (UID: \"44760008-f130-4688-8cd7-485d61491efb\") " pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.254221 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh7kr\" (UniqueName: \"kubernetes.io/projected/44760008-f130-4688-8cd7-485d61491efb-kube-api-access-fh7kr\") pod \"community-operators-g7ldk\" (UID: \"44760008-f130-4688-8cd7-485d61491efb\") " pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.254810 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44760008-f130-4688-8cd7-485d61491efb-utilities\") pod \"community-operators-g7ldk\" (UID: \"44760008-f130-4688-8cd7-485d61491efb\") " pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.255261 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44760008-f130-4688-8cd7-485d61491efb-catalog-content\") pod \"community-operators-g7ldk\" (UID: \"44760008-f130-4688-8cd7-485d61491efb\") " pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.308281 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fh7kr\" (UniqueName: \"kubernetes.io/projected/44760008-f130-4688-8cd7-485d61491efb-kube-api-access-fh7kr\") pod \"community-operators-g7ldk\" (UID: \"44760008-f130-4688-8cd7-485d61491efb\") " pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:34 crc kubenswrapper[4747]: I1205 20:56:34.423835 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:35 crc kubenswrapper[4747]: I1205 20:56:35.070661 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g7ldk"] Dec 05 20:56:35 crc kubenswrapper[4747]: I1205 20:56:35.445890 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7ldk" event={"ID":"44760008-f130-4688-8cd7-485d61491efb","Type":"ContainerStarted","Data":"59911b7bf9b2617f4f230d3a83ada9e2754be3eb1e7116ada05b4eaf99a1234a"} Dec 05 20:56:35 crc kubenswrapper[4747]: I1205 20:56:35.577570 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-86npk"] Dec 05 20:56:35 crc kubenswrapper[4747]: I1205 20:56:35.578529 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-86npk" Dec 05 20:56:35 crc kubenswrapper[4747]: I1205 20:56:35.581826 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 05 20:56:35 crc kubenswrapper[4747]: I1205 20:56:35.581897 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 05 20:56:35 crc kubenswrapper[4747]: I1205 20:56:35.582144 4747 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-lrqpd" Dec 05 20:56:35 crc kubenswrapper[4747]: I1205 20:56:35.601873 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-86npk"] Dec 05 20:56:35 crc kubenswrapper[4747]: I1205 20:56:35.682452 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2tff\" (UniqueName: \"kubernetes.io/projected/91127818-ec0e-4815-b32e-f99011acfdd3-kube-api-access-v2tff\") pod \"cert-manager-operator-controller-manager-64cf6dff88-86npk\" (UID: \"91127818-ec0e-4815-b32e-f99011acfdd3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-86npk" Dec 05 20:56:35 crc kubenswrapper[4747]: I1205 20:56:35.682543 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91127818-ec0e-4815-b32e-f99011acfdd3-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-86npk\" (UID: \"91127818-ec0e-4815-b32e-f99011acfdd3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-86npk" Dec 05 20:56:35 crc kubenswrapper[4747]: I1205 20:56:35.784122 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2tff\" (UniqueName: \"kubernetes.io/projected/91127818-ec0e-4815-b32e-f99011acfdd3-kube-api-access-v2tff\") pod \"cert-manager-operator-controller-manager-64cf6dff88-86npk\" (UID: \"91127818-ec0e-4815-b32e-f99011acfdd3\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-86npk" Dec 05 20:56:35 crc kubenswrapper[4747]: I1205 20:56:35.784237 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91127818-ec0e-4815-b32e-f99011acfdd3-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-86npk\" (UID: \"91127818-ec0e-4815-b32e-f99011acfdd3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-86npk" Dec 05 20:56:35 crc kubenswrapper[4747]: I1205 20:56:35.784906 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91127818-ec0e-4815-b32e-f99011acfdd3-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-86npk\" (UID: \"91127818-ec0e-4815-b32e-f99011acfdd3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-86npk" Dec 05 20:56:35 crc kubenswrapper[4747]: I1205 20:56:35.812928 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2tff\" (UniqueName: \"kubernetes.io/projected/91127818-ec0e-4815-b32e-f99011acfdd3-kube-api-access-v2tff\") pod \"cert-manager-operator-controller-manager-64cf6dff88-86npk\" (UID: \"91127818-ec0e-4815-b32e-f99011acfdd3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-86npk" Dec 05 20:56:35 crc kubenswrapper[4747]: I1205 20:56:35.965719 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-86npk" Dec 05 20:56:36 crc kubenswrapper[4747]: I1205 20:56:36.181639 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-86npk"] Dec 05 20:56:36 crc kubenswrapper[4747]: I1205 20:56:36.454947 4747 generic.go:334] "Generic (PLEG): container finished" podID="44760008-f130-4688-8cd7-485d61491efb" containerID="ea32cc6138ff8ddba96edec34fd4d7f0a3729bf704f82a6a329a427c1bfc3d71" exitCode=0 Dec 05 20:56:36 crc kubenswrapper[4747]: I1205 20:56:36.455043 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7ldk" event={"ID":"44760008-f130-4688-8cd7-485d61491efb","Type":"ContainerDied","Data":"ea32cc6138ff8ddba96edec34fd4d7f0a3729bf704f82a6a329a427c1bfc3d71"} Dec 05 20:56:36 crc kubenswrapper[4747]: I1205 20:56:36.456677 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-86npk" event={"ID":"91127818-ec0e-4815-b32e-f99011acfdd3","Type":"ContainerStarted","Data":"ff3e3e6f7ec8cef89668d1406a4024b21310e4563024a751c8813e02c25935f6"} Dec 05 20:56:37 crc kubenswrapper[4747]: I1205 20:56:37.475158 4747 generic.go:334] "Generic (PLEG): container finished" podID="44760008-f130-4688-8cd7-485d61491efb" containerID="63fb8a1b3c9637a6f0b0caabc9b671227e755ca89261b9dc5deef342fb7063d0" exitCode=0 Dec 05 20:56:37 crc kubenswrapper[4747]: I1205 20:56:37.475262 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7ldk" event={"ID":"44760008-f130-4688-8cd7-485d61491efb","Type":"ContainerDied","Data":"63fb8a1b3c9637a6f0b0caabc9b671227e755ca89261b9dc5deef342fb7063d0"} Dec 05 20:56:38 crc kubenswrapper[4747]: I1205 20:56:38.485319 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7ldk" 
event={"ID":"44760008-f130-4688-8cd7-485d61491efb","Type":"ContainerStarted","Data":"09b5d4ece1bb199dab9f3620d10964fc9a9ed8213725304ba1e16f11cf114f37"} Dec 05 20:56:38 crc kubenswrapper[4747]: I1205 20:56:38.509791 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g7ldk" podStartSLOduration=3.094631673 podStartE2EDuration="4.509765308s" podCreationTimestamp="2025-12-05 20:56:34 +0000 UTC" firstStartedPulling="2025-12-05 20:56:36.456937573 +0000 UTC m=+866.924245051" lastFinishedPulling="2025-12-05 20:56:37.872071188 +0000 UTC m=+868.339378686" observedRunningTime="2025-12-05 20:56:38.508428544 +0000 UTC m=+868.975736052" watchObservedRunningTime="2025-12-05 20:56:38.509765308 +0000 UTC m=+868.977072796" Dec 05 20:56:44 crc kubenswrapper[4747]: I1205 20:56:44.424773 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:44 crc kubenswrapper[4747]: I1205 20:56:44.425301 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:44 crc kubenswrapper[4747]: I1205 20:56:44.472493 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:44 crc kubenswrapper[4747]: I1205 20:56:44.526040 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-86npk" event={"ID":"91127818-ec0e-4815-b32e-f99011acfdd3","Type":"ContainerStarted","Data":"ea663b840aeab6449fc7a420f55fbbf1e0c079cc63d8628ff77351f853ac971c"} Dec 05 20:56:44 crc kubenswrapper[4747]: I1205 20:56:44.548626 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-86npk" podStartSLOduration=1.652113141 podStartE2EDuration="9.548606522s" podCreationTimestamp="2025-12-05 20:56:35 +0000 UTC" firstStartedPulling="2025-12-05 20:56:36.194082183 +0000 UTC m=+866.661389671" lastFinishedPulling="2025-12-05 20:56:44.090575564 +0000 UTC m=+874.557883052" observedRunningTime="2025-12-05 20:56:44.547340809 +0000 UTC m=+875.014648327" watchObservedRunningTime="2025-12-05 20:56:44.548606522 +0000 UTC m=+875.015914010" Dec 05 20:56:44 crc kubenswrapper[4747]: I1205 20:56:44.573551 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:46 crc kubenswrapper[4747]: I1205 20:56:46.692764 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g7ldk"] Dec 05 20:56:46 crc kubenswrapper[4747]: I1205 20:56:46.692987 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g7ldk" podUID="44760008-f130-4688-8cd7-485d61491efb" containerName="registry-server" containerID="cri-o://09b5d4ece1bb199dab9f3620d10964fc9a9ed8213725304ba1e16f11cf114f37" gracePeriod=2 Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.105832 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.256047 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44760008-f130-4688-8cd7-485d61491efb-catalog-content\") pod \"44760008-f130-4688-8cd7-485d61491efb\" (UID: \"44760008-f130-4688-8cd7-485d61491efb\") " Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.256200 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh7kr\" (UniqueName: \"kubernetes.io/projected/44760008-f130-4688-8cd7-485d61491efb-kube-api-access-fh7kr\") pod \"44760008-f130-4688-8cd7-485d61491efb\" (UID: \"44760008-f130-4688-8cd7-485d61491efb\") " Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.256236 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44760008-f130-4688-8cd7-485d61491efb-utilities\") pod \"44760008-f130-4688-8cd7-485d61491efb\" (UID: \"44760008-f130-4688-8cd7-485d61491efb\") " Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.258135 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44760008-f130-4688-8cd7-485d61491efb-utilities" (OuterVolumeSpecName: "utilities") pod "44760008-f130-4688-8cd7-485d61491efb" (UID: "44760008-f130-4688-8cd7-485d61491efb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.270819 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44760008-f130-4688-8cd7-485d61491efb-kube-api-access-fh7kr" (OuterVolumeSpecName: "kube-api-access-fh7kr") pod "44760008-f130-4688-8cd7-485d61491efb" (UID: "44760008-f130-4688-8cd7-485d61491efb"). InnerVolumeSpecName "kube-api-access-fh7kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.311165 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44760008-f130-4688-8cd7-485d61491efb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44760008-f130-4688-8cd7-485d61491efb" (UID: "44760008-f130-4688-8cd7-485d61491efb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.358518 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44760008-f130-4688-8cd7-485d61491efb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.358560 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh7kr\" (UniqueName: \"kubernetes.io/projected/44760008-f130-4688-8cd7-485d61491efb-kube-api-access-fh7kr\") on node \"crc\" DevicePath \"\"" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.358573 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44760008-f130-4688-8cd7-485d61491efb-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.543861 4747 generic.go:334] "Generic (PLEG): container finished" podID="44760008-f130-4688-8cd7-485d61491efb" containerID="09b5d4ece1bb199dab9f3620d10964fc9a9ed8213725304ba1e16f11cf114f37" exitCode=0 Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.543905 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7ldk" event={"ID":"44760008-f130-4688-8cd7-485d61491efb","Type":"ContainerDied","Data":"09b5d4ece1bb199dab9f3620d10964fc9a9ed8213725304ba1e16f11cf114f37"} Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.543932 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7ldk" event={"ID":"44760008-f130-4688-8cd7-485d61491efb","Type":"ContainerDied","Data":"59911b7bf9b2617f4f230d3a83ada9e2754be3eb1e7116ada05b4eaf99a1234a"} Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.543948 4747 scope.go:117] "RemoveContainer" containerID="09b5d4ece1bb199dab9f3620d10964fc9a9ed8213725304ba1e16f11cf114f37" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.544374 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g7ldk" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.569642 4747 scope.go:117] "RemoveContainer" containerID="63fb8a1b3c9637a6f0b0caabc9b671227e755ca89261b9dc5deef342fb7063d0" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.586141 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g7ldk"] Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.593052 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g7ldk"] Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.599100 4747 scope.go:117] "RemoveContainer" containerID="ea32cc6138ff8ddba96edec34fd4d7f0a3729bf704f82a6a329a427c1bfc3d71" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.616793 4747 scope.go:117] "RemoveContainer" containerID="09b5d4ece1bb199dab9f3620d10964fc9a9ed8213725304ba1e16f11cf114f37" Dec 05 20:56:47 crc kubenswrapper[4747]: E1205 20:56:47.617443 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09b5d4ece1bb199dab9f3620d10964fc9a9ed8213725304ba1e16f11cf114f37\": container with ID starting with 09b5d4ece1bb199dab9f3620d10964fc9a9ed8213725304ba1e16f11cf114f37 not found: ID does not exist" containerID="09b5d4ece1bb199dab9f3620d10964fc9a9ed8213725304ba1e16f11cf114f37" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.617486 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09b5d4ece1bb199dab9f3620d10964fc9a9ed8213725304ba1e16f11cf114f37"} err="failed to get container status \"09b5d4ece1bb199dab9f3620d10964fc9a9ed8213725304ba1e16f11cf114f37\": rpc error: code = NotFound desc = could not find container \"09b5d4ece1bb199dab9f3620d10964fc9a9ed8213725304ba1e16f11cf114f37\": container with ID starting with 09b5d4ece1bb199dab9f3620d10964fc9a9ed8213725304ba1e16f11cf114f37 not found: ID does not exist" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.617520 4747 scope.go:117] "RemoveContainer" containerID="63fb8a1b3c9637a6f0b0caabc9b671227e755ca89261b9dc5deef342fb7063d0" Dec 05 20:56:47 crc kubenswrapper[4747]: E1205 20:56:47.618103 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63fb8a1b3c9637a6f0b0caabc9b671227e755ca89261b9dc5deef342fb7063d0\": container with ID starting with 63fb8a1b3c9637a6f0b0caabc9b671227e755ca89261b9dc5deef342fb7063d0 not found: ID does not exist" containerID="63fb8a1b3c9637a6f0b0caabc9b671227e755ca89261b9dc5deef342fb7063d0" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.618206 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63fb8a1b3c9637a6f0b0caabc9b671227e755ca89261b9dc5deef342fb7063d0"} err="failed to get container status \"63fb8a1b3c9637a6f0b0caabc9b671227e755ca89261b9dc5deef342fb7063d0\": rpc error: code = NotFound desc = could not find container \"63fb8a1b3c9637a6f0b0caabc9b671227e755ca89261b9dc5deef342fb7063d0\": container with ID starting with 63fb8a1b3c9637a6f0b0caabc9b671227e755ca89261b9dc5deef342fb7063d0 not found: ID does not exist" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.618295 4747 scope.go:117] "RemoveContainer" containerID="ea32cc6138ff8ddba96edec34fd4d7f0a3729bf704f82a6a329a427c1bfc3d71" Dec 05 20:56:47 crc kubenswrapper[4747]: E1205 20:56:47.618790 4747 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ea32cc6138ff8ddba96edec34fd4d7f0a3729bf704f82a6a329a427c1bfc3d71\": container with ID starting with ea32cc6138ff8ddba96edec34fd4d7f0a3729bf704f82a6a329a427c1bfc3d71 not found: ID does not exist" containerID="ea32cc6138ff8ddba96edec34fd4d7f0a3729bf704f82a6a329a427c1bfc3d71" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.618816 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea32cc6138ff8ddba96edec34fd4d7f0a3729bf704f82a6a329a427c1bfc3d71"} err="failed to get container status \"ea32cc6138ff8ddba96edec34fd4d7f0a3729bf704f82a6a329a427c1bfc3d71\": rpc error: code = NotFound desc = could not find container \"ea32cc6138ff8ddba96edec34fd4d7f0a3729bf704f82a6a329a427c1bfc3d71\": container with ID starting with ea32cc6138ff8ddba96edec34fd4d7f0a3729bf704f82a6a329a427c1bfc3d71 not found: ID does not exist" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.847432 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44760008-f130-4688-8cd7-485d61491efb" path="/var/lib/kubelet/pods/44760008-f130-4688-8cd7-485d61491efb/volumes" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.962944 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-zltlr"] Dec 05 20:56:47 crc kubenswrapper[4747]: E1205 20:56:47.963354 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44760008-f130-4688-8cd7-485d61491efb" containerName="extract-utilities" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.963381 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="44760008-f130-4688-8cd7-485d61491efb" containerName="extract-utilities" Dec 05 20:56:47 crc kubenswrapper[4747]: E1205 20:56:47.963409 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44760008-f130-4688-8cd7-485d61491efb" containerName="registry-server" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.963419 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="44760008-f130-4688-8cd7-485d61491efb" containerName="registry-server" Dec 05 20:56:47 crc kubenswrapper[4747]: E1205 20:56:47.963430 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44760008-f130-4688-8cd7-485d61491efb" containerName="extract-content" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.963440 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="44760008-f130-4688-8cd7-485d61491efb" containerName="extract-content" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.963622 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="44760008-f130-4688-8cd7-485d61491efb" containerName="registry-server" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.964409 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-zltlr" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.966735 4747 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-xjfqv" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.968500 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.969287 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 05 20:56:47 crc kubenswrapper[4747]: I1205 20:56:47.978144 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-zltlr"] Dec 05 20:56:48 crc kubenswrapper[4747]: I1205 20:56:48.090925 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw5q7\" (UniqueName: \"kubernetes.io/projected/03ba7ba4-8a3a-47b4-8996-cb3f8f9642af-kube-api-access-hw5q7\") pod \"cert-manager-webhook-f4fb5df64-zltlr\" (UID: \"03ba7ba4-8a3a-47b4-8996-cb3f8f9642af\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-zltlr" Dec 05 20:56:48 crc kubenswrapper[4747]: I1205 20:56:48.091008 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03ba7ba4-8a3a-47b4-8996-cb3f8f9642af-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-zltlr\" (UID: \"03ba7ba4-8a3a-47b4-8996-cb3f8f9642af\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-zltlr" Dec 05 20:56:48 crc kubenswrapper[4747]: I1205 20:56:48.191876 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw5q7\" (UniqueName: \"kubernetes.io/projected/03ba7ba4-8a3a-47b4-8996-cb3f8f9642af-kube-api-access-hw5q7\") pod \"cert-manager-webhook-f4fb5df64-zltlr\" (UID: \"03ba7ba4-8a3a-47b4-8996-cb3f8f9642af\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-zltlr" Dec 05 20:56:48 crc kubenswrapper[4747]: I1205 20:56:48.191935 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03ba7ba4-8a3a-47b4-8996-cb3f8f9642af-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-zltlr\" (UID: \"03ba7ba4-8a3a-47b4-8996-cb3f8f9642af\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-zltlr" Dec 05 20:56:48 crc kubenswrapper[4747]: I1205 20:56:48.208105 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03ba7ba4-8a3a-47b4-8996-cb3f8f9642af-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-zltlr\" (UID: \"03ba7ba4-8a3a-47b4-8996-cb3f8f9642af\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-zltlr" Dec 05 20:56:48 crc kubenswrapper[4747]: I1205 20:56:48.216200 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw5q7\" (UniqueName: \"kubernetes.io/projected/03ba7ba4-8a3a-47b4-8996-cb3f8f9642af-kube-api-access-hw5q7\") pod \"cert-manager-webhook-f4fb5df64-zltlr\" (UID: \"03ba7ba4-8a3a-47b4-8996-cb3f8f9642af\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-zltlr" Dec 05 20:56:48 crc kubenswrapper[4747]: I1205 20:56:48.292013 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-zltlr" Dec 05 20:56:48 crc kubenswrapper[4747]: I1205 20:56:48.768574 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-zltlr"] Dec 05 20:56:49 crc kubenswrapper[4747]: I1205 20:56:49.189814 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-cr86z"] Dec 05 20:56:49 crc kubenswrapper[4747]: I1205 20:56:49.190721 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-cr86z" Dec 05 20:56:49 crc kubenswrapper[4747]: I1205 20:56:49.192446 4747 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2xrmp" Dec 05 20:56:49 crc kubenswrapper[4747]: I1205 20:56:49.199897 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-cr86z"] Dec 05 20:56:49 crc kubenswrapper[4747]: I1205 20:56:49.207452 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts276\" (UniqueName: \"kubernetes.io/projected/d2a92677-713b-4051-86dc-895902e83027-kube-api-access-ts276\") pod \"cert-manager-cainjector-855d9ccff4-cr86z\" (UID: \"d2a92677-713b-4051-86dc-895902e83027\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-cr86z" Dec 05 20:56:49 crc kubenswrapper[4747]: I1205 20:56:49.207505 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2a92677-713b-4051-86dc-895902e83027-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-cr86z\" (UID: \"d2a92677-713b-4051-86dc-895902e83027\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-cr86z" Dec 05 20:56:49 crc kubenswrapper[4747]: I1205 20:56:49.309372 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts276\" (UniqueName: \"kubernetes.io/projected/d2a92677-713b-4051-86dc-895902e83027-kube-api-access-ts276\") pod \"cert-manager-cainjector-855d9ccff4-cr86z\" (UID: \"d2a92677-713b-4051-86dc-895902e83027\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-cr86z" Dec 05 20:56:49 crc kubenswrapper[4747]: I1205 20:56:49.309683 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2a92677-713b-4051-86dc-895902e83027-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-cr86z\" (UID: \"d2a92677-713b-4051-86dc-895902e83027\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-cr86z" Dec 05 20:56:49 crc kubenswrapper[4747]: I1205 20:56:49.331099 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts276\" (UniqueName: \"kubernetes.io/projected/d2a92677-713b-4051-86dc-895902e83027-kube-api-access-ts276\") pod \"cert-manager-cainjector-855d9ccff4-cr86z\" (UID: \"d2a92677-713b-4051-86dc-895902e83027\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-cr86z" Dec 05 20:56:49 crc kubenswrapper[4747]: I1205 20:56:49.331617 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2a92677-713b-4051-86dc-895902e83027-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-cr86z\" (UID: \"d2a92677-713b-4051-86dc-895902e83027\") " 
pod="cert-manager/cert-manager-cainjector-855d9ccff4-cr86z" Dec 05 20:56:49 crc kubenswrapper[4747]: I1205 20:56:49.506387 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-cr86z" Dec 05 20:56:49 crc kubenswrapper[4747]: I1205 20:56:49.559682 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-zltlr" event={"ID":"03ba7ba4-8a3a-47b4-8996-cb3f8f9642af","Type":"ContainerStarted","Data":"497296fbb2f12667ca93561bbfbb21d2d4fa279426158305b89e70129f08b311"} Dec 05 20:56:49 crc kubenswrapper[4747]: I1205 20:56:49.956933 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-cr86z"] Dec 05 20:56:50 crc kubenswrapper[4747]: I1205 20:56:50.566704 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-cr86z" event={"ID":"d2a92677-713b-4051-86dc-895902e83027","Type":"ContainerStarted","Data":"e7c6e620aadbc335f0b4eeb4e58f976296c5ced59e2857f9e57a78d3a30144cc"} Dec 05 20:56:57 crc kubenswrapper[4747]: I1205 20:56:57.623794 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-cr86z" event={"ID":"d2a92677-713b-4051-86dc-895902e83027","Type":"ContainerStarted","Data":"02a43d18d42512d3c98fa98ed70806cf7043744fe9db1652bf7839fbf0789bda"} Dec 05 20:56:57 crc kubenswrapper[4747]: I1205 20:56:57.626786 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-zltlr" event={"ID":"03ba7ba4-8a3a-47b4-8996-cb3f8f9642af","Type":"ContainerStarted","Data":"271e0d2d4723298650d1308cc29e8ea26b81405e2d59aaf84f2be94d617e78b8"} Dec 05 20:56:57 crc kubenswrapper[4747]: I1205 20:56:57.626925 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-zltlr" Dec 05 20:56:57 crc kubenswrapper[4747]: I1205 20:56:57.642268 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-cr86z" podStartSLOduration=1.51490016 podStartE2EDuration="8.642251147s" podCreationTimestamp="2025-12-05 20:56:49 +0000 UTC" firstStartedPulling="2025-12-05 20:56:49.967295293 +0000 UTC m=+880.434602781" lastFinishedPulling="2025-12-05 20:56:57.09464628 +0000 UTC m=+887.561953768" observedRunningTime="2025-12-05 20:56:57.641869977 +0000 UTC m=+888.109177505" watchObservedRunningTime="2025-12-05 20:56:57.642251147 +0000 UTC m=+888.109558635" Dec 05 20:56:57 crc kubenswrapper[4747]: I1205 20:56:57.684834 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-zltlr" podStartSLOduration=2.399878331 podStartE2EDuration="10.68479995s" podCreationTimestamp="2025-12-05 20:56:47 +0000 UTC" firstStartedPulling="2025-12-05 20:56:48.782261102 +0000 UTC m=+879.249568590" lastFinishedPulling="2025-12-05 20:56:57.067182711 +0000 UTC m=+887.534490209" observedRunningTime="2025-12-05 20:56:57.67497892 +0000 UTC m=+888.142286408" watchObservedRunningTime="2025-12-05 20:56:57.68479995 +0000 UTC m=+888.152107458" Dec 05 20:57:00 crc kubenswrapper[4747]: I1205 20:57:00.995517 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-prvhw"] Dec 05 20:57:00 crc kubenswrapper[4747]: I1205 20:57:00.996547 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-prvhw" Dec 05 20:57:01 crc kubenswrapper[4747]: I1205 20:57:01.003416 4747 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8k886" Dec 05 20:57:01 crc kubenswrapper[4747]: I1205 20:57:01.015015 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-prvhw"] Dec 05 20:57:01 crc kubenswrapper[4747]: I1205 20:57:01.173544 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtsrd\" (UniqueName: \"kubernetes.io/projected/604860ec-c0f5-4280-a534-798d3b88cf8e-kube-api-access-rtsrd\") pod \"cert-manager-86cb77c54b-prvhw\" (UID: \"604860ec-c0f5-4280-a534-798d3b88cf8e\") " pod="cert-manager/cert-manager-86cb77c54b-prvhw" Dec 05 20:57:01 crc kubenswrapper[4747]: I1205 20:57:01.173667 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/604860ec-c0f5-4280-a534-798d3b88cf8e-bound-sa-token\") pod \"cert-manager-86cb77c54b-prvhw\" (UID: \"604860ec-c0f5-4280-a534-798d3b88cf8e\") " pod="cert-manager/cert-manager-86cb77c54b-prvhw" Dec 05 20:57:01 crc kubenswrapper[4747]: I1205 20:57:01.274494 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/604860ec-c0f5-4280-a534-798d3b88cf8e-bound-sa-token\") pod \"cert-manager-86cb77c54b-prvhw\" (UID: \"604860ec-c0f5-4280-a534-798d3b88cf8e\") " pod="cert-manager/cert-manager-86cb77c54b-prvhw" Dec 05 20:57:01 crc kubenswrapper[4747]: I1205 20:57:01.274672 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtsrd\" (UniqueName: \"kubernetes.io/projected/604860ec-c0f5-4280-a534-798d3b88cf8e-kube-api-access-rtsrd\") pod \"cert-manager-86cb77c54b-prvhw\" (UID: \"604860ec-c0f5-4280-a534-798d3b88cf8e\") " pod="cert-manager/cert-manager-86cb77c54b-prvhw" Dec 05 20:57:01 crc kubenswrapper[4747]: I1205 20:57:01.303671 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/604860ec-c0f5-4280-a534-798d3b88cf8e-bound-sa-token\") pod \"cert-manager-86cb77c54b-prvhw\" (UID: \"604860ec-c0f5-4280-a534-798d3b88cf8e\") " pod="cert-manager/cert-manager-86cb77c54b-prvhw" Dec 05 20:57:01 crc kubenswrapper[4747]: I1205 20:57:01.305208 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtsrd\" (UniqueName: \"kubernetes.io/projected/604860ec-c0f5-4280-a534-798d3b88cf8e-kube-api-access-rtsrd\") pod \"cert-manager-86cb77c54b-prvhw\" (UID: \"604860ec-c0f5-4280-a534-798d3b88cf8e\") " pod="cert-manager/cert-manager-86cb77c54b-prvhw" Dec 05 20:57:01 crc kubenswrapper[4747]: I1205 20:57:01.359884 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-prvhw" Dec 05 20:57:01 crc kubenswrapper[4747]: I1205 20:57:01.806863 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-prvhw"] Dec 05 20:57:01 crc kubenswrapper[4747]: W1205 20:57:01.814801 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod604860ec_c0f5_4280_a534_798d3b88cf8e.slice/crio-b2c6d5904a10d073a65e8f9aab1b44839d3e7315bd37b5440694b5f9e381d3e0 WatchSource:0}: Error finding container b2c6d5904a10d073a65e8f9aab1b44839d3e7315bd37b5440694b5f9e381d3e0: Status 404 returned error can't find the container with id b2c6d5904a10d073a65e8f9aab1b44839d3e7315bd37b5440694b5f9e381d3e0 Dec 05 20:57:02 crc kubenswrapper[4747]: I1205 20:57:02.664278 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-prvhw" event={"ID":"604860ec-c0f5-4280-a534-798d3b88cf8e","Type":"ContainerStarted","Data":"3b68304e922592aa0824ccad984ee78befbe1eec35f089ec5777c4241933287d"} Dec 05 20:57:02 crc kubenswrapper[4747]: I1205 20:57:02.664837 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-prvhw" event={"ID":"604860ec-c0f5-4280-a534-798d3b88cf8e","Type":"ContainerStarted","Data":"b2c6d5904a10d073a65e8f9aab1b44839d3e7315bd37b5440694b5f9e381d3e0"} Dec 05 20:57:02 crc kubenswrapper[4747]: I1205 20:57:02.684152 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-prvhw" podStartSLOduration=2.6841308870000002 podStartE2EDuration="2.684130887s" podCreationTimestamp="2025-12-05 20:57:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:57:02.681145141 +0000 UTC m=+893.148452639" watchObservedRunningTime="2025-12-05 20:57:02.684130887 +0000 UTC m=+893.151438385" Dec 05 20:57:03 crc kubenswrapper[4747]: I1205 20:57:03.296143 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-zltlr" Dec 05 20:57:06 crc kubenswrapper[4747]: I1205 20:57:06.221949 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:57:06 crc kubenswrapper[4747]: I1205 20:57:06.222716 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:57:06 crc kubenswrapper[4747]: I1205 20:57:06.566266 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-z4s9f"] Dec 05 20:57:06 crc kubenswrapper[4747]: I1205 20:57:06.567236 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-z4s9f" Dec 05 20:57:06 crc kubenswrapper[4747]: I1205 20:57:06.570390 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 05 20:57:06 crc kubenswrapper[4747]: I1205 20:57:06.570802 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 05 20:57:06 crc kubenswrapper[4747]: I1205 20:57:06.570945 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-fsvsb" Dec 05 20:57:06 crc kubenswrapper[4747]: I1205 20:57:06.577627 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z4s9f"] Dec 05 20:57:06 crc kubenswrapper[4747]: I1205 20:57:06.753422 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcnl4\" (UniqueName: \"kubernetes.io/projected/49804350-553a-412b-971c-c0d16cc83bc2-kube-api-access-hcnl4\") pod \"openstack-operator-index-z4s9f\" (UID: \"49804350-553a-412b-971c-c0d16cc83bc2\") " pod="openstack-operators/openstack-operator-index-z4s9f" Dec 05 20:57:06 crc kubenswrapper[4747]: I1205 20:57:06.855422 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcnl4\" (UniqueName: \"kubernetes.io/projected/49804350-553a-412b-971c-c0d16cc83bc2-kube-api-access-hcnl4\") pod \"openstack-operator-index-z4s9f\" (UID: \"49804350-553a-412b-971c-c0d16cc83bc2\") " pod="openstack-operators/openstack-operator-index-z4s9f" Dec 05 20:57:06 crc kubenswrapper[4747]: I1205 20:57:06.879726 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcnl4\" (UniqueName: \"kubernetes.io/projected/49804350-553a-412b-971c-c0d16cc83bc2-kube-api-access-hcnl4\") pod \"openstack-operator-index-z4s9f\" (UID: \"49804350-553a-412b-971c-c0d16cc83bc2\") " pod="openstack-operators/openstack-operator-index-z4s9f" Dec 05 20:57:06 crc kubenswrapper[4747]: I1205 20:57:06.902978 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-z4s9f" Dec 05 20:57:07 crc kubenswrapper[4747]: I1205 20:57:07.399783 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z4s9f"] Dec 05 20:57:07 crc kubenswrapper[4747]: W1205 20:57:07.416403 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49804350_553a_412b_971c_c0d16cc83bc2.slice/crio-2cdc2e801b5bf1cb184c9037a10bbb3667080b9cda3d1fe6657695c00c460c8b WatchSource:0}: Error finding container 2cdc2e801b5bf1cb184c9037a10bbb3667080b9cda3d1fe6657695c00c460c8b: Status 404 returned error can't find the container with id 2cdc2e801b5bf1cb184c9037a10bbb3667080b9cda3d1fe6657695c00c460c8b Dec 05 20:57:07 crc kubenswrapper[4747]: I1205 20:57:07.712660 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z4s9f" event={"ID":"49804350-553a-412b-971c-c0d16cc83bc2","Type":"ContainerStarted","Data":"2cdc2e801b5bf1cb184c9037a10bbb3667080b9cda3d1fe6657695c00c460c8b"} Dec 05 20:57:08 crc kubenswrapper[4747]: I1205 20:57:08.721493 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z4s9f" event={"ID":"49804350-553a-412b-971c-c0d16cc83bc2","Type":"ContainerStarted","Data":"6ea3a15d408f461f7930ab0e653c3da464cfa3aa1da848cba16502eaa3d7d916"} Dec 05 20:57:08 crc kubenswrapper[4747]: I1205 20:57:08.742218 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-z4s9f" podStartSLOduration=1.950020639 podStartE2EDuration="2.74219166s" podCreationTimestamp="2025-12-05 20:57:06 +0000 UTC" firstStartedPulling="2025-12-05 20:57:07.419542688 +0000 UTC m=+897.886850186" lastFinishedPulling="2025-12-05 20:57:08.211713719 +0000 UTC m=+898.679021207" observedRunningTime="2025-12-05 20:57:08.740410965 +0000 UTC m=+899.207718493" watchObservedRunningTime="2025-12-05 20:57:08.74219166 +0000 UTC m=+899.209499168" Dec 05 20:57:10 crc kubenswrapper[4747]: I1205 20:57:10.333361 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-z4s9f"] Dec 05 20:57:10 crc kubenswrapper[4747]: I1205 20:57:10.732271 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-z4s9f" podUID="49804350-553a-412b-971c-c0d16cc83bc2" containerName="registry-server" containerID="cri-o://6ea3a15d408f461f7930ab0e653c3da464cfa3aa1da848cba16502eaa3d7d916" gracePeriod=2 Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.147846 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-kk22k"] Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.149293 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kk22k" Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.163961 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kk22k"] Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.240572 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-z4s9f" Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.336108 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj8bd\" (UniqueName: \"kubernetes.io/projected/466f7493-21f0-40e0-addb-35e631da4792-kube-api-access-zj8bd\") pod \"openstack-operator-index-kk22k\" (UID: \"466f7493-21f0-40e0-addb-35e631da4792\") " pod="openstack-operators/openstack-operator-index-kk22k" Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.437147 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcnl4\" (UniqueName: \"kubernetes.io/projected/49804350-553a-412b-971c-c0d16cc83bc2-kube-api-access-hcnl4\") pod \"49804350-553a-412b-971c-c0d16cc83bc2\" (UID: \"49804350-553a-412b-971c-c0d16cc83bc2\") " Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.437346 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj8bd\" (UniqueName: \"kubernetes.io/projected/466f7493-21f0-40e0-addb-35e631da4792-kube-api-access-zj8bd\") pod \"openstack-operator-index-kk22k\" (UID: \"466f7493-21f0-40e0-addb-35e631da4792\") " pod="openstack-operators/openstack-operator-index-kk22k" Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.444721 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49804350-553a-412b-971c-c0d16cc83bc2-kube-api-access-hcnl4" (OuterVolumeSpecName: "kube-api-access-hcnl4") pod "49804350-553a-412b-971c-c0d16cc83bc2" (UID: "49804350-553a-412b-971c-c0d16cc83bc2"). InnerVolumeSpecName "kube-api-access-hcnl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.454906 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj8bd\" (UniqueName: \"kubernetes.io/projected/466f7493-21f0-40e0-addb-35e631da4792-kube-api-access-zj8bd\") pod \"openstack-operator-index-kk22k\" (UID: \"466f7493-21f0-40e0-addb-35e631da4792\") " pod="openstack-operators/openstack-operator-index-kk22k" Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.471251 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-kk22k" Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.538457 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcnl4\" (UniqueName: \"kubernetes.io/projected/49804350-553a-412b-971c-c0d16cc83bc2-kube-api-access-hcnl4\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.738133 4747 generic.go:334] "Generic (PLEG): container finished" podID="49804350-553a-412b-971c-c0d16cc83bc2" containerID="6ea3a15d408f461f7930ab0e653c3da464cfa3aa1da848cba16502eaa3d7d916" exitCode=0 Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.738297 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z4s9f" event={"ID":"49804350-553a-412b-971c-c0d16cc83bc2","Type":"ContainerDied","Data":"6ea3a15d408f461f7930ab0e653c3da464cfa3aa1da848cba16502eaa3d7d916"} Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.738499 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z4s9f" event={"ID":"49804350-553a-412b-971c-c0d16cc83bc2","Type":"ContainerDied","Data":"2cdc2e801b5bf1cb184c9037a10bbb3667080b9cda3d1fe6657695c00c460c8b"} Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.738519 4747 scope.go:117] "RemoveContainer" containerID="6ea3a15d408f461f7930ab0e653c3da464cfa3aa1da848cba16502eaa3d7d916" Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.738366 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-z4s9f" Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.756462 4747 scope.go:117] "RemoveContainer" containerID="6ea3a15d408f461f7930ab0e653c3da464cfa3aa1da848cba16502eaa3d7d916" Dec 05 20:57:11 crc kubenswrapper[4747]: E1205 20:57:11.756873 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea3a15d408f461f7930ab0e653c3da464cfa3aa1da848cba16502eaa3d7d916\": container with ID starting with 6ea3a15d408f461f7930ab0e653c3da464cfa3aa1da848cba16502eaa3d7d916 not found: ID does not exist" containerID="6ea3a15d408f461f7930ab0e653c3da464cfa3aa1da848cba16502eaa3d7d916" Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.756904 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea3a15d408f461f7930ab0e653c3da464cfa3aa1da848cba16502eaa3d7d916"} err="failed to get container status \"6ea3a15d408f461f7930ab0e653c3da464cfa3aa1da848cba16502eaa3d7d916\": rpc error: code = NotFound desc = could not find container \"6ea3a15d408f461f7930ab0e653c3da464cfa3aa1da848cba16502eaa3d7d916\": container with ID starting with 6ea3a15d408f461f7930ab0e653c3da464cfa3aa1da848cba16502eaa3d7d916 not found: ID does not exist" Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.767920 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-z4s9f"] Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.772885 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-z4s9f"] Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.846390 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49804350-553a-412b-971c-c0d16cc83bc2" path="/var/lib/kubelet/pods/49804350-553a-412b-971c-c0d16cc83bc2/volumes" Dec 05 20:57:11 crc kubenswrapper[4747]: I1205 20:57:11.858839 
4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kk22k"] Dec 05 20:57:11 crc kubenswrapper[4747]: W1205 20:57:11.867594 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod466f7493_21f0_40e0_addb_35e631da4792.slice/crio-af4f5a2b9a0972112a649148e3fedcb825c25e16151474d3865507bc8bd76e67 WatchSource:0}: Error finding container af4f5a2b9a0972112a649148e3fedcb825c25e16151474d3865507bc8bd76e67: Status 404 returned error can't find the container with id af4f5a2b9a0972112a649148e3fedcb825c25e16151474d3865507bc8bd76e67 Dec 05 20:57:12 crc kubenswrapper[4747]: I1205 20:57:12.751135 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kk22k" event={"ID":"466f7493-21f0-40e0-addb-35e631da4792","Type":"ContainerStarted","Data":"5410c9a8ff6b582041d57366e8990e864d3814b0a6d0dab6a79b4db6492ef840"} Dec 05 20:57:12 crc kubenswrapper[4747]: I1205 20:57:12.751880 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kk22k" event={"ID":"466f7493-21f0-40e0-addb-35e631da4792","Type":"ContainerStarted","Data":"af4f5a2b9a0972112a649148e3fedcb825c25e16151474d3865507bc8bd76e67"} Dec 05 20:57:12 crc kubenswrapper[4747]: I1205 20:57:12.770809 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-kk22k" podStartSLOduration=1.388280316 podStartE2EDuration="1.770781441s" podCreationTimestamp="2025-12-05 20:57:11 +0000 UTC" firstStartedPulling="2025-12-05 20:57:11.870281803 +0000 UTC m=+902.337589291" lastFinishedPulling="2025-12-05 20:57:12.252782908 +0000 UTC m=+902.720090416" observedRunningTime="2025-12-05 20:57:12.765168578 +0000 UTC m=+903.232476056" watchObservedRunningTime="2025-12-05 20:57:12.770781441 +0000 UTC m=+903.238088949" Dec 05 20:57:12 crc kubenswrapper[4747]: I1205 20:57:12.945953 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vc8j2"] Dec 05 20:57:12 crc kubenswrapper[4747]: E1205 20:57:12.946302 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49804350-553a-412b-971c-c0d16cc83bc2" containerName="registry-server" Dec 05 20:57:12 crc kubenswrapper[4747]: I1205 20:57:12.946334 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="49804350-553a-412b-971c-c0d16cc83bc2" containerName="registry-server" Dec 05 20:57:12 crc kubenswrapper[4747]: I1205 20:57:12.946534 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="49804350-553a-412b-971c-c0d16cc83bc2" containerName="registry-server" Dec 05 20:57:12 crc kubenswrapper[4747]: I1205 20:57:12.947934 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:12 crc kubenswrapper[4747]: I1205 20:57:12.961805 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vc8j2"] Dec 05 20:57:13 crc kubenswrapper[4747]: I1205 20:57:13.056511 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab5854d-f62b-454e-98c9-7dad255e933d-utilities\") pod \"certified-operators-vc8j2\" (UID: \"0ab5854d-f62b-454e-98c9-7dad255e933d\") " pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:13 crc kubenswrapper[4747]: I1205 20:57:13.057052 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qrwq\" (UniqueName: \"kubernetes.io/projected/0ab5854d-f62b-454e-98c9-7dad255e933d-kube-api-access-5qrwq\") pod \"certified-operators-vc8j2\" (UID: \"0ab5854d-f62b-454e-98c9-7dad255e933d\") " pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:13 crc kubenswrapper[4747]: I1205 20:57:13.057134 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab5854d-f62b-454e-98c9-7dad255e933d-catalog-content\") pod \"certified-operators-vc8j2\" (UID: \"0ab5854d-f62b-454e-98c9-7dad255e933d\") " pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:13 crc kubenswrapper[4747]: I1205 20:57:13.158675 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab5854d-f62b-454e-98c9-7dad255e933d-utilities\") pod \"certified-operators-vc8j2\" (UID: \"0ab5854d-f62b-454e-98c9-7dad255e933d\") " pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:13 crc kubenswrapper[4747]: I1205 20:57:13.158773 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qrwq\" (UniqueName: \"kubernetes.io/projected/0ab5854d-f62b-454e-98c9-7dad255e933d-kube-api-access-5qrwq\") pod \"certified-operators-vc8j2\" (UID: \"0ab5854d-f62b-454e-98c9-7dad255e933d\") " pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:13 crc kubenswrapper[4747]: I1205 20:57:13.158877 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab5854d-f62b-454e-98c9-7dad255e933d-catalog-content\") pod \"certified-operators-vc8j2\" (UID: \"0ab5854d-f62b-454e-98c9-7dad255e933d\") " pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:13 crc kubenswrapper[4747]: I1205 20:57:13.159666 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab5854d-f62b-454e-98c9-7dad255e933d-utilities\") pod \"certified-operators-vc8j2\" (UID: \"0ab5854d-f62b-454e-98c9-7dad255e933d\") " pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:13 crc kubenswrapper[4747]: I1205 20:57:13.159770 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab5854d-f62b-454e-98c9-7dad255e933d-catalog-content\") pod \"certified-operators-vc8j2\" (UID: \"0ab5854d-f62b-454e-98c9-7dad255e933d\") " pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:13 crc kubenswrapper[4747]: I1205 20:57:13.188789 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5qrwq\" (UniqueName: \"kubernetes.io/projected/0ab5854d-f62b-454e-98c9-7dad255e933d-kube-api-access-5qrwq\") pod \"certified-operators-vc8j2\" (UID: \"0ab5854d-f62b-454e-98c9-7dad255e933d\") " pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:13 crc kubenswrapper[4747]: I1205 20:57:13.265179 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:13 crc kubenswrapper[4747]: I1205 20:57:13.815084 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vc8j2"] Dec 05 20:57:13 crc kubenswrapper[4747]: W1205 20:57:13.823551 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ab5854d_f62b_454e_98c9_7dad255e933d.slice/crio-dcaafbf77a67f98919a5ee572859a0ad012ad04d470f234db09f659d967f6aca WatchSource:0}: Error finding container dcaafbf77a67f98919a5ee572859a0ad012ad04d470f234db09f659d967f6aca: Status 404 returned error can't find the container with id dcaafbf77a67f98919a5ee572859a0ad012ad04d470f234db09f659d967f6aca Dec 05 20:57:14 crc kubenswrapper[4747]: I1205 20:57:14.765892 4747 generic.go:334] "Generic (PLEG): container finished" podID="0ab5854d-f62b-454e-98c9-7dad255e933d" containerID="41633e2e1a1fd7206e4b465467fb055821bda2f00edcba5fe14bda4f3bd952ea" exitCode=0 Dec 05 20:57:14 crc kubenswrapper[4747]: I1205 20:57:14.766003 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc8j2" event={"ID":"0ab5854d-f62b-454e-98c9-7dad255e933d","Type":"ContainerDied","Data":"41633e2e1a1fd7206e4b465467fb055821bda2f00edcba5fe14bda4f3bd952ea"} Dec 05 20:57:14 crc kubenswrapper[4747]: I1205 20:57:14.766272 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc8j2" event={"ID":"0ab5854d-f62b-454e-98c9-7dad255e933d","Type":"ContainerStarted","Data":"dcaafbf77a67f98919a5ee572859a0ad012ad04d470f234db09f659d967f6aca"} Dec 05 20:57:15 crc kubenswrapper[4747]: I1205 20:57:15.775757 4747 generic.go:334] "Generic (PLEG): container finished" podID="0ab5854d-f62b-454e-98c9-7dad255e933d" containerID="8e77d88907eb953c7d6a2eca6356e3390df14f5df9ae74595e36f55c7350c5b0" exitCode=0 Dec 05 20:57:15 crc kubenswrapper[4747]: I1205 20:57:15.775856 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc8j2" event={"ID":"0ab5854d-f62b-454e-98c9-7dad255e933d","Type":"ContainerDied","Data":"8e77d88907eb953c7d6a2eca6356e3390df14f5df9ae74595e36f55c7350c5b0"} Dec 05 20:57:16 crc kubenswrapper[4747]: I1205 20:57:16.785431 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc8j2" event={"ID":"0ab5854d-f62b-454e-98c9-7dad255e933d","Type":"ContainerStarted","Data":"a49bd182b8fe627202615343cd9eab2f57db320d3faeef5ffd1a539cd8783106"} Dec 05 20:57:16 crc kubenswrapper[4747]: I1205 20:57:16.809246 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vc8j2" podStartSLOduration=3.423460774 podStartE2EDuration="4.809227352s" podCreationTimestamp="2025-12-05 20:57:12 +0000 UTC" firstStartedPulling="2025-12-05 20:57:14.767600892 +0000 UTC m=+905.234908380" lastFinishedPulling="2025-12-05 20:57:16.15336747 +0000 UTC m=+906.620674958" observedRunningTime="2025-12-05 20:57:16.805189009 +0000 UTC 
m=+907.272496497" watchObservedRunningTime="2025-12-05 20:57:16.809227352 +0000 UTC m=+907.276534840" Dec 05 20:57:21 crc kubenswrapper[4747]: I1205 20:57:21.471868 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-kk22k" Dec 05 20:57:21 crc kubenswrapper[4747]: I1205 20:57:21.472292 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-kk22k" Dec 05 20:57:21 crc kubenswrapper[4747]: I1205 20:57:21.502892 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-kk22k" Dec 05 20:57:21 crc kubenswrapper[4747]: I1205 20:57:21.852184 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-kk22k" Dec 05 20:57:23 crc kubenswrapper[4747]: I1205 20:57:23.265797 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:23 crc kubenswrapper[4747]: I1205 20:57:23.265861 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:23 crc kubenswrapper[4747]: I1205 20:57:23.329389 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:23 crc kubenswrapper[4747]: I1205 20:57:23.868668 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:25 crc kubenswrapper[4747]: I1205 20:57:25.205725 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp"] Dec 05 20:57:25 crc kubenswrapper[4747]: I1205 20:57:25.207417 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" Dec 05 20:57:25 crc kubenswrapper[4747]: I1205 20:57:25.212520 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-q78dl" Dec 05 20:57:25 crc kubenswrapper[4747]: I1205 20:57:25.224970 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp"] Dec 05 20:57:25 crc kubenswrapper[4747]: I1205 20:57:25.364478 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpjwv\" (UniqueName: \"kubernetes.io/projected/0bea60ba-9571-4b9e-8484-5da33bd67047-kube-api-access-lpjwv\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp\" (UID: \"0bea60ba-9571-4b9e-8484-5da33bd67047\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" Dec 05 20:57:25 crc kubenswrapper[4747]: I1205 20:57:25.364614 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0bea60ba-9571-4b9e-8484-5da33bd67047-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp\" (UID: \"0bea60ba-9571-4b9e-8484-5da33bd67047\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" Dec 05 20:57:25 crc kubenswrapper[4747]: I1205 20:57:25.364737 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0bea60ba-9571-4b9e-8484-5da33bd67047-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp\" (UID: \"0bea60ba-9571-4b9e-8484-5da33bd67047\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" Dec 05 20:57:25 crc kubenswrapper[4747]: I1205 20:57:25.466181 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0bea60ba-9571-4b9e-8484-5da33bd67047-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp\" (UID: \"0bea60ba-9571-4b9e-8484-5da33bd67047\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" Dec 05 20:57:25 crc kubenswrapper[4747]: I1205 20:57:25.466800 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpjwv\" (UniqueName: \"kubernetes.io/projected/0bea60ba-9571-4b9e-8484-5da33bd67047-kube-api-access-lpjwv\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp\" (UID: \"0bea60ba-9571-4b9e-8484-5da33bd67047\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" Dec 05 20:57:25 crc kubenswrapper[4747]: I1205 20:57:25.467093 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0bea60ba-9571-4b9e-8484-5da33bd67047-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp\" (UID: \"0bea60ba-9571-4b9e-8484-5da33bd67047\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" Dec 05 20:57:25 crc kubenswrapper[4747]: I1205 20:57:25.467158 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/0bea60ba-9571-4b9e-8484-5da33bd67047-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp\" (UID: \"0bea60ba-9571-4b9e-8484-5da33bd67047\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" Dec 05 20:57:25 crc kubenswrapper[4747]: I1205 20:57:25.467778 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0bea60ba-9571-4b9e-8484-5da33bd67047-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp\" (UID: \"0bea60ba-9571-4b9e-8484-5da33bd67047\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" Dec 05 20:57:25 crc kubenswrapper[4747]: I1205 20:57:25.492931 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpjwv\" (UniqueName: \"kubernetes.io/projected/0bea60ba-9571-4b9e-8484-5da33bd67047-kube-api-access-lpjwv\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp\" (UID: \"0bea60ba-9571-4b9e-8484-5da33bd67047\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" Dec 05 20:57:25 crc kubenswrapper[4747]: I1205 20:57:25.527229 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" Dec 05 20:57:25 crc kubenswrapper[4747]: I1205 20:57:25.971342 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp"] Dec 05 20:57:26 crc kubenswrapper[4747]: I1205 20:57:26.533979 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vc8j2"] Dec 05 20:57:26 crc kubenswrapper[4747]: I1205 20:57:26.534571 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vc8j2" podUID="0ab5854d-f62b-454e-98c9-7dad255e933d" containerName="registry-server" containerID="cri-o://a49bd182b8fe627202615343cd9eab2f57db320d3faeef5ffd1a539cd8783106" gracePeriod=2 Dec 05 20:57:26 crc kubenswrapper[4747]: I1205 20:57:26.854136 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" event={"ID":"0bea60ba-9571-4b9e-8484-5da33bd67047","Type":"ContainerStarted","Data":"9854c956b14a5b3b33450b0be30e74edf7130f8b143f5105a907899a9a30715e"} Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.550378 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.696516 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab5854d-f62b-454e-98c9-7dad255e933d-catalog-content\") pod \"0ab5854d-f62b-454e-98c9-7dad255e933d\" (UID: \"0ab5854d-f62b-454e-98c9-7dad255e933d\") " Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.696687 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qrwq\" (UniqueName: \"kubernetes.io/projected/0ab5854d-f62b-454e-98c9-7dad255e933d-kube-api-access-5qrwq\") pod \"0ab5854d-f62b-454e-98c9-7dad255e933d\" (UID: \"0ab5854d-f62b-454e-98c9-7dad255e933d\") " Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.696781 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab5854d-f62b-454e-98c9-7dad255e933d-utilities\") pod \"0ab5854d-f62b-454e-98c9-7dad255e933d\" (UID: \"0ab5854d-f62b-454e-98c9-7dad255e933d\") " Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.697835 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ab5854d-f62b-454e-98c9-7dad255e933d-utilities" (OuterVolumeSpecName: "utilities") pod "0ab5854d-f62b-454e-98c9-7dad255e933d" (UID: "0ab5854d-f62b-454e-98c9-7dad255e933d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.706814 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab5854d-f62b-454e-98c9-7dad255e933d-kube-api-access-5qrwq" (OuterVolumeSpecName: "kube-api-access-5qrwq") pod "0ab5854d-f62b-454e-98c9-7dad255e933d" (UID: "0ab5854d-f62b-454e-98c9-7dad255e933d"). InnerVolumeSpecName "kube-api-access-5qrwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.760320 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ab5854d-f62b-454e-98c9-7dad255e933d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ab5854d-f62b-454e-98c9-7dad255e933d" (UID: "0ab5854d-f62b-454e-98c9-7dad255e933d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.798768 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qrwq\" (UniqueName: \"kubernetes.io/projected/0ab5854d-f62b-454e-98c9-7dad255e933d-kube-api-access-5qrwq\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.798826 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab5854d-f62b-454e-98c9-7dad255e933d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.798839 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab5854d-f62b-454e-98c9-7dad255e933d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.861323 4747 generic.go:334] "Generic (PLEG): container finished" podID="0ab5854d-f62b-454e-98c9-7dad255e933d" containerID="a49bd182b8fe627202615343cd9eab2f57db320d3faeef5ffd1a539cd8783106" exitCode=0 Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.861400 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc8j2" event={"ID":"0ab5854d-f62b-454e-98c9-7dad255e933d","Type":"ContainerDied","Data":"a49bd182b8fe627202615343cd9eab2f57db320d3faeef5ffd1a539cd8783106"} Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.861417 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vc8j2" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.861458 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc8j2" event={"ID":"0ab5854d-f62b-454e-98c9-7dad255e933d","Type":"ContainerDied","Data":"dcaafbf77a67f98919a5ee572859a0ad012ad04d470f234db09f659d967f6aca"} Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.861482 4747 scope.go:117] "RemoveContainer" containerID="a49bd182b8fe627202615343cd9eab2f57db320d3faeef5ffd1a539cd8783106" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.863239 4747 generic.go:334] "Generic (PLEG): container finished" podID="0bea60ba-9571-4b9e-8484-5da33bd67047" containerID="b27c854a80483627da0b6a1919108048a9d0ffaa378eeae2df8732fd1bffbf20" exitCode=0 Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.863269 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" event={"ID":"0bea60ba-9571-4b9e-8484-5da33bd67047","Type":"ContainerDied","Data":"b27c854a80483627da0b6a1919108048a9d0ffaa378eeae2df8732fd1bffbf20"} Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.877797 4747 scope.go:117] "RemoveContainer" containerID="8e77d88907eb953c7d6a2eca6356e3390df14f5df9ae74595e36f55c7350c5b0" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.907621 4747 scope.go:117] "RemoveContainer" containerID="41633e2e1a1fd7206e4b465467fb055821bda2f00edcba5fe14bda4f3bd952ea" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.921100 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vc8j2"] Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.925612 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vc8j2"] Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.935945 4747 scope.go:117] 
"RemoveContainer" containerID="a49bd182b8fe627202615343cd9eab2f57db320d3faeef5ffd1a539cd8783106" Dec 05 20:57:27 crc kubenswrapper[4747]: E1205 20:57:27.936470 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a49bd182b8fe627202615343cd9eab2f57db320d3faeef5ffd1a539cd8783106\": container with ID starting with a49bd182b8fe627202615343cd9eab2f57db320d3faeef5ffd1a539cd8783106 not found: ID does not exist" containerID="a49bd182b8fe627202615343cd9eab2f57db320d3faeef5ffd1a539cd8783106" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.936545 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a49bd182b8fe627202615343cd9eab2f57db320d3faeef5ffd1a539cd8783106"} err="failed to get container status \"a49bd182b8fe627202615343cd9eab2f57db320d3faeef5ffd1a539cd8783106\": rpc error: code = NotFound desc = could not find container \"a49bd182b8fe627202615343cd9eab2f57db320d3faeef5ffd1a539cd8783106\": container with ID starting with a49bd182b8fe627202615343cd9eab2f57db320d3faeef5ffd1a539cd8783106 not found: ID does not exist" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.936618 4747 scope.go:117] "RemoveContainer" containerID="8e77d88907eb953c7d6a2eca6356e3390df14f5df9ae74595e36f55c7350c5b0" Dec 05 20:57:27 crc kubenswrapper[4747]: E1205 20:57:27.937027 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e77d88907eb953c7d6a2eca6356e3390df14f5df9ae74595e36f55c7350c5b0\": container with ID starting with 8e77d88907eb953c7d6a2eca6356e3390df14f5df9ae74595e36f55c7350c5b0 not found: ID does not exist" containerID="8e77d88907eb953c7d6a2eca6356e3390df14f5df9ae74595e36f55c7350c5b0" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.937060 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e77d88907eb953c7d6a2eca6356e3390df14f5df9ae74595e36f55c7350c5b0"} err="failed to get container status \"8e77d88907eb953c7d6a2eca6356e3390df14f5df9ae74595e36f55c7350c5b0\": rpc error: code = NotFound desc = could not find container \"8e77d88907eb953c7d6a2eca6356e3390df14f5df9ae74595e36f55c7350c5b0\": container with ID starting with 8e77d88907eb953c7d6a2eca6356e3390df14f5df9ae74595e36f55c7350c5b0 not found: ID does not exist" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.937084 4747 scope.go:117] "RemoveContainer" containerID="41633e2e1a1fd7206e4b465467fb055821bda2f00edcba5fe14bda4f3bd952ea" Dec 05 20:57:27 crc kubenswrapper[4747]: E1205 20:57:27.937531 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41633e2e1a1fd7206e4b465467fb055821bda2f00edcba5fe14bda4f3bd952ea\": container with ID starting with 41633e2e1a1fd7206e4b465467fb055821bda2f00edcba5fe14bda4f3bd952ea not found: ID does not exist" containerID="41633e2e1a1fd7206e4b465467fb055821bda2f00edcba5fe14bda4f3bd952ea" Dec 05 20:57:27 crc kubenswrapper[4747]: I1205 20:57:27.937589 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41633e2e1a1fd7206e4b465467fb055821bda2f00edcba5fe14bda4f3bd952ea"} err="failed to get container status \"41633e2e1a1fd7206e4b465467fb055821bda2f00edcba5fe14bda4f3bd952ea\": rpc error: code = NotFound desc = could not find container \"41633e2e1a1fd7206e4b465467fb055821bda2f00edcba5fe14bda4f3bd952ea\": container with ID starting with 
41633e2e1a1fd7206e4b465467fb055821bda2f00edcba5fe14bda4f3bd952ea not found: ID does not exist" Dec 05 20:57:28 crc kubenswrapper[4747]: I1205 20:57:28.873486 4747 generic.go:334] "Generic (PLEG): container finished" podID="0bea60ba-9571-4b9e-8484-5da33bd67047" containerID="ca5600cb41bb62b9acaaabe89014945d5d55456c79253ffddfc4d9d455e2385b" exitCode=0 Dec 05 20:57:28 crc kubenswrapper[4747]: I1205 20:57:28.873857 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" event={"ID":"0bea60ba-9571-4b9e-8484-5da33bd67047","Type":"ContainerDied","Data":"ca5600cb41bb62b9acaaabe89014945d5d55456c79253ffddfc4d9d455e2385b"} Dec 05 20:57:29 crc kubenswrapper[4747]: I1205 20:57:29.851214 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ab5854d-f62b-454e-98c9-7dad255e933d" path="/var/lib/kubelet/pods/0ab5854d-f62b-454e-98c9-7dad255e933d/volumes" Dec 05 20:57:29 crc kubenswrapper[4747]: I1205 20:57:29.886562 4747 generic.go:334] "Generic (PLEG): container finished" podID="0bea60ba-9571-4b9e-8484-5da33bd67047" containerID="0b61987e0522a11fb7a1ee13cf5b0792466724cb8dbf2228f25b33bdde27843f" exitCode=0 Dec 05 20:57:29 crc kubenswrapper[4747]: I1205 20:57:29.886632 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" event={"ID":"0bea60ba-9571-4b9e-8484-5da33bd67047","Type":"ContainerDied","Data":"0b61987e0522a11fb7a1ee13cf5b0792466724cb8dbf2228f25b33bdde27843f"} Dec 05 20:57:31 crc kubenswrapper[4747]: I1205 20:57:31.189644 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" Dec 05 20:57:31 crc kubenswrapper[4747]: I1205 20:57:31.388876 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0bea60ba-9571-4b9e-8484-5da33bd67047-bundle\") pod \"0bea60ba-9571-4b9e-8484-5da33bd67047\" (UID: \"0bea60ba-9571-4b9e-8484-5da33bd67047\") " Dec 05 20:57:31 crc kubenswrapper[4747]: I1205 20:57:31.389174 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpjwv\" (UniqueName: \"kubernetes.io/projected/0bea60ba-9571-4b9e-8484-5da33bd67047-kube-api-access-lpjwv\") pod \"0bea60ba-9571-4b9e-8484-5da33bd67047\" (UID: \"0bea60ba-9571-4b9e-8484-5da33bd67047\") " Dec 05 20:57:31 crc kubenswrapper[4747]: I1205 20:57:31.389213 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0bea60ba-9571-4b9e-8484-5da33bd67047-util\") pod \"0bea60ba-9571-4b9e-8484-5da33bd67047\" (UID: \"0bea60ba-9571-4b9e-8484-5da33bd67047\") " Dec 05 20:57:31 crc kubenswrapper[4747]: I1205 20:57:31.391511 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bea60ba-9571-4b9e-8484-5da33bd67047-bundle" (OuterVolumeSpecName: "bundle") pod "0bea60ba-9571-4b9e-8484-5da33bd67047" (UID: "0bea60ba-9571-4b9e-8484-5da33bd67047"). InnerVolumeSpecName "bundle". 
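
The three "RemoveContainer" / "not found: ID does not exist" pairs above (20:57:27.936-937) are benign: by the time the second delete attempt ran, the containers were already gone, and the kubelet treats a CRI NotFound as success. A minimal sketch of that idempotent-delete pattern — the `runtimeService` interface and function names here are invented for illustration, not kubelet's actual code:

```go
package containergc

import (
	"context"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// runtimeService stands in for the CRI runtime client; only the one
// method this sketch needs is modeled (hypothetical interface).
type runtimeService interface {
	RemoveContainer(ctx context.Context, containerID string) error
}

// removeContainerIdempotent deletes a container, treating "already
// gone" (gRPC NotFound) as success, mirroring how the kubelet logs
// the error above but proceeds anyway.
func removeContainerIdempotent(ctx context.Context, rt runtimeService, id string) error {
	err := rt.RemoveContainer(ctx, id)
	if err == nil {
		return nil
	}
	if status.Code(err) == codes.NotFound {
		// A racing sync or GC pass removed it first; that is fine.
		return nil
	}
	return fmt.Errorf("removing container %s: %w", id, err)
}
```
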
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:57:31 crc kubenswrapper[4747]: I1205 20:57:31.399179 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bea60ba-9571-4b9e-8484-5da33bd67047-kube-api-access-lpjwv" (OuterVolumeSpecName: "kube-api-access-lpjwv") pod "0bea60ba-9571-4b9e-8484-5da33bd67047" (UID: "0bea60ba-9571-4b9e-8484-5da33bd67047"). InnerVolumeSpecName "kube-api-access-lpjwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:57:31 crc kubenswrapper[4747]: I1205 20:57:31.412793 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bea60ba-9571-4b9e-8484-5da33bd67047-util" (OuterVolumeSpecName: "util") pod "0bea60ba-9571-4b9e-8484-5da33bd67047" (UID: "0bea60ba-9571-4b9e-8484-5da33bd67047"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:57:31 crc kubenswrapper[4747]: I1205 20:57:31.490193 4747 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0bea60ba-9571-4b9e-8484-5da33bd67047-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:31 crc kubenswrapper[4747]: I1205 20:57:31.490245 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpjwv\" (UniqueName: \"kubernetes.io/projected/0bea60ba-9571-4b9e-8484-5da33bd67047-kube-api-access-lpjwv\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:31 crc kubenswrapper[4747]: I1205 20:57:31.490266 4747 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0bea60ba-9571-4b9e-8484-5da33bd67047-util\") on node \"crc\" DevicePath \"\"" Dec 05 20:57:31 crc kubenswrapper[4747]: I1205 20:57:31.904321 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" event={"ID":"0bea60ba-9571-4b9e-8484-5da33bd67047","Type":"ContainerDied","Data":"9854c956b14a5b3b33450b0be30e74edf7130f8b143f5105a907899a9a30715e"} Dec 05 20:57:31 crc kubenswrapper[4747]: I1205 20:57:31.904384 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9854c956b14a5b3b33450b0be30e74edf7130f8b143f5105a907899a9a30715e" Dec 05 20:57:31 crc kubenswrapper[4747]: I1205 20:57:31.904395 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.222122 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.222691 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.420290 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-mjqc9"] Dec 05 20:57:36 crc kubenswrapper[4747]: E1205 20:57:36.420526 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab5854d-f62b-454e-98c9-7dad255e933d" containerName="registry-server" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.420541 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab5854d-f62b-454e-98c9-7dad255e933d" containerName="registry-server" Dec 05 20:57:36 crc kubenswrapper[4747]: E1205 20:57:36.420557 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab5854d-f62b-454e-98c9-7dad255e933d" containerName="extract-content" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.420563 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab5854d-f62b-454e-98c9-7dad255e933d" containerName="extract-content" Dec 05 20:57:36 crc kubenswrapper[4747]: E1205 20:57:36.420589 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab5854d-f62b-454e-98c9-7dad255e933d" containerName="extract-utilities" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.420596 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab5854d-f62b-454e-98c9-7dad255e933d" containerName="extract-utilities" Dec 05 20:57:36 crc kubenswrapper[4747]: E1205 20:57:36.420606 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bea60ba-9571-4b9e-8484-5da33bd67047" containerName="util" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.420613 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bea60ba-9571-4b9e-8484-5da33bd67047" containerName="util" Dec 05 20:57:36 crc kubenswrapper[4747]: E1205 20:57:36.420623 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bea60ba-9571-4b9e-8484-5da33bd67047" containerName="pull" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.420629 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bea60ba-9571-4b9e-8484-5da33bd67047" containerName="pull" Dec 05 20:57:36 crc kubenswrapper[4747]: E1205 20:57:36.420636 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bea60ba-9571-4b9e-8484-5da33bd67047" containerName="extract" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.420641 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bea60ba-9571-4b9e-8484-5da33bd67047" containerName="extract" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.420747 4747 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0ab5854d-f62b-454e-98c9-7dad255e933d" containerName="registry-server" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.420757 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bea60ba-9571-4b9e-8484-5da33bd67047" containerName="extract" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.421185 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-mjqc9" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.425381 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-l6bm2" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.456322 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-mjqc9"] Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.470374 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tztwl\" (UniqueName: \"kubernetes.io/projected/9a778748-0d28-41ea-9d92-ba2e95f46a80-kube-api-access-tztwl\") pod \"openstack-operator-controller-operator-55b6fb9447-mjqc9\" (UID: \"9a778748-0d28-41ea-9d92-ba2e95f46a80\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-mjqc9" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.571545 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tztwl\" (UniqueName: \"kubernetes.io/projected/9a778748-0d28-41ea-9d92-ba2e95f46a80-kube-api-access-tztwl\") pod \"openstack-operator-controller-operator-55b6fb9447-mjqc9\" (UID: \"9a778748-0d28-41ea-9d92-ba2e95f46a80\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-mjqc9" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.611354 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tztwl\" (UniqueName: \"kubernetes.io/projected/9a778748-0d28-41ea-9d92-ba2e95f46a80-kube-api-access-tztwl\") pod \"openstack-operator-controller-operator-55b6fb9447-mjqc9\" (UID: \"9a778748-0d28-41ea-9d92-ba2e95f46a80\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-mjqc9" Dec 05 20:57:36 crc kubenswrapper[4747]: I1205 20:57:36.740150 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-mjqc9" Dec 05 20:57:37 crc kubenswrapper[4747]: I1205 20:57:37.220743 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-mjqc9"] Dec 05 20:57:37 crc kubenswrapper[4747]: I1205 20:57:37.963628 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-mjqc9" event={"ID":"9a778748-0d28-41ea-9d92-ba2e95f46a80","Type":"ContainerStarted","Data":"4c10a610bb88569d0f5eca0d86b3c7f68affc95c7c7b1a023a43b80042db7d73"} Dec 05 20:57:43 crc kubenswrapper[4747]: I1205 20:57:43.016145 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-mjqc9" event={"ID":"9a778748-0d28-41ea-9d92-ba2e95f46a80","Type":"ContainerStarted","Data":"7c9333cc47a88bb8aebabddcac12f56cc4a1a74119f53cc0b73eed60946415ab"} Dec 05 20:57:43 crc kubenswrapper[4747]: I1205 20:57:43.016488 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-mjqc9" Dec 05 20:57:43 crc kubenswrapper[4747]: I1205 20:57:43.056932 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-mjqc9" podStartSLOduration=1.5649170030000001 podStartE2EDuration="7.056895934s" podCreationTimestamp="2025-12-05 20:57:36 +0000 UTC" firstStartedPulling="2025-12-05 20:57:37.226128464 +0000 UTC m=+927.693435962" lastFinishedPulling="2025-12-05 20:57:42.718107405 +0000 UTC m=+933.185414893" observedRunningTime="2025-12-05 20:57:43.040748225 +0000 UTC m=+933.508055773" watchObservedRunningTime="2025-12-05 20:57:43.056895934 +0000 UTC m=+933.524203462" Dec 05 20:57:56 crc kubenswrapper[4747]: I1205 20:57:56.744875 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-mjqc9" Dec 05 20:58:06 crc kubenswrapper[4747]: I1205 20:58:06.221640 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:58:06 crc kubenswrapper[4747]: I1205 20:58:06.222177 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:58:06 crc kubenswrapper[4747]: I1205 20:58:06.222218 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 20:58:06 crc kubenswrapper[4747]: I1205 20:58:06.222689 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7f4bd613004b53750fa0f6da7c8719b897b6bb46a34063c5d37309224cc6de70"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:58:06 crc kubenswrapper[4747]: I1205 
20:58:06.222737 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://7f4bd613004b53750fa0f6da7c8719b897b6bb46a34063c5d37309224cc6de70" gracePeriod=600 Dec 05 20:58:07 crc kubenswrapper[4747]: I1205 20:58:07.200227 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="7f4bd613004b53750fa0f6da7c8719b897b6bb46a34063c5d37309224cc6de70" exitCode=0 Dec 05 20:58:07 crc kubenswrapper[4747]: I1205 20:58:07.200305 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"7f4bd613004b53750fa0f6da7c8719b897b6bb46a34063c5d37309224cc6de70"} Dec 05 20:58:07 crc kubenswrapper[4747]: I1205 20:58:07.200730 4747 scope.go:117] "RemoveContainer" containerID="ceb16b96a12723254ee4deb822f6ff1371e5e51859e0f470b05fcb9c79ec859d" Dec 05 20:58:08 crc kubenswrapper[4747]: I1205 20:58:08.210229 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"7110e5080e3d96f1039156c86380e9d5d08ce0c06b6df7c2aa14c96f4b79a9a1"} Dec 05 20:58:14 crc kubenswrapper[4747]: I1205 20:58:14.926565 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-p9kpn"] Dec 05 20:58:14 crc kubenswrapper[4747]: I1205 20:58:14.928366 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-p9kpn" Dec 05 20:58:14 crc kubenswrapper[4747]: I1205 20:58:14.930743 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-kzsrs" Dec 05 20:58:14 crc kubenswrapper[4747]: I1205 20:58:14.931555 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-lgxh8"] Dec 05 20:58:14 crc kubenswrapper[4747]: I1205 20:58:14.932829 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lgxh8" Dec 05 20:58:14 crc kubenswrapper[4747]: I1205 20:58:14.936374 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-p9kpn"] Dec 05 20:58:14 crc kubenswrapper[4747]: I1205 20:58:14.936952 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5z7dj" Dec 05 20:58:14 crc kubenswrapper[4747]: I1205 20:58:14.939865 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-lgxh8"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.000231 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-lcwxh"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.001284 4747 util.go:30] "No sandbox for pod can be found. 
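
The sequence at 20:58:06-20:58:08 is a textbook liveness-driven restart: the HTTP probe to 127.0.0.1:8798/health gets connection refused, the kubelet kills the container with the pod's grace period (600s here), PLEG reports ContainerDied, and a replacement is started. A minimal HTTP prober in the same spirit — the threshold, interval, and timeout values below are illustrative, not the pod's actual probe settings:

```go
package prober

import (
	"fmt"
	"net/http"
	"time"
)

// probeOnce performs one HTTP liveness check; any dial error (such as
// the "connection refused" above) or non-2xx status counts as failure.
func probeOnce(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. dial tcp 127.0.0.1:8798: connect: connection refused
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 300 {
		return fmt.Errorf("unhealthy status %d", resp.StatusCode)
	}
	return nil
}

// probeLoop checks every interval and returns once `threshold`
// consecutive probes fail; the caller then kills the container with
// the pod's grace period (600s in the log) and restarts it.
func probeLoop(url string, threshold int, interval time.Duration) {
	fails := 0
	for fails < threshold {
		if err := probeOnce(url); err != nil {
			fails++
		} else {
			fails = 0 // any success resets the consecutive-failure count
		}
		time.Sleep(interval)
	}
	fmt.Println("liveness failed; container will be restarted")
}
```
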
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lcwxh" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.007891 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-zqpl8" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.019767 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-lcwxh"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.041639 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-6qc7c"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.042296 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jft42\" (UniqueName: \"kubernetes.io/projected/376f6574-b930-4baa-98c4-d4b47f3b7e76-kube-api-access-jft42\") pod \"barbican-operator-controller-manager-7d9dfd778-p9kpn\" (UID: \"376f6574-b930-4baa-98c4-d4b47f3b7e76\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-p9kpn" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.042375 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjz7r\" (UniqueName: \"kubernetes.io/projected/4fa9efa6-f9bd-47d1-b27c-701f742537f8-kube-api-access-cjz7r\") pod \"cinder-operator-controller-manager-859b6ccc6-lgxh8\" (UID: \"4fa9efa6-f9bd-47d1-b27c-701f742537f8\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lgxh8" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.042807 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6qc7c" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.046310 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-95rzv" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.048384 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-8c9z4"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.049408 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-8c9z4" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.052541 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-wjv7h" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.062637 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tw5n8"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.063958 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tw5n8" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.070160 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qvtz7" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.078188 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-8c9z4"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.095808 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-6qc7c"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.124746 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tw5n8"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.143374 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh4nh\" (UniqueName: \"kubernetes.io/projected/ff9bc6be-0b60-4c4b-a692-5de429d22c0d-kube-api-access-kh4nh\") pod \"glance-operator-controller-manager-77987cd8cd-6qc7c\" (UID: \"ff9bc6be-0b60-4c4b-a692-5de429d22c0d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6qc7c" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.143443 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jft42\" (UniqueName: \"kubernetes.io/projected/376f6574-b930-4baa-98c4-d4b47f3b7e76-kube-api-access-jft42\") pod \"barbican-operator-controller-manager-7d9dfd778-p9kpn\" (UID: \"376f6574-b930-4baa-98c4-d4b47f3b7e76\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-p9kpn" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.143490 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm68m\" (UniqueName: \"kubernetes.io/projected/af277afe-81a7-4ca7-8e88-dee226ed11a3-kube-api-access-jm68m\") pod \"horizon-operator-controller-manager-68c6d99b8f-tw5n8\" (UID: \"af277afe-81a7-4ca7-8e88-dee226ed11a3\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tw5n8" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.143545 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv45z\" (UniqueName: \"kubernetes.io/projected/6b18b68e-5851-4b54-a953-aa0f9151f191-kube-api-access-lv45z\") pod \"designate-operator-controller-manager-78b4bc895b-lcwxh\" (UID: \"6b18b68e-5851-4b54-a953-aa0f9151f191\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lcwxh" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.143572 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjz7r\" (UniqueName: \"kubernetes.io/projected/4fa9efa6-f9bd-47d1-b27c-701f742537f8-kube-api-access-cjz7r\") pod \"cinder-operator-controller-manager-859b6ccc6-lgxh8\" (UID: \"4fa9efa6-f9bd-47d1-b27c-701f742537f8\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lgxh8" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.143636 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jblhx\" (UniqueName: \"kubernetes.io/projected/60cd02a8-1ab2-4037-b5d8-e479875ce1db-kube-api-access-jblhx\") pod 
\"heat-operator-controller-manager-5f64f6f8bb-8c9z4\" (UID: \"60cd02a8-1ab2-4037-b5d8-e479875ce1db\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-8c9z4" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.156711 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-llksb"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.191869 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.192090 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-llksb" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.195075 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-llksb"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.195185 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.196669 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jft42\" (UniqueName: \"kubernetes.io/projected/376f6574-b930-4baa-98c4-d4b47f3b7e76-kube-api-access-jft42\") pod \"barbican-operator-controller-manager-7d9dfd778-p9kpn\" (UID: \"376f6574-b930-4baa-98c4-d4b47f3b7e76\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-p9kpn" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.198692 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-x4hq8" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.198992 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-rw6vc" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.199186 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.202763 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjz7r\" (UniqueName: \"kubernetes.io/projected/4fa9efa6-f9bd-47d1-b27c-701f742537f8-kube-api-access-cjz7r\") pod \"cinder-operator-controller-manager-859b6ccc6-lgxh8\" (UID: \"4fa9efa6-f9bd-47d1-b27c-701f742537f8\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lgxh8" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.248810 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm68m\" (UniqueName: \"kubernetes.io/projected/af277afe-81a7-4ca7-8e88-dee226ed11a3-kube-api-access-jm68m\") pod \"horizon-operator-controller-manager-68c6d99b8f-tw5n8\" (UID: \"af277afe-81a7-4ca7-8e88-dee226ed11a3\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tw5n8" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.248896 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv45z\" (UniqueName: \"kubernetes.io/projected/6b18b68e-5851-4b54-a953-aa0f9151f191-kube-api-access-lv45z\") pod \"designate-operator-controller-manager-78b4bc895b-lcwxh\" (UID: \"6b18b68e-5851-4b54-a953-aa0f9151f191\") " 
pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lcwxh" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.248957 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jblhx\" (UniqueName: \"kubernetes.io/projected/60cd02a8-1ab2-4037-b5d8-e479875ce1db-kube-api-access-jblhx\") pod \"heat-operator-controller-manager-5f64f6f8bb-8c9z4\" (UID: \"60cd02a8-1ab2-4037-b5d8-e479875ce1db\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-8c9z4" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.248990 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh4nh\" (UniqueName: \"kubernetes.io/projected/ff9bc6be-0b60-4c4b-a692-5de429d22c0d-kube-api-access-kh4nh\") pod \"glance-operator-controller-manager-77987cd8cd-6qc7c\" (UID: \"ff9bc6be-0b60-4c4b-a692-5de429d22c0d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6qc7c" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.250294 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.281631 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-s2xls"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.283423 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s2xls" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.283427 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv45z\" (UniqueName: \"kubernetes.io/projected/6b18b68e-5851-4b54-a953-aa0f9151f191-kube-api-access-lv45z\") pod \"designate-operator-controller-manager-78b4bc895b-lcwxh\" (UID: \"6b18b68e-5851-4b54-a953-aa0f9151f191\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lcwxh" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.290019 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-qmlcx" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.298171 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-p9kpn" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.306237 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh4nh\" (UniqueName: \"kubernetes.io/projected/ff9bc6be-0b60-4c4b-a692-5de429d22c0d-kube-api-access-kh4nh\") pod \"glance-operator-controller-manager-77987cd8cd-6qc7c\" (UID: \"ff9bc6be-0b60-4c4b-a692-5de429d22c0d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6qc7c" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.312329 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm68m\" (UniqueName: \"kubernetes.io/projected/af277afe-81a7-4ca7-8e88-dee226ed11a3-kube-api-access-jm68m\") pod \"horizon-operator-controller-manager-68c6d99b8f-tw5n8\" (UID: \"af277afe-81a7-4ca7-8e88-dee226ed11a3\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tw5n8" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.317680 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lgxh8" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.332971 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lcwxh" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.351144 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7t9v\" (UniqueName: \"kubernetes.io/projected/4157fb94-c6ec-4eba-9020-50efd318640f-kube-api-access-m7t9v\") pod \"keystone-operator-controller-manager-7765d96ddf-s2xls\" (UID: \"4157fb94-c6ec-4eba-9020-50efd318640f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s2xls" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.351236 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6m99\" (UniqueName: \"kubernetes.io/projected/0656c174-0497-42ca-9abe-b7c50e82bdec-kube-api-access-h6m99\") pod \"ironic-operator-controller-manager-6c548fd776-llksb\" (UID: \"0656c174-0497-42ca-9abe-b7c50e82bdec\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-llksb" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.351286 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtppp\" (UniqueName: \"kubernetes.io/projected/2cb7e739-bfdd-4785-98bc-88c011e1f703-kube-api-access-rtppp\") pod \"infra-operator-controller-manager-57548d458d-n9bsq\" (UID: \"2cb7e739-bfdd-4785-98bc-88c011e1f703\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.351324 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert\") pod \"infra-operator-controller-manager-57548d458d-n9bsq\" (UID: \"2cb7e739-bfdd-4785-98bc-88c011e1f703\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.354348 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-s2xls"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.370165 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6qc7c" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.371421 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jblhx\" (UniqueName: \"kubernetes.io/projected/60cd02a8-1ab2-4037-b5d8-e479875ce1db-kube-api-access-jblhx\") pod \"heat-operator-controller-manager-5f64f6f8bb-8c9z4\" (UID: \"60cd02a8-1ab2-4037-b5d8-e479875ce1db\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-8c9z4" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.384139 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-8c9z4" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.396326 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tw5n8" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.399662 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-7lrth"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.400901 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-7lrth" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.409466 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-4skm7" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.447787 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lc7bg"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.449295 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lc7bg" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.456245 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7t9v\" (UniqueName: \"kubernetes.io/projected/4157fb94-c6ec-4eba-9020-50efd318640f-kube-api-access-m7t9v\") pod \"keystone-operator-controller-manager-7765d96ddf-s2xls\" (UID: \"4157fb94-c6ec-4eba-9020-50efd318640f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s2xls" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.456314 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6m99\" (UniqueName: \"kubernetes.io/projected/0656c174-0497-42ca-9abe-b7c50e82bdec-kube-api-access-h6m99\") pod \"ironic-operator-controller-manager-6c548fd776-llksb\" (UID: \"0656c174-0497-42ca-9abe-b7c50e82bdec\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-llksb" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.456352 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtppp\" (UniqueName: \"kubernetes.io/projected/2cb7e739-bfdd-4785-98bc-88c011e1f703-kube-api-access-rtppp\") pod \"infra-operator-controller-manager-57548d458d-n9bsq\" (UID: \"2cb7e739-bfdd-4785-98bc-88c011e1f703\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.456388 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert\") pod \"infra-operator-controller-manager-57548d458d-n9bsq\" (UID: \"2cb7e739-bfdd-4785-98bc-88c011e1f703\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" Dec 05 20:58:15 crc kubenswrapper[4747]: E1205 20:58:15.456514 4747 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:58:15 crc kubenswrapper[4747]: E1205 20:58:15.456561 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert podName:2cb7e739-bfdd-4785-98bc-88c011e1f703 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:15.956542532 +0000 UTC m=+966.423850020 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert") pod "infra-operator-controller-manager-57548d458d-n9bsq" (UID: "2cb7e739-bfdd-4785-98bc-88c011e1f703") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.474718 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-7lrth"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.481449 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-f2mnj" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.490185 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f7fp2"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.491608 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f7fp2" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.497529 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lc7bg"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.498789 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-66bg9" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.500420 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6m99\" (UniqueName: \"kubernetes.io/projected/0656c174-0497-42ca-9abe-b7c50e82bdec-kube-api-access-h6m99\") pod \"ironic-operator-controller-manager-6c548fd776-llksb\" (UID: \"0656c174-0497-42ca-9abe-b7c50e82bdec\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-llksb" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.508374 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtppp\" (UniqueName: \"kubernetes.io/projected/2cb7e739-bfdd-4785-98bc-88c011e1f703-kube-api-access-rtppp\") pod \"infra-operator-controller-manager-57548d458d-n9bsq\" (UID: \"2cb7e739-bfdd-4785-98bc-88c011e1f703\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.509463 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7t9v\" (UniqueName: \"kubernetes.io/projected/4157fb94-c6ec-4eba-9020-50efd318640f-kube-api-access-m7t9v\") pod \"keystone-operator-controller-manager-7765d96ddf-s2xls\" (UID: \"4157fb94-c6ec-4eba-9020-50efd318640f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s2xls" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.529896 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f7fp2"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.557324 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-458bm\" (UniqueName: \"kubernetes.io/projected/54d11a3d-e575-4a91-8856-97ab3f4adb1c-kube-api-access-458bm\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-f7fp2\" (UID: \"54d11a3d-e575-4a91-8856-97ab3f4adb1c\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f7fp2" Dec 05 
20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.557418 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6msp\" (UniqueName: \"kubernetes.io/projected/0d3e833b-6977-47a1-8ba3-b09d093f303c-kube-api-access-z6msp\") pod \"manila-operator-controller-manager-7c79b5df47-7lrth\" (UID: \"0d3e833b-6977-47a1-8ba3-b09d093f303c\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-7lrth" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.557451 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frkkz\" (UniqueName: \"kubernetes.io/projected/391b5a9d-822c-4980-9eb0-376dd5d44126-kube-api-access-frkkz\") pod \"mariadb-operator-controller-manager-56bbcc9d85-lc7bg\" (UID: \"391b5a9d-822c-4980-9eb0-376dd5d44126\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lc7bg" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.561511 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.563959 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.567364 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-llksb" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.568806 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-m4rck" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.569482 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-7k88v"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.578855 4747 util.go:30] "No sandbox for pod can be found. 
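
The failed cert mount at 20:58:15 above is not fatal: the infra-operator webhook Secret simply has not been created yet (the operators were just installed), so the volume manager parks the operation and schedules a retry, starting at 500ms and backing off until the secret appears. The retry discipline looks roughly like this — a hand-rolled sketch of the shape, not the real nestedpendingoperations code:

```go
package volretry

import "time"

// retryWithBackoff keeps attempting op, doubling the delay after each
// failure (500ms, 1s, 2s, ...) up to a cap, matching the
// "durationBeforeRetry 500ms" line in the log.
func retryWithBackoff(op func() error) {
	delay := 500 * time.Millisecond
	const maxDelay = 2 * time.Minute
	for {
		if err := op(); err == nil {
			return // e.g. the secret finally exists and mounts cleanly
		}
		time.Sleep(delay) // the "No retries permitted until ..." window
		if delay *= 2; delay > maxDelay {
			delay = maxDelay
		}
	}
}
```
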
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7k88v" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.589572 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-7k88v"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.589666 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.594816 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-f8wf5" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.662935 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr2nj\" (UniqueName: \"kubernetes.io/projected/3659825b-dd9f-40b9-a17a-3c931805fe9c-kube-api-access-rr2nj\") pod \"nova-operator-controller-manager-697bc559fc-42pxn\" (UID: \"3659825b-dd9f-40b9-a17a-3c931805fe9c\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.662975 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6msp\" (UniqueName: \"kubernetes.io/projected/0d3e833b-6977-47a1-8ba3-b09d093f303c-kube-api-access-z6msp\") pod \"manila-operator-controller-manager-7c79b5df47-7lrth\" (UID: \"0d3e833b-6977-47a1-8ba3-b09d093f303c\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-7lrth" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.663006 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frkkz\" (UniqueName: \"kubernetes.io/projected/391b5a9d-822c-4980-9eb0-376dd5d44126-kube-api-access-frkkz\") pod \"mariadb-operator-controller-manager-56bbcc9d85-lc7bg\" (UID: \"391b5a9d-822c-4980-9eb0-376dd5d44126\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lc7bg" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.663034 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgnh8\" (UniqueName: \"kubernetes.io/projected/cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd-kube-api-access-vgnh8\") pod \"octavia-operator-controller-manager-998648c74-7k88v\" (UID: \"cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-7k88v" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.663073 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-458bm\" (UniqueName: \"kubernetes.io/projected/54d11a3d-e575-4a91-8856-97ab3f4adb1c-kube-api-access-458bm\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-f7fp2\" (UID: \"54d11a3d-e575-4a91-8856-97ab3f4adb1c\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f7fp2" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.680221 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.702048 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frkkz\" (UniqueName: \"kubernetes.io/projected/391b5a9d-822c-4980-9eb0-376dd5d44126-kube-api-access-frkkz\") pod 
\"mariadb-operator-controller-manager-56bbcc9d85-lc7bg\" (UID: \"391b5a9d-822c-4980-9eb0-376dd5d44126\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lc7bg" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.702367 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s2xls" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.702628 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6msp\" (UniqueName: \"kubernetes.io/projected/0d3e833b-6977-47a1-8ba3-b09d093f303c-kube-api-access-z6msp\") pod \"manila-operator-controller-manager-7c79b5df47-7lrth\" (UID: \"0d3e833b-6977-47a1-8ba3-b09d093f303c\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-7lrth" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.707235 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-kz2bv"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.725627 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kz2bv" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.731554 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.734890 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-h2zsh" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.735812 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.741643 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-m9jdj" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.750093 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.750403 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-7lrth" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.757044 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-kz2bv"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.765898 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-8ltc2"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.768380 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8ltc2" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.770376 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr2nj\" (UniqueName: \"kubernetes.io/projected/3659825b-dd9f-40b9-a17a-3c931805fe9c-kube-api-access-rr2nj\") pod \"nova-operator-controller-manager-697bc559fc-42pxn\" (UID: \"3659825b-dd9f-40b9-a17a-3c931805fe9c\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.770511 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7xsr\" (UniqueName: \"kubernetes.io/projected/cb5a89d9-1303-4cbc-9641-8be514157ed0-kube-api-access-l7xsr\") pod \"ovn-operator-controller-manager-b6456fdb6-kz2bv\" (UID: \"cb5a89d9-1303-4cbc-9641-8be514157ed0\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kz2bv" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.770626 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgnh8\" (UniqueName: \"kubernetes.io/projected/cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd-kube-api-access-vgnh8\") pod \"octavia-operator-controller-manager-998648c74-7k88v\" (UID: \"cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-7k88v" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.770666 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smwdc\" (UniqueName: \"kubernetes.io/projected/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-kube-api-access-smwdc\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5dbwbl\" (UID: \"9438f8d5-2c04-4bd1-9e4b-cc29d4565085\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.770767 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5dbwbl\" (UID: \"9438f8d5-2c04-4bd1-9e4b-cc29d4565085\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.773453 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-458bm\" (UniqueName: \"kubernetes.io/projected/54d11a3d-e575-4a91-8856-97ab3f4adb1c-kube-api-access-458bm\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-f7fp2\" (UID: \"54d11a3d-e575-4a91-8856-97ab3f4adb1c\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f7fp2" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.773739 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-zvbfh" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.792672 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.794117 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.797137 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-mcql4" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.799842 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-8ltc2"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.815506 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.816973 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg"] Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.821205 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.822470 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgnh8\" (UniqueName: \"kubernetes.io/projected/cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd-kube-api-access-vgnh8\") pod \"octavia-operator-controller-manager-998648c74-7k88v\" (UID: \"cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-7k88v" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.824388 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-9nv4c" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.825098 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr2nj\" (UniqueName: \"kubernetes.io/projected/3659825b-dd9f-40b9-a17a-3c931805fe9c-kube-api-access-rr2nj\") pod \"nova-operator-controller-manager-697bc559fc-42pxn\" (UID: \"3659825b-dd9f-40b9-a17a-3c931805fe9c\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.865481 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lc7bg" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.876973 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5dbwbl\" (UID: \"9438f8d5-2c04-4bd1-9e4b-cc29d4565085\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.877129 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmfrc\" (UniqueName: \"kubernetes.io/projected/f46e2df3-4e49-4818-91d9-1181d78e46f9-kube-api-access-kmfrc\") pod \"telemetry-operator-controller-manager-76cc84c6bb-hjzcg\" (UID: \"f46e2df3-4e49-4818-91d9-1181d78e46f9\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.877191 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wxbt\" (UniqueName: \"kubernetes.io/projected/44d3f632-8fb0-4763-adaa-0f54fdb8386f-kube-api-access-5wxbt\") pod \"swift-operator-controller-manager-5f8c65bbfc-ml4vm\" (UID: \"44d3f632-8fb0-4763-adaa-0f54fdb8386f\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.877225 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbhw2\" (UniqueName: \"kubernetes.io/projected/fe32c200-1897-4d84-9eeb-ec5c5e7b2cd4-kube-api-access-xbhw2\") pod \"placement-operator-controller-manager-78f8948974-8ltc2\" (UID: \"fe32c200-1897-4d84-9eeb-ec5c5e7b2cd4\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-8ltc2" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.877271 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7xsr\" (UniqueName: \"kubernetes.io/projected/cb5a89d9-1303-4cbc-9641-8be514157ed0-kube-api-access-l7xsr\") pod \"ovn-operator-controller-manager-b6456fdb6-kz2bv\" (UID: \"cb5a89d9-1303-4cbc-9641-8be514157ed0\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kz2bv" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.877519 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smwdc\" (UniqueName: \"kubernetes.io/projected/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-kube-api-access-smwdc\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5dbwbl\" (UID: \"9438f8d5-2c04-4bd1-9e4b-cc29d4565085\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" Dec 05 20:58:15 crc kubenswrapper[4747]: E1205 20:58:15.879140 4747 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:58:15 crc kubenswrapper[4747]: E1205 20:58:15.879228 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert podName:9438f8d5-2c04-4bd1-9e4b-cc29d4565085 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:16.379191928 +0000 UTC m=+966.846499416 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" (UID: "9438f8d5-2c04-4bd1-9e4b-cc29d4565085") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.909290 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7xsr\" (UniqueName: \"kubernetes.io/projected/cb5a89d9-1303-4cbc-9641-8be514157ed0-kube-api-access-l7xsr\") pod \"ovn-operator-controller-manager-b6456fdb6-kz2bv\" (UID: \"cb5a89d9-1303-4cbc-9641-8be514157ed0\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kz2bv" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.916019 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smwdc\" (UniqueName: \"kubernetes.io/projected/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-kube-api-access-smwdc\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5dbwbl\" (UID: \"9438f8d5-2c04-4bd1-9e4b-cc29d4565085\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.917128 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f7fp2" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.937952 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.972011 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kz2bv" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.978683 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wxbt\" (UniqueName: \"kubernetes.io/projected/44d3f632-8fb0-4763-adaa-0f54fdb8386f-kube-api-access-5wxbt\") pod \"swift-operator-controller-manager-5f8c65bbfc-ml4vm\" (UID: \"44d3f632-8fb0-4763-adaa-0f54fdb8386f\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.978714 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbhw2\" (UniqueName: \"kubernetes.io/projected/fe32c200-1897-4d84-9eeb-ec5c5e7b2cd4-kube-api-access-xbhw2\") pod \"placement-operator-controller-manager-78f8948974-8ltc2\" (UID: \"fe32c200-1897-4d84-9eeb-ec5c5e7b2cd4\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-8ltc2" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.978822 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert\") pod \"infra-operator-controller-manager-57548d458d-n9bsq\" (UID: \"2cb7e739-bfdd-4785-98bc-88c011e1f703\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.978848 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmfrc\" (UniqueName: \"kubernetes.io/projected/f46e2df3-4e49-4818-91d9-1181d78e46f9-kube-api-access-kmfrc\") pod \"telemetry-operator-controller-manager-76cc84c6bb-hjzcg\" (UID: \"f46e2df3-4e49-4818-91d9-1181d78e46f9\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg" Dec 05 20:58:15 crc kubenswrapper[4747]: E1205 20:58:15.979667 4747 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:58:15 crc kubenswrapper[4747]: E1205 20:58:15.979722 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert podName:2cb7e739-bfdd-4785-98bc-88c011e1f703 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:16.979705086 +0000 UTC m=+967.447012574 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert") pod "infra-operator-controller-manager-57548d458d-n9bsq" (UID: "2cb7e739-bfdd-4785-98bc-88c011e1f703") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:58:15 crc kubenswrapper[4747]: I1205 20:58:15.991921 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7k88v" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.011791 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbhw2\" (UniqueName: \"kubernetes.io/projected/fe32c200-1897-4d84-9eeb-ec5c5e7b2cd4-kube-api-access-xbhw2\") pod \"placement-operator-controller-manager-78f8948974-8ltc2\" (UID: \"fe32c200-1897-4d84-9eeb-ec5c5e7b2cd4\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-8ltc2" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.019351 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wxbt\" (UniqueName: \"kubernetes.io/projected/44d3f632-8fb0-4763-adaa-0f54fdb8386f-kube-api-access-5wxbt\") pod \"swift-operator-controller-manager-5f8c65bbfc-ml4vm\" (UID: \"44d3f632-8fb0-4763-adaa-0f54fdb8386f\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.019381 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmfrc\" (UniqueName: \"kubernetes.io/projected/f46e2df3-4e49-4818-91d9-1181d78e46f9-kube-api-access-kmfrc\") pod \"telemetry-operator-controller-manager-76cc84c6bb-hjzcg\" (UID: \"f46e2df3-4e49-4818-91d9-1181d78e46f9\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.046726 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.049972 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg"] Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.050020 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs"] Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.051449 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs"] Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.051505 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.051563 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt"] Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.053876 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-27bzz" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.059001 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt"] Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.059043 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9"] Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.060528 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.066198 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.066910 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-hbvnl" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.077337 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.081839 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-9wm52" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.082118 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.082229 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.082328 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9"] Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.102593 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56"] Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.103710 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.115100 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-7h5zn" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.130532 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56"] Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.187166 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpk54\" (UniqueName: \"kubernetes.io/projected/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-kube-api-access-mpk54\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.187211 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqbtv\" (UniqueName: \"kubernetes.io/projected/0fbf81b5-1f99-402f-b486-0801f63077e4-kube-api-access-cqbtv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7mb56\" (UID: \"0fbf81b5-1f99-402f-b486-0801f63077e4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.187240 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx7ps\" (UniqueName: \"kubernetes.io/projected/be975bb9-2086-4fde-8c71-9c7f56eaf8ea-kube-api-access-rx7ps\") pod \"test-operator-controller-manager-5854674fcc-dvpxs\" (UID: \"be975bb9-2086-4fde-8c71-9c7f56eaf8ea\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.187273 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh6bs\" (UniqueName: \"kubernetes.io/projected/4ac7a709-1eb7-4f8f-ac46-3a404b2171a7-kube-api-access-qh6bs\") pod \"watcher-operator-controller-manager-769dc69bc-8mfrt\" (UID: \"4ac7a709-1eb7-4f8f-ac46-3a404b2171a7\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.187380 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.187412 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.288386 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh6bs\" (UniqueName: 
\"kubernetes.io/projected/4ac7a709-1eb7-4f8f-ac46-3a404b2171a7-kube-api-access-qh6bs\") pod \"watcher-operator-controller-manager-769dc69bc-8mfrt\" (UID: \"4ac7a709-1eb7-4f8f-ac46-3a404b2171a7\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.288491 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.288511 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.288538 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpk54\" (UniqueName: \"kubernetes.io/projected/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-kube-api-access-mpk54\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.288555 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqbtv\" (UniqueName: \"kubernetes.io/projected/0fbf81b5-1f99-402f-b486-0801f63077e4-kube-api-access-cqbtv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7mb56\" (UID: \"0fbf81b5-1f99-402f-b486-0801f63077e4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.288574 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx7ps\" (UniqueName: \"kubernetes.io/projected/be975bb9-2086-4fde-8c71-9c7f56eaf8ea-kube-api-access-rx7ps\") pod \"test-operator-controller-manager-5854674fcc-dvpxs\" (UID: \"be975bb9-2086-4fde-8c71-9c7f56eaf8ea\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs" Dec 05 20:58:16 crc kubenswrapper[4747]: E1205 20:58:16.289061 4747 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:58:16 crc kubenswrapper[4747]: E1205 20:58:16.289101 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs podName:fbdd4eb9-cffd-44b7-9b05-da1328de2fe6 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:16.789087098 +0000 UTC m=+967.256394586 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-rsqv9" (UID: "fbdd4eb9-cffd-44b7-9b05-da1328de2fe6") : secret "metrics-server-cert" not found Dec 05 20:58:16 crc kubenswrapper[4747]: E1205 20:58:16.289229 4747 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:58:16 crc kubenswrapper[4747]: E1205 20:58:16.289250 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs podName:fbdd4eb9-cffd-44b7-9b05-da1328de2fe6 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:16.789243692 +0000 UTC m=+967.256551180 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-rsqv9" (UID: "fbdd4eb9-cffd-44b7-9b05-da1328de2fe6") : secret "webhook-server-cert" not found Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.309335 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8ltc2" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.340107 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh6bs\" (UniqueName: \"kubernetes.io/projected/4ac7a709-1eb7-4f8f-ac46-3a404b2171a7-kube-api-access-qh6bs\") pod \"watcher-operator-controller-manager-769dc69bc-8mfrt\" (UID: \"4ac7a709-1eb7-4f8f-ac46-3a404b2171a7\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.340157 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx7ps\" (UniqueName: \"kubernetes.io/projected/be975bb9-2086-4fde-8c71-9c7f56eaf8ea-kube-api-access-rx7ps\") pod \"test-operator-controller-manager-5854674fcc-dvpxs\" (UID: \"be975bb9-2086-4fde-8c71-9c7f56eaf8ea\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.340631 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpk54\" (UniqueName: \"kubernetes.io/projected/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-kube-api-access-mpk54\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.354489 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqbtv\" (UniqueName: \"kubernetes.io/projected/0fbf81b5-1f99-402f-b486-0801f63077e4-kube-api-access-cqbtv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7mb56\" (UID: \"0fbf81b5-1f99-402f-b486-0801f63077e4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.365731 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.389687 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5dbwbl\" (UID: \"9438f8d5-2c04-4bd1-9e4b-cc29d4565085\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" Dec 05 20:58:16 crc kubenswrapper[4747]: E1205 20:58:16.389861 4747 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:58:16 crc kubenswrapper[4747]: E1205 20:58:16.389929 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert podName:9438f8d5-2c04-4bd1-9e4b-cc29d4565085 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:17.389913013 +0000 UTC m=+967.857220501 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" (UID: "9438f8d5-2c04-4bd1-9e4b-cc29d4565085") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.399266 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-lgxh8"] Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.409489 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-p9kpn"] Dec 05 20:58:16 crc kubenswrapper[4747]: W1205 20:58:16.460755 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fa9efa6_f9bd_47d1_b27c_701f742537f8.slice/crio-21774b30dfbb3c1ecbd6d8c2fd273551986bbd440ec8f6e27943cfe50799647a WatchSource:0}: Error finding container 21774b30dfbb3c1ecbd6d8c2fd273551986bbd440ec8f6e27943cfe50799647a: Status 404 returned error can't find the container with id 21774b30dfbb3c1ecbd6d8c2fd273551986bbd440ec8f6e27943cfe50799647a Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.465802 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.507439 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.800725 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.801070 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:16 crc kubenswrapper[4747]: E1205 20:58:16.800899 4747 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:58:16 crc kubenswrapper[4747]: E1205 20:58:16.801164 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs podName:fbdd4eb9-cffd-44b7-9b05-da1328de2fe6 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:17.801137775 +0000 UTC m=+968.268445263 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-rsqv9" (UID: "fbdd4eb9-cffd-44b7-9b05-da1328de2fe6") : secret "metrics-server-cert" not found Dec 05 20:58:16 crc kubenswrapper[4747]: E1205 20:58:16.801217 4747 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:58:16 crc kubenswrapper[4747]: E1205 20:58:16.801254 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs podName:fbdd4eb9-cffd-44b7-9b05-da1328de2fe6 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:17.801242667 +0000 UTC m=+968.268550155 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-rsqv9" (UID: "fbdd4eb9-cffd-44b7-9b05-da1328de2fe6") : secret "webhook-server-cert" not found Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.818163 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-7lrth"] Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.833853 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-llksb"] Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.856341 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-lcwxh"] Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.870062 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-6qc7c"] Dec 05 20:58:16 crc kubenswrapper[4747]: W1205 20:58:16.870530 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b18b68e_5851_4b54_a953_aa0f9151f191.slice/crio-391fe0ccecef3aafe4306bf33eacd8bc21c740ca53b523d6d19c91922fd6ef0c WatchSource:0}: Error finding container 391fe0ccecef3aafe4306bf33eacd8bc21c740ca53b523d6d19c91922fd6ef0c: Status 404 returned error can't find the container with id 391fe0ccecef3aafe4306bf33eacd8bc21c740ca53b523d6d19c91922fd6ef0c Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.880322 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tw5n8"] Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.888280 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-8c9z4"] Dec 05 20:58:16 crc kubenswrapper[4747]: I1205 20:58:16.984480 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f7fp2"] Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.001269 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-s2xls"] Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.007776 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert\") pod \"infra-operator-controller-manager-57548d458d-n9bsq\" (UID: \"2cb7e739-bfdd-4785-98bc-88c011e1f703\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.007944 4747 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.008000 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert podName:2cb7e739-bfdd-4785-98bc-88c011e1f703 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:19.007983899 +0000 UTC m=+969.475291387 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert") pod "infra-operator-controller-manager-57548d458d-n9bsq" (UID: "2cb7e739-bfdd-4785-98bc-88c011e1f703") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.027591 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lc7bg"] Dec 05 20:58:17 crc kubenswrapper[4747]: W1205 20:58:17.027763 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod391b5a9d_822c_4980_9eb0_376dd5d44126.slice/crio-0effe9eac3bf6eb10cbc0be842482bee6660728d20e06e548492f5d4f3e22cbc WatchSource:0}: Error finding container 0effe9eac3bf6eb10cbc0be842482bee6660728d20e06e548492f5d4f3e22cbc: Status 404 returned error can't find the container with id 0effe9eac3bf6eb10cbc0be842482bee6660728d20e06e548492f5d4f3e22cbc Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.138968 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-kz2bv"] Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.149805 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-8ltc2"] Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.161794 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-7k88v"] Dec 05 20:58:17 crc kubenswrapper[4747]: W1205 20:58:17.166353 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe32c200_1897_4d84_9eeb_ec5c5e7b2cd4.slice/crio-1151a7f06909139fd3c2af15df5540080d01c679c8925d5ded0c31da55504f36 WatchSource:0}: Error finding container 1151a7f06909139fd3c2af15df5540080d01c679c8925d5ded0c31da55504f36: Status 404 returned error can't find the container with id 1151a7f06909139fd3c2af15df5540080d01c679c8925d5ded0c31da55504f36 Dec 05 20:58:17 crc kubenswrapper[4747]: W1205 20:58:17.168529 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe975bb9_2086_4fde_8c71_9c7f56eaf8ea.slice/crio-7da4663fe33181318b92815d90f9293a0db29a66ccc88a7bb05ea2542c2ab043 WatchSource:0}: Error finding container 7da4663fe33181318b92815d90f9293a0db29a66ccc88a7bb05ea2542c2ab043: Status 404 returned error can't find the container with id 7da4663fe33181318b92815d90f9293a0db29a66ccc88a7bb05ea2542c2ab043 Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.169388 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56"] Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.170996 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rx7ps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-dvpxs_openstack-operators(be975bb9-2086-4fde-8c71-9c7f56eaf8ea): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.174050 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg"] Dec 05 20:58:17 crc kubenswrapper[4747]: W1205 20:58:17.174790 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ac7a709_1eb7_4f8f_ac46_3a404b2171a7.slice/crio-7c7cc7be01ad9e1e2c631d9a01b27a75bcce1824e61b5eeef550368aba6f126c WatchSource:0}: Error finding container 7c7cc7be01ad9e1e2c631d9a01b27a75bcce1824e61b5eeef550368aba6f126c: Status 404 returned error can't find the container with id 7c7cc7be01ad9e1e2c631d9a01b27a75bcce1824e61b5eeef550368aba6f126c Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.175510 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} 
{} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rx7ps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-dvpxs_openstack-operators(be975bb9-2086-4fde-8c71-9c7f56eaf8ea): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.176848 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs" podUID="be975bb9-2086-4fde-8c71-9c7f56eaf8ea" Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.179403 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt"] Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.179919 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qh6bs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-8mfrt_openstack-operators(4ac7a709-1eb7-4f8f-ac46-3a404b2171a7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.181560 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vgnh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-7k88v_openstack-operators(cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:58:17 crc 
kubenswrapper[4747]: E1205 20:58:17.185321 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qh6bs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-8mfrt_openstack-operators(4ac7a709-1eb7-4f8f-ac46-3a404b2171a7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.185379 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs"] Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.186629 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vgnh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
octavia-operator-controller-manager-998648c74-7k88v_openstack-operators(cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.188272 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7k88v" podUID="cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.188271 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt" podUID="4ac7a709-1eb7-4f8f-ac46-3a404b2171a7" Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.190164 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm"] Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.190418 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rr2nj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-42pxn_openstack-operators(3659825b-dd9f-40b9-a17a-3c931805fe9c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.192334 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rr2nj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-42pxn_openstack-operators(3659825b-dd9f-40b9-a17a-3c931805fe9c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.193899 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn" podUID="3659825b-dd9f-40b9-a17a-3c931805fe9c" Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.197695 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn"] Dec 05 20:58:17 crc kubenswrapper[4747]: W1205 20:58:17.200760 4747 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44d3f632_8fb0_4763_adaa_0f54fdb8386f.slice/crio-9cab92941f8b08a09541cd4f1fe5afb0964000d5dd48fcc2976feadef601da71 WatchSource:0}: Error finding container 9cab92941f8b08a09541cd4f1fe5afb0964000d5dd48fcc2976feadef601da71: Status 404 returned error can't find the container with id 9cab92941f8b08a09541cd4f1fe5afb0964000d5dd48fcc2976feadef601da71 Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.205216 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5wxbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-ml4vm_openstack-operators(44d3f632-8fb0-4763-adaa-0f54fdb8386f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.218725 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cqbtv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-7mb56_openstack-operators(0fbf81b5-1f99-402f-b486-0801f63077e4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.220245 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56" podUID="0fbf81b5-1f99-402f-b486-0801f63077e4" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.220293 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5wxbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-ml4vm_openstack-operators(44d3f632-8fb0-4763-adaa-0f54fdb8386f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.220410 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kmfrc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-hjzcg_openstack-operators(f46e2df3-4e49-4818-91d9-1181d78e46f9): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.221908 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm" podUID="44d3f632-8fb0-4763-adaa-0f54fdb8386f" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.227730 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kmfrc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-hjzcg_openstack-operators(f46e2df3-4e49-4818-91d9-1181d78e46f9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.229760 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg" podUID="f46e2df3-4e49-4818-91d9-1181d78e46f9" Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.298072 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56" event={"ID":"0fbf81b5-1f99-402f-b486-0801f63077e4","Type":"ContainerStarted","Data":"9c370ec0ec16b5347444a3a4e6edd108b842e09a1b43d3667c0c61fc8d37ef33"} Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.300415 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56" podUID="0fbf81b5-1f99-402f-b486-0801f63077e4" Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.301924 4747 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg" event={"ID":"f46e2df3-4e49-4818-91d9-1181d78e46f9","Type":"ContainerStarted","Data":"8e84781bb1612270c39bac2370108099bb0ace3c6061dfed4698ccd898abe912"} Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.303399 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-7lrth" event={"ID":"0d3e833b-6977-47a1-8ba3-b09d093f303c","Type":"ContainerStarted","Data":"5346b4cd6c1b2cbba4c5c023f318f166ed51956f210d269685c7b6a3c9b1e074"} Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.304304 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg" podUID="f46e2df3-4e49-4818-91d9-1181d78e46f9" Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.305152 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7k88v" event={"ID":"cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd","Type":"ContainerStarted","Data":"0ea4574d334fa44ed76ddfe616833f87160eab349fe01ec03bed2cb2b88991e9"} Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.306343 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-8c9z4" event={"ID":"60cd02a8-1ab2-4037-b5d8-e479875ce1db","Type":"ContainerStarted","Data":"25475cf0e37f585934b68e1ba7e7dca3a89323a9897dedbe17e0aaf830a3af10"} Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.306464 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7k88v" podUID="cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd" Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.315267 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-llksb" event={"ID":"0656c174-0497-42ca-9abe-b7c50e82bdec","Type":"ContainerStarted","Data":"d02786863d5ccc29d43747407a878b556937aa735d279463ada0670e1ccf9602"} Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.325319 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lcwxh" event={"ID":"6b18b68e-5851-4b54-a953-aa0f9151f191","Type":"ContainerStarted","Data":"391fe0ccecef3aafe4306bf33eacd8bc21c740ca53b523d6d19c91922fd6ef0c"} Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.331328 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-p9kpn" 
event={"ID":"376f6574-b930-4baa-98c4-d4b47f3b7e76","Type":"ContainerStarted","Data":"559b0bb38d05f088b854fd13ab578f2741a005ff19d5c34d59dfbff6d38c81ea"} Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.334100 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6qc7c" event={"ID":"ff9bc6be-0b60-4c4b-a692-5de429d22c0d","Type":"ContainerStarted","Data":"cbab2b95a6797a9df05b18f04b24309ba53920bcec8707bb12348d67e12043c6"} Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.334896 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8ltc2" event={"ID":"fe32c200-1897-4d84-9eeb-ec5c5e7b2cd4","Type":"ContainerStarted","Data":"1151a7f06909139fd3c2af15df5540080d01c679c8925d5ded0c31da55504f36"} Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.335629 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kz2bv" event={"ID":"cb5a89d9-1303-4cbc-9641-8be514157ed0","Type":"ContainerStarted","Data":"e11c0191860ff91c51324f3363d14feadfa65c501367f7c60f02292b318d4859"} Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.336313 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm" event={"ID":"44d3f632-8fb0-4763-adaa-0f54fdb8386f","Type":"ContainerStarted","Data":"9cab92941f8b08a09541cd4f1fe5afb0964000d5dd48fcc2976feadef601da71"} Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.338902 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tw5n8" event={"ID":"af277afe-81a7-4ca7-8e88-dee226ed11a3","Type":"ContainerStarted","Data":"03023c3e11e4bfb369ec83aca1536503169e0454038135797d566e9b30f1bf69"} Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.339317 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm" podUID="44d3f632-8fb0-4763-adaa-0f54fdb8386f" Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.340633 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f7fp2" event={"ID":"54d11a3d-e575-4a91-8856-97ab3f4adb1c","Type":"ContainerStarted","Data":"0227de704643b4d70f2c0b7622ac361fc083374b0a5339add855240a6f04bea4"} Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.341545 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs" event={"ID":"be975bb9-2086-4fde-8c71-9c7f56eaf8ea","Type":"ContainerStarted","Data":"7da4663fe33181318b92815d90f9293a0db29a66ccc88a7bb05ea2542c2ab043"} Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.344434 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lgxh8" 
event={"ID":"4fa9efa6-f9bd-47d1-b27c-701f742537f8","Type":"ContainerStarted","Data":"21774b30dfbb3c1ecbd6d8c2fd273551986bbd440ec8f6e27943cfe50799647a"} Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.345078 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs" podUID="be975bb9-2086-4fde-8c71-9c7f56eaf8ea" Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.349720 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn" event={"ID":"3659825b-dd9f-40b9-a17a-3c931805fe9c","Type":"ContainerStarted","Data":"f9c4f6cb75c59578c88e78025fb91796f3c4b453fd8818a52c11e58e3964c668"} Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.350930 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lc7bg" event={"ID":"391b5a9d-822c-4980-9eb0-376dd5d44126","Type":"ContainerStarted","Data":"0effe9eac3bf6eb10cbc0be842482bee6660728d20e06e548492f5d4f3e22cbc"} Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.351490 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn" podUID="3659825b-dd9f-40b9-a17a-3c931805fe9c" Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.361353 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s2xls" event={"ID":"4157fb94-c6ec-4eba-9020-50efd318640f","Type":"ContainerStarted","Data":"30783401c323e446ffdf8093f4f880486572b3d8cc7aa074c2d20d8108bce968"} Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.362833 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt" event={"ID":"4ac7a709-1eb7-4f8f-ac46-3a404b2171a7","Type":"ContainerStarted","Data":"7c7cc7be01ad9e1e2c631d9a01b27a75bcce1824e61b5eeef550368aba6f126c"} Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.369632 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt" podUID="4ac7a709-1eb7-4f8f-ac46-3a404b2171a7" Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.418238 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5dbwbl\" (UID: \"9438f8d5-2c04-4bd1-9e4b-cc29d4565085\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.419516 4747 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.419603 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert podName:9438f8d5-2c04-4bd1-9e4b-cc29d4565085 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:19.419567829 +0000 UTC m=+969.886875317 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" (UID: "9438f8d5-2c04-4bd1-9e4b-cc29d4565085") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.825817 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:17 crc kubenswrapper[4747]: I1205 20:58:17.825877 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.826030 4747 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.826092 4747 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.826114 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs podName:fbdd4eb9-cffd-44b7-9b05-da1328de2fe6 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:19.826096018 +0000 UTC m=+970.293403506 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-rsqv9" (UID: "fbdd4eb9-cffd-44b7-9b05-da1328de2fe6") : secret "metrics-server-cert" not found Dec 05 20:58:17 crc kubenswrapper[4747]: E1205 20:58:17.826153 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs podName:fbdd4eb9-cffd-44b7-9b05-da1328de2fe6 nodeName:}" failed. 
No retries permitted until 2025-12-05 20:58:19.826134979 +0000 UTC m=+970.293442557 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-rsqv9" (UID: "fbdd4eb9-cffd-44b7-9b05-da1328de2fe6") : secret "webhook-server-cert" not found Dec 05 20:58:18 crc kubenswrapper[4747]: E1205 20:58:18.377143 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56" podUID="0fbf81b5-1f99-402f-b486-0801f63077e4" Dec 05 20:58:18 crc kubenswrapper[4747]: E1205 20:58:18.378981 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7k88v" podUID="cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd" Dec 05 20:58:18 crc kubenswrapper[4747]: E1205 20:58:18.379035 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg" podUID="f46e2df3-4e49-4818-91d9-1181d78e46f9" Dec 05 20:58:18 crc kubenswrapper[4747]: E1205 20:58:18.380567 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt" podUID="4ac7a709-1eb7-4f8f-ac46-3a404b2171a7" Dec 05 20:58:18 crc kubenswrapper[4747]: E1205 20:58:18.380667 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs" podUID="be975bb9-2086-4fde-8c71-9c7f56eaf8ea" Dec 05 20:58:18 crc kubenswrapper[4747]: E1205 20:58:18.380816 4747 pod_workers.go:1301] "Error 
syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn" podUID="3659825b-dd9f-40b9-a17a-3c931805fe9c" Dec 05 20:58:18 crc kubenswrapper[4747]: E1205 20:58:18.381402 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm" podUID="44d3f632-8fb0-4763-adaa-0f54fdb8386f" Dec 05 20:58:19 crc kubenswrapper[4747]: I1205 20:58:19.053133 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert\") pod \"infra-operator-controller-manager-57548d458d-n9bsq\" (UID: \"2cb7e739-bfdd-4785-98bc-88c011e1f703\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" Dec 05 20:58:19 crc kubenswrapper[4747]: E1205 20:58:19.053323 4747 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:58:19 crc kubenswrapper[4747]: E1205 20:58:19.053383 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert podName:2cb7e739-bfdd-4785-98bc-88c011e1f703 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:23.053363738 +0000 UTC m=+973.520671226 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert") pod "infra-operator-controller-manager-57548d458d-n9bsq" (UID: "2cb7e739-bfdd-4785-98bc-88c011e1f703") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:58:19 crc kubenswrapper[4747]: I1205 20:58:19.461566 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5dbwbl\" (UID: \"9438f8d5-2c04-4bd1-9e4b-cc29d4565085\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" Dec 05 20:58:19 crc kubenswrapper[4747]: E1205 20:58:19.461797 4747 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:58:19 crc kubenswrapper[4747]: E1205 20:58:19.461843 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert podName:9438f8d5-2c04-4bd1-9e4b-cc29d4565085 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:23.461828803 +0000 UTC m=+973.929136291 (durationBeforeRetry 4s). 
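The `ErrImagePull: pull QPS exceeded` failures above are the kubelet's own image-pull throttle, not a registry-side error: pulls are admitted through a token-bucket rate limiter sized by the KubeletConfiguration fields registryPullQPS and registryBurst (defaults 5 and 10), so when a few dozen operator pods land on the node at once, everything past the burst is rejected immediately and retried later. A minimal Go sketch of that admission behavior, using golang.org/x/time/rate (the library client-go's flowcontrol limiter wraps); the numbers are the kubelet defaults, assumed rather than read from this node's config:

    package main

    import (
        "fmt"

        "golang.org/x/time/rate"
    )

    func main() {
        // registryPullQPS=5, registryBurst=10 (kubelet defaults, assumed here).
        limiter := rate.NewLimiter(rate.Limit(5), 10)

        // ~30 operator pods asking for images in the same instant: the first
        // 10 (the burst) are admitted; the rest fail immediately, which the
        // kubelet reports as ErrImagePull: pull QPS exceeded.
        admitted, rejected := 0, 0
        for i := 0; i < 30; i++ {
            if limiter.Allow() {
                admitted++
            } else {
                rejected++
            }
        }
        fmt.Printf("admitted=%d rejected=%d\n", admitted, rejected)
    }

Once a start attempt fails with ErrImagePull, the pod worker re-queues it with a growing delay, which is why the same pods reappear above under ImagePullBackOff ("Back-off pulling image ...") instead of issuing a fresh pull on every sync.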
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" (UID: "9438f8d5-2c04-4bd1-9e4b-cc29d4565085") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:58:19 crc kubenswrapper[4747]: I1205 20:58:19.868082 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:19 crc kubenswrapper[4747]: I1205 20:58:19.868133 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:19 crc kubenswrapper[4747]: E1205 20:58:19.868325 4747 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:58:19 crc kubenswrapper[4747]: E1205 20:58:19.868365 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs podName:fbdd4eb9-cffd-44b7-9b05-da1328de2fe6 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:23.868351001 +0000 UTC m=+974.335658489 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-rsqv9" (UID: "fbdd4eb9-cffd-44b7-9b05-da1328de2fe6") : secret "webhook-server-cert" not found Dec 05 20:58:19 crc kubenswrapper[4747]: E1205 20:58:19.868616 4747 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:58:19 crc kubenswrapper[4747]: E1205 20:58:19.868640 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs podName:fbdd4eb9-cffd-44b7-9b05-da1328de2fe6 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:23.868633118 +0000 UTC m=+974.335940606 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-rsqv9" (UID: "fbdd4eb9-cffd-44b7-9b05-da1328de2fe6") : secret "metrics-server-cert" not found Dec 05 20:58:23 crc kubenswrapper[4747]: I1205 20:58:23.133955 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert\") pod \"infra-operator-controller-manager-57548d458d-n9bsq\" (UID: \"2cb7e739-bfdd-4785-98bc-88c011e1f703\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" Dec 05 20:58:23 crc kubenswrapper[4747]: E1205 20:58:23.134159 4747 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:58:23 crc kubenswrapper[4747]: E1205 20:58:23.134614 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert podName:2cb7e739-bfdd-4785-98bc-88c011e1f703 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:31.134596286 +0000 UTC m=+981.601903774 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert") pod "infra-operator-controller-manager-57548d458d-n9bsq" (UID: "2cb7e739-bfdd-4785-98bc-88c011e1f703") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:58:23 crc kubenswrapper[4747]: I1205 20:58:23.541294 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5dbwbl\" (UID: \"9438f8d5-2c04-4bd1-9e4b-cc29d4565085\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" Dec 05 20:58:23 crc kubenswrapper[4747]: E1205 20:58:23.541554 4747 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:58:23 crc kubenswrapper[4747]: E1205 20:58:23.541755 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert podName:9438f8d5-2c04-4bd1-9e4b-cc29d4565085 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:31.541702858 +0000 UTC m=+982.009010346 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" (UID: "9438f8d5-2c04-4bd1-9e4b-cc29d4565085") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:58:23 crc kubenswrapper[4747]: I1205 20:58:23.946497 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:23 crc kubenswrapper[4747]: I1205 20:58:23.946546 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:23 crc kubenswrapper[4747]: E1205 20:58:23.946677 4747 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:58:23 crc kubenswrapper[4747]: E1205 20:58:23.946677 4747 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:58:23 crc kubenswrapper[4747]: E1205 20:58:23.946758 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs podName:fbdd4eb9-cffd-44b7-9b05-da1328de2fe6 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:31.946736451 +0000 UTC m=+982.414044029 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-rsqv9" (UID: "fbdd4eb9-cffd-44b7-9b05-da1328de2fe6") : secret "metrics-server-cert" not found Dec 05 20:58:23 crc kubenswrapper[4747]: E1205 20:58:23.946871 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs podName:fbdd4eb9-cffd-44b7-9b05-da1328de2fe6 nodeName:}" failed. No retries permitted until 2025-12-05 20:58:31.946833363 +0000 UTC m=+982.414140851 (durationBeforeRetry 8s). 
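The cert-volume failures follow a separate retry path: each failed MountVolume.SetUp is re-queued by nestedpendingoperations with an exponentially growing delay, which is exactly the durationBeforeRetry 2s, then 4s, then 8s progression recorded above for the same secret volumes. A short Go sketch of that doubling, with getSecret standing in for the kubelet's secret lookup (the helper name is illustrative, not kubelet source):

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // getSecret stands in for the kubelet's secret fetch; the real failure
    // above is: secret "webhook-server-cert" not found.
    func getSecret(name string) error {
        return errors.New("secret \"" + name + "\" not found")
    }

    func main() {
        delay := 2 * time.Second // doubles after every consecutive failure
        for attempt := 1; attempt <= 3; attempt++ {
            if err := getSecret("webhook-server-cert"); err != nil {
                fmt.Printf("attempt %d: %v; no retries permitted for %v\n",
                    attempt, err, delay)
                delay *= 2 // 2s -> 4s -> 8s, as in the log
                continue
            }
            break
        }
    }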
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-rsqv9" (UID: "fbdd4eb9-cffd-44b7-9b05-da1328de2fe6") : secret "webhook-server-cert" not found Dec 05 20:58:30 crc kubenswrapper[4747]: E1205 20:58:30.008500 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Dec 05 20:58:30 crc kubenswrapper[4747]: E1205 20:58:30.009813 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cjz7r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-lgxh8_openstack-operators(4fa9efa6-f9bd-47d1-b27c-701f742537f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:58:31 crc kubenswrapper[4747]: I1205 20:58:31.170782 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert\") pod \"infra-operator-controller-manager-57548d458d-n9bsq\" (UID: \"2cb7e739-bfdd-4785-98bc-88c011e1f703\") " 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" Dec 05 20:58:31 crc kubenswrapper[4747]: I1205 20:58:31.182160 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cb7e739-bfdd-4785-98bc-88c011e1f703-cert\") pod \"infra-operator-controller-manager-57548d458d-n9bsq\" (UID: \"2cb7e739-bfdd-4785-98bc-88c011e1f703\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" Dec 05 20:58:31 crc kubenswrapper[4747]: I1205 20:58:31.184536 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" Dec 05 20:58:31 crc kubenswrapper[4747]: E1205 20:58:31.219984 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 05 20:58:31 crc kubenswrapper[4747]: E1205 20:58:31.220170 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jm68m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-tw5n8_openstack-operators(af277afe-81a7-4ca7-8e88-dee226ed11a3): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:58:31 crc kubenswrapper[4747]: I1205 20:58:31.577275 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5dbwbl\" (UID: \"9438f8d5-2c04-4bd1-9e4b-cc29d4565085\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" Dec 05 20:58:31 crc kubenswrapper[4747]: I1205 20:58:31.581594 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9438f8d5-2c04-4bd1-9e4b-cc29d4565085-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5dbwbl\" (UID: \"9438f8d5-2c04-4bd1-9e4b-cc29d4565085\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" Dec 05 20:58:31 crc kubenswrapper[4747]: I1205 20:58:31.618892 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" Dec 05 20:58:31 crc kubenswrapper[4747]: E1205 20:58:31.898479 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 05 20:58:31 crc kubenswrapper[4747]: E1205 20:58:31.898750 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l7xsr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-kz2bv_openstack-operators(cb5a89d9-1303-4cbc-9641-8be514157ed0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:58:31 crc kubenswrapper[4747]: I1205 20:58:31.984699 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:31 crc kubenswrapper[4747]: I1205 20:58:31.984754 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:31 crc kubenswrapper[4747]: I1205 20:58:31.988848 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:31 crc kubenswrapper[4747]: I1205 20:58:31.990055 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbdd4eb9-cffd-44b7-9b05-da1328de2fe6-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-rsqv9\" (UID: \"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:32 crc kubenswrapper[4747]: I1205 20:58:32.091428 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:38 crc kubenswrapper[4747]: E1205 20:58:38.586423 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9" Dec 05 20:58:38 crc kubenswrapper[4747]: E1205 20:58:38.587331 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z6msp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-7lrth_openstack-operators(0d3e833b-6977-47a1-8ba3-b09d093f303c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:58:39 crc kubenswrapper[4747]: E1205 20:58:39.688727 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 05 20:58:39 crc kubenswrapper[4747]: E1205 20:58:39.688913 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m7t9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-s2xls_openstack-operators(4157fb94-c6ec-4eba-9020-50efd318640f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:58:44 crc kubenswrapper[4747]: E1205 20:58:44.218776 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 05 20:58:44 crc kubenswrapper[4747]: E1205 20:58:44.219752 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cqbtv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-7mb56_openstack-operators(0fbf81b5-1f99-402f-b486-0801f63077e4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:58:44 crc kubenswrapper[4747]: E1205 20:58:44.221068 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56" podUID="0fbf81b5-1f99-402f-b486-0801f63077e4" Dec 05 20:58:44 crc kubenswrapper[4747]: I1205 20:58:44.821812 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq"] Dec 05 20:58:44 crc kubenswrapper[4747]: I1205 20:58:44.827747 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9"] Dec 05 20:58:44 crc kubenswrapper[4747]: W1205 20:58:44.893072 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cb7e739_bfdd_4785_98bc_88c011e1f703.slice/crio-a74df75bc12c76aeef304ec0b7434cf322d81efc96187f561cd5c24293d19e3b WatchSource:0}: Error finding container a74df75bc12c76aeef304ec0b7434cf322d81efc96187f561cd5c24293d19e3b: Status 404 returned error can't find the container with id a74df75bc12c76aeef304ec0b7434cf322d81efc96187f561cd5c24293d19e3b Dec 05 20:58:44 crc kubenswrapper[4747]: I1205 20:58:44.913223 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl"] Dec 05 20:58:45 crc kubenswrapper[4747]: W1205 20:58:45.055021 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9438f8d5_2c04_4bd1_9e4b_cc29d4565085.slice/crio-35da48908af7222f400eeb844cd7982b50549344a1ea702d7854a390da053360 WatchSource:0}: Error finding container 35da48908af7222f400eeb844cd7982b50549344a1ea702d7854a390da053360: Status 404 returned error can't find the container with id 35da48908af7222f400eeb844cd7982b50549344a1ea702d7854a390da053360 Dec 05 20:58:45 crc kubenswrapper[4747]: I1205 20:58:45.636074 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lcwxh" event={"ID":"6b18b68e-5851-4b54-a953-aa0f9151f191","Type":"ContainerStarted","Data":"8afedfd34b51864a8a00474dfce4a0c2e71c802814ec67e7ee44140fcf22fcec"} Dec 05 20:58:45 crc kubenswrapper[4747]: I1205 20:58:45.642193 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" event={"ID":"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6","Type":"ContainerStarted","Data":"a288e01cb09eb4fdb32c8813a032cafb4083cb391e008e9b506221407f560719"} Dec 05 20:58:45 crc kubenswrapper[4747]: I1205 20:58:45.649026 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7k88v" event={"ID":"cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd","Type":"ContainerStarted","Data":"4ab6050a1d9ffab026a37d985d67163e4908a65927cfed452784a5498a517e57"} Dec 05 20:58:45 crc kubenswrapper[4747]: I1205 20:58:45.650229 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" event={"ID":"2cb7e739-bfdd-4785-98bc-88c011e1f703","Type":"ContainerStarted","Data":"a74df75bc12c76aeef304ec0b7434cf322d81efc96187f561cd5c24293d19e3b"} Dec 05 20:58:45 crc kubenswrapper[4747]: I1205 20:58:45.651742 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8ltc2" event={"ID":"fe32c200-1897-4d84-9eeb-ec5c5e7b2cd4","Type":"ContainerStarted","Data":"8179b90febfccfc2889760f3d48d23434d057125210ce7a8c6b52ab61c4d1074"} Dec 05 20:58:45 crc kubenswrapper[4747]: I1205 20:58:45.658128 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" event={"ID":"9438f8d5-2c04-4bd1-9e4b-cc29d4565085","Type":"ContainerStarted","Data":"35da48908af7222f400eeb844cd7982b50549344a1ea702d7854a390da053360"} Dec 05 20:58:45 crc kubenswrapper[4747]: I1205 20:58:45.670893 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f7fp2" event={"ID":"54d11a3d-e575-4a91-8856-97ab3f4adb1c","Type":"ContainerStarted","Data":"8344c3402b283146dd5b0135f1dbd9e0809f0b84e2d883eac55ef81d3f7af8ee"} Dec 05 20:58:45 crc kubenswrapper[4747]: I1205 20:58:45.679624 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-8c9z4" event={"ID":"60cd02a8-1ab2-4037-b5d8-e479875ce1db","Type":"ContainerStarted","Data":"fb3839dbdf6c9700522b46f2bba57c6520188e21f4de2b7c240596565b36722d"} Dec 05 20:58:45 crc kubenswrapper[4747]: I1205 20:58:45.703495 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-llksb" event={"ID":"0656c174-0497-42ca-9abe-b7c50e82bdec","Type":"ContainerStarted","Data":"971aeb1ccac1fdb6fd6acef620f7166ce444e9327d0ce6abd4033cea89a968d0"} Dec 05 20:58:45 crc kubenswrapper[4747]: I1205 20:58:45.710156 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6qc7c" event={"ID":"ff9bc6be-0b60-4c4b-a692-5de429d22c0d","Type":"ContainerStarted","Data":"944999c4637c854d47f53ef0537ff9d59af7177a4d1aef788a6e9ac09c007a75"} Dec 05 20:58:45 crc kubenswrapper[4747]: I1205 20:58:45.712483 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt" event={"ID":"4ac7a709-1eb7-4f8f-ac46-3a404b2171a7","Type":"ContainerStarted","Data":"6a11fef14e4e5dcb4bedee6faf1c9e27b8af6e5220815e8793ce80e55a78d611"} Dec 05 20:58:45 crc kubenswrapper[4747]: I1205 20:58:45.730628 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs" event={"ID":"be975bb9-2086-4fde-8c71-9c7f56eaf8ea","Type":"ContainerStarted","Data":"5b70baafae7ec2c11dfa54239a3ccb47a4782646af414815bf17637a9438ac05"} Dec 05 20:58:45 crc kubenswrapper[4747]: I1205 20:58:45.737776 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-p9kpn" event={"ID":"376f6574-b930-4baa-98c4-d4b47f3b7e76","Type":"ContainerStarted","Data":"3a3e773fa351d8bc3edcd596f7f38ff0d40152028a93d382b88d9ce55b5c92ea"} Dec 05 20:58:45 crc kubenswrapper[4747]: I1205 20:58:45.753504 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lc7bg" event={"ID":"391b5a9d-822c-4980-9eb0-376dd5d44126","Type":"ContainerStarted","Data":"63cb86025cb26d2286855ac3bf5af93f1321b756164cbd9102bd8ab3ffa0a568"} Dec 05 20:58:45 crc kubenswrapper[4747]: E1205 20:58:45.882207 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rr2nj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
nova-operator-controller-manager-697bc559fc-42pxn_openstack-operators(3659825b-dd9f-40b9-a17a-3c931805fe9c): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 05 20:58:45 crc kubenswrapper[4747]: E1205 20:58:45.885723 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn" podUID="3659825b-dd9f-40b9-a17a-3c931805fe9c"
Dec 05 20:58:46 crc kubenswrapper[4747]: I1205 20:58:46.762669 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm" event={"ID":"44d3f632-8fb0-4763-adaa-0f54fdb8386f","Type":"ContainerStarted","Data":"cf1564e93f9363251bc4b03363c9a50cf7c612efcaa38b19eeffa46ccc1afa8a"}
Dec 05 20:58:46 crc kubenswrapper[4747]: I1205 20:58:46.765050 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn" event={"ID":"3659825b-dd9f-40b9-a17a-3c931805fe9c","Type":"ContainerStarted","Data":"07b85af8be11ccc14fa5060c5976d550564dce068c28409a9c3b0b3abd37d6f8"}
Dec 05 20:58:46 crc kubenswrapper[4747]: I1205 20:58:46.765876 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn"
Dec 05 20:58:46 crc kubenswrapper[4747]: E1205 20:58:46.766822 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn" podUID="3659825b-dd9f-40b9-a17a-3c931805fe9c"
Dec 05 20:58:47 crc kubenswrapper[4747]: I1205 20:58:47.786762 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg" event={"ID":"f46e2df3-4e49-4818-91d9-1181d78e46f9","Type":"ContainerStarted","Data":"bbd59f4ef732e39a95a07b068ca653049f0b659c65aca9dbb378372b66f2f87a"}
Dec 05 20:58:47 crc kubenswrapper[4747]: I1205 20:58:47.806009 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" event={"ID":"fbdd4eb9-cffd-44b7-9b05-da1328de2fe6","Type":"ContainerStarted","Data":"ea563abf93afc808f96e2e144fb77ee79f660d7a736d1316621502b1d0c74a50"}
Dec 05 20:58:47 crc kubenswrapper[4747]: I1205 20:58:47.806049 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9"
Dec 05 20:58:47 crc kubenswrapper[4747]: I1205 20:58:47.863932 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" podStartSLOduration=32.863913877 podStartE2EDuration="32.863913877s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:58:47.851851497 +0000 UTC m=+998.319158995" watchObservedRunningTime="2025-12-05 20:58:47.863913877 +0000 UTC m=+998.331221365"
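The two pull-failure modes above are distinct: "context canceled" means an in-flight pull was aborted while the CRI layer was still copying the image config, whereas "pull QPS exceeded" comes from the kubelet's client-side token-bucket limit on pull requests (the registryPullQPS and registryBurst kubelet settings; the usual defaults are 5 pulls/s with a burst of 10). A failed pull then puts the container into ImagePullBackOff, retried with exponential backoff (by default starting around 10s and capping at 5m). A minimal sketch of the token-bucket behaviour, assuming those default limits (a standalone illustration, not kubelet source):

    package main

    import (
    	"fmt"
    	"time"
    )

    // Minimal token bucket mirroring the kubelet's client-side image-pull
    // limiter (registryPullQPS / registryBurst). The 5 QPS / burst 10 values
    // used in main are the common defaults, not read from this node's config.
    type tokenBucket struct {
    	tokens, max, rate float64 // rate = tokens replenished per second
    	last              time.Time
    }

    func newTokenBucket(qps float64, burst int) *tokenBucket {
    	return &tokenBucket{tokens: float64(burst), max: float64(burst), rate: qps, last: time.Now()}
    }

    // tryAccept returns false when no token is available, which is the
    // condition the kubelet surfaces as ErrImagePull: "pull QPS exceeded".
    func (b *tokenBucket) tryAccept() bool {
    	now := time.Now()
    	b.tokens += b.rate * now.Sub(b.last).Seconds()
    	if b.tokens > b.max {
    		b.tokens = b.max
    	}
    	b.last = now
    	if b.tokens >= 1 {
    		b.tokens--
    		return true
    	}
    	return false
    }

    func main() {
    	limiter := newTokenBucket(5, 10)
    	rejected := 0
    	for i := 0; i < 30; i++ { // ~30 operator images requested almost at once
    		if !limiter.tryAccept() {
    			rejected++
    		}
    	}
    	fmt.Printf("%d of 30 immediate pulls rejected\n", rejected) // 20 of 30
    }

With roughly twenty operator images requested in the same sync window, exhausting a burst of 10 is expected; the ContainerStarted events that follow show the backed-off retries eventually succeeding.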
Dec 05 20:58:49 crc kubenswrapper[4747]: E1205 20:58:49.256307 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kz2bv" podUID="cb5a89d9-1303-4cbc-9641-8be514157ed0"
Dec 05 20:58:49 crc kubenswrapper[4747]: E1205 20:58:49.262693 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-7lrth" podUID="0d3e833b-6977-47a1-8ba3-b09d093f303c"
Dec 05 20:58:49 crc kubenswrapper[4747]: E1205 20:58:49.272958 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tw5n8" podUID="af277afe-81a7-4ca7-8e88-dee226ed11a3"
Dec 05 20:58:49 crc kubenswrapper[4747]: E1205 20:58:49.654510 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s2xls" podUID="4157fb94-c6ec-4eba-9020-50efd318640f"
Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.864480 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn" event={"ID":"3659825b-dd9f-40b9-a17a-3c931805fe9c","Type":"ContainerStarted","Data":"13ac7cf0665ce7a86b63ac94771cd6882b3a1767edd5409519f6ad0ef0a4622c"}
Dec 05 20:58:49 crc kubenswrapper[4747]: E1205 20:58:49.870115 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lgxh8" podUID="4fa9efa6-f9bd-47d1-b27c-701f742537f8"
Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.870288 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8ltc2"
Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.873489 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f7fp2" event={"ID":"54d11a3d-e575-4a91-8856-97ab3f4adb1c","Type":"ContainerStarted","Data":"6991c9ad1b5164fe96d0c655b6ee8eec4d5d680b76c35a86432604d7856e2586"}
Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.874173 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f7fp2"
Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.882326 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tw5n8" event={"ID":"af277afe-81a7-4ca7-8e88-dee226ed11a3","Type":"ContainerStarted","Data":"710b3dbd8a1e0ebecc1ba73fc960fca283a2f226cbe97615ee8b7aae281fcd3a"}
Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.898417 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kz2bv" event={"ID":"cb5a89d9-1303-4cbc-9641-8be514157ed0","Type":"ContainerStarted","Data":"ce0245e8b3d66f079df03a9703306327eed584d39011c7271b64b87aace5d084"}
Dec 05 20:58:49 crc
kubenswrapper[4747]: I1205 20:58:49.912158 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-7lrth" event={"ID":"0d3e833b-6977-47a1-8ba3-b09d093f303c","Type":"ContainerStarted","Data":"15df89c665f98da45c1988699913a9909c32b22bf60a7c0c1dbddf305cb06ed4"} Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.918388 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lc7bg" event={"ID":"391b5a9d-822c-4980-9eb0-376dd5d44126","Type":"ContainerStarted","Data":"d210d1a4016668acc839fd587f5d01399dd04126a324b6db5e62bfd3316f84ea"} Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.918565 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lc7bg" Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.931034 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7k88v" event={"ID":"cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd","Type":"ContainerStarted","Data":"374d707b8f0ad0b58de7549665ec08d9606145740eea5e2dec2b76d106cca0ba"} Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.931537 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7k88v" Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.936905 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-8c9z4" event={"ID":"60cd02a8-1ab2-4037-b5d8-e479875ce1db","Type":"ContainerStarted","Data":"fecae147280e8408f42afeded6f9e3ad08697ba19fd6535d78aa6ef065dc554b"} Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.937598 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-8c9z4" Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.942623 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lcwxh" event={"ID":"6b18b68e-5851-4b54-a953-aa0f9151f191","Type":"ContainerStarted","Data":"b2668d3941b235b4989fa533ac07b4c3e7b340d9d82953f926fca800066a8ce2"} Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.943258 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lcwxh" Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.946124 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f7fp2" podStartSLOduration=4.442339733 podStartE2EDuration="34.94610516s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:17.004852294 +0000 UTC m=+967.472159782" lastFinishedPulling="2025-12-05 20:58:47.508617721 +0000 UTC m=+997.975925209" observedRunningTime="2025-12-05 20:58:49.941407737 +0000 UTC m=+1000.408715255" watchObservedRunningTime="2025-12-05 20:58:49.94610516 +0000 UTC m=+1000.413412648" Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.946866 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg" 
event={"ID":"f46e2df3-4e49-4818-91d9-1181d78e46f9","Type":"ContainerStarted","Data":"a8caac623fbf10574a608f5e7f77641fc22ab8c4435be2ea2ae0cfb454b5dc8a"} Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.947522 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg" Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.961645 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs" event={"ID":"be975bb9-2086-4fde-8c71-9c7f56eaf8ea","Type":"ContainerStarted","Data":"c5dd9aaf1090597d2dce429db597eebbbdeafdacf8efef98c87159414b522ea1"} Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.962313 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs" Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.968016 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm" event={"ID":"44d3f632-8fb0-4763-adaa-0f54fdb8386f","Type":"ContainerStarted","Data":"ec23447af85c6107878806569a627a0e1dd6e6c2c2a3b1d31a735c577e208226"} Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.968617 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm" Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.971325 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lc7bg" podStartSLOduration=3.175804668 podStartE2EDuration="34.971307296s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:17.039490347 +0000 UTC m=+967.506797825" lastFinishedPulling="2025-12-05 20:58:48.834992965 +0000 UTC m=+999.302300453" observedRunningTime="2025-12-05 20:58:49.971014799 +0000 UTC m=+1000.438322287" watchObservedRunningTime="2025-12-05 20:58:49.971307296 +0000 UTC m=+1000.438614784" Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.981137 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-llksb" event={"ID":"0656c174-0497-42ca-9abe-b7c50e82bdec","Type":"ContainerStarted","Data":"4fc09f7778a94d4967871cebdeceb263d52bbc83cd49e97f44ab551e4ad492b2"} Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.982007 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-llksb" Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.993348 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" event={"ID":"2cb7e739-bfdd-4785-98bc-88c011e1f703","Type":"ContainerStarted","Data":"e271825e8f4f749e6a75ccc57ca7a7182d39c8fddd8c53457ba31ef22595cde6"} Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.993400 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" event={"ID":"2cb7e739-bfdd-4785-98bc-88c011e1f703","Type":"ContainerStarted","Data":"394880500c3c0f3f958bb9df269e4f22ae791244ed8c3792fd3907364edef0b9"} Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.994069 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" Dec 05 20:58:49 crc kubenswrapper[4747]: I1205 20:58:49.995173 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" event={"ID":"9438f8d5-2c04-4bd1-9e4b-cc29d4565085","Type":"ContainerStarted","Data":"76328f7f3717af4293e5eaaaef3b866e15728ab23d54723a78cd95bd1b3cb2d8"} Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.003743 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-p9kpn" event={"ID":"376f6574-b930-4baa-98c4-d4b47f3b7e76","Type":"ContainerStarted","Data":"dc29f2fad95b8584549c6accf029d84cad422da5bd1a774b49ce5d33dad8d2d7"} Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.004788 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-p9kpn" Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.011686 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-p9kpn" Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.019334 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6qc7c" event={"ID":"ff9bc6be-0b60-4c4b-a692-5de429d22c0d","Type":"ContainerStarted","Data":"d80891b6d961f13528a47a1e45a84ef1b1ae0f5df0c99cc517ea2727b49d30fc"} Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.020398 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6qc7c" Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.026158 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6qc7c" Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.030763 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s2xls" event={"ID":"4157fb94-c6ec-4eba-9020-50efd318640f","Type":"ContainerStarted","Data":"0d0e605eb46a8e569fba09fe045474ddaaf759a72750257d5cc56f7767abec34"} Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.053026 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt" event={"ID":"4ac7a709-1eb7-4f8f-ac46-3a404b2171a7","Type":"ContainerStarted","Data":"c95f6febb6d6dbdf052333eec3cbdbc92689e78d7eb2d76205aa84307fc90677"} Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.053606 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt" Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.055633 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt" Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.070258 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8ltc2" podStartSLOduration=3.305337835 podStartE2EDuration="35.070236596s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:17.168602893 +0000 UTC m=+967.635910381" 
lastFinishedPulling="2025-12-05 20:58:48.933501654 +0000 UTC m=+999.400809142" observedRunningTime="2025-12-05 20:58:50.063997986 +0000 UTC m=+1000.531305474" watchObservedRunningTime="2025-12-05 20:58:50.070236596 +0000 UTC m=+1000.537544084" Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.094734 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn" podStartSLOduration=7.967048485 podStartE2EDuration="35.094704205s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:17.190327305 +0000 UTC m=+967.657634793" lastFinishedPulling="2025-12-05 20:58:44.317983025 +0000 UTC m=+994.785290513" observedRunningTime="2025-12-05 20:58:50.08702632 +0000 UTC m=+1000.554333808" watchObservedRunningTime="2025-12-05 20:58:50.094704205 +0000 UTC m=+1000.562011733" Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.168619 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lcwxh" podStartSLOduration=4.092200687 podStartE2EDuration="36.168604502s" podCreationTimestamp="2025-12-05 20:58:14 +0000 UTC" firstStartedPulling="2025-12-05 20:58:16.896171779 +0000 UTC m=+967.363479257" lastFinishedPulling="2025-12-05 20:58:48.972575584 +0000 UTC m=+999.439883072" observedRunningTime="2025-12-05 20:58:50.164546295 +0000 UTC m=+1000.631853783" watchObservedRunningTime="2025-12-05 20:58:50.168604502 +0000 UTC m=+1000.635911990" Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.198827 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-8mfrt" podStartSLOduration=3.568991696 podStartE2EDuration="35.198810989s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:17.179778121 +0000 UTC m=+967.647085609" lastFinishedPulling="2025-12-05 20:58:48.809597414 +0000 UTC m=+999.276904902" observedRunningTime="2025-12-05 20:58:50.195190552 +0000 UTC m=+1000.662498050" watchObservedRunningTime="2025-12-05 20:58:50.198810989 +0000 UTC m=+1000.666118477" Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.224520 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7k88v" podStartSLOduration=4.895124105 podStartE2EDuration="35.224499987s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:17.181468572 +0000 UTC m=+967.648776060" lastFinishedPulling="2025-12-05 20:58:47.510844454 +0000 UTC m=+997.978151942" observedRunningTime="2025-12-05 20:58:50.221007683 +0000 UTC m=+1000.688315171" watchObservedRunningTime="2025-12-05 20:58:50.224499987 +0000 UTC m=+1000.691807475" Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.251367 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs" podStartSLOduration=3.5157701770000003 podStartE2EDuration="35.251353423s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:17.170809916 +0000 UTC m=+967.638117404" lastFinishedPulling="2025-12-05 20:58:48.906393152 +0000 UTC m=+999.373700650" observedRunningTime="2025-12-05 20:58:50.246950067 +0000 UTC m=+1000.714257555" watchObservedRunningTime="2025-12-05 20:58:50.251353423 +0000 UTC m=+1000.718660911" Dec 05 20:58:50 
crc kubenswrapper[4747]: I1205 20:58:50.271049 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-llksb" podStartSLOduration=4.642400336 podStartE2EDuration="35.271032736s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:16.865466521 +0000 UTC m=+967.332774009" lastFinishedPulling="2025-12-05 20:58:47.494098921 +0000 UTC m=+997.961406409" observedRunningTime="2025-12-05 20:58:50.266232401 +0000 UTC m=+1000.733539889" watchObservedRunningTime="2025-12-05 20:58:50.271032736 +0000 UTC m=+1000.738340224"
Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.295448 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-p9kpn" podStartSLOduration=5.221429771 podStartE2EDuration="36.295431743s" podCreationTimestamp="2025-12-05 20:58:14 +0000 UTC" firstStartedPulling="2025-12-05 20:58:16.419331471 +0000 UTC m=+966.886638959" lastFinishedPulling="2025-12-05 20:58:47.493333443 +0000 UTC m=+997.960640931" observedRunningTime="2025-12-05 20:58:50.290701599 +0000 UTC m=+1000.758009087" watchObservedRunningTime="2025-12-05 20:58:50.295431743 +0000 UTC m=+1000.762739231"
Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.329311 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" podStartSLOduration=31.328201087 podStartE2EDuration="35.329280577s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:44.901841069 +0000 UTC m=+995.369148567" lastFinishedPulling="2025-12-05 20:58:48.902920559 +0000 UTC m=+999.370228057" observedRunningTime="2025-12-05 20:58:50.320166508 +0000 UTC m=+1000.787474016" watchObservedRunningTime="2025-12-05 20:58:50.329280577 +0000 UTC m=+1000.796588065"
Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.376649 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg" podStartSLOduration=8.146194204 podStartE2EDuration="35.376628966s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:17.220336997 +0000 UTC m=+967.687644485" lastFinishedPulling="2025-12-05 20:58:44.450771759 +0000 UTC m=+994.918079247" observedRunningTime="2025-12-05 20:58:50.352027554 +0000 UTC m=+1000.819335042" watchObservedRunningTime="2025-12-05 20:58:50.376628966 +0000 UTC m=+1000.843936454"
Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.456726 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-8c9z4" podStartSLOduration=3.486579314 podStartE2EDuration="35.456696142s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:16.897604094 +0000 UTC m=+967.364911582" lastFinishedPulling="2025-12-05 20:58:48.867720922 +0000 UTC m=+999.335028410" observedRunningTime="2025-12-05 20:58:50.455309749 +0000 UTC m=+1000.922617237" watchObservedRunningTime="2025-12-05 20:58:50.456696142 +0000 UTC m=+1000.924003630"
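The fields in these "Observed pod startup duration" records relate arithmetically: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling), so pull time is excluded from the SLO number. A quick check against the barbican-operator record above (a sketch that only redoes the arithmetic; pod_startup_latency_tracker.go holds the authoritative bookkeeping):

    package main

    import (
    	"fmt"
    	"time"
    )

    // Reproduces the two durations in the barbican-operator record:
    //   podStartE2EDuration = watchObservedRunningTime - podCreationTimestamp
    //   podStartSLOduration = podStartE2EDuration - (lastFinishedPulling - firstStartedPulling)
    func main() {
    	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
    	parse := func(s string) time.Time {
    		t, err := time.Parse(layout, s)
    		if err != nil {
    			panic(err)
    		}
    		return t
    	}
    	created := parse("2025-12-05 20:58:14 +0000 UTC")
    	firstPull := parse("2025-12-05 20:58:16.419331471 +0000 UTC")
    	lastPull := parse("2025-12-05 20:58:47.493333443 +0000 UTC")
    	running := parse("2025-12-05 20:58:50.295431743 +0000 UTC")

    	e2e := running.Sub(created)
    	slo := e2e - lastPull.Sub(firstPull)
    	fmt.Println("podStartE2EDuration:", e2e) // 36.295431743s
    	fmt.Println("podStartSLOduration:", slo) // 5.221429771s
    }

Both outputs match the logged values, which also explains the long SLO durations on pods whose firstStartedPulling is the zero time: with no recorded pull window, the full end-to-end time is charged to the SLO figure.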
podStartE2EDuration="36.46492774s" podCreationTimestamp="2025-12-05 20:58:14 +0000 UTC" firstStartedPulling="2025-12-05 20:58:16.895697238 +0000 UTC m=+967.363004726" lastFinishedPulling="2025-12-05 20:58:48.933407072 +0000 UTC m=+999.400714570" observedRunningTime="2025-12-05 20:58:50.404558618 +0000 UTC m=+1000.871866106" watchObservedRunningTime="2025-12-05 20:58:50.46492774 +0000 UTC m=+1000.932235228" Dec 05 20:58:50 crc kubenswrapper[4747]: I1205 20:58:50.507341 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm" podStartSLOduration=3.784814378 podStartE2EDuration="35.50732194s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:17.204915106 +0000 UTC m=+967.672222594" lastFinishedPulling="2025-12-05 20:58:48.927422668 +0000 UTC m=+999.394730156" observedRunningTime="2025-12-05 20:58:50.507026983 +0000 UTC m=+1000.974334471" watchObservedRunningTime="2025-12-05 20:58:50.50732194 +0000 UTC m=+1000.974629428" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.063127 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" event={"ID":"9438f8d5-2c04-4bd1-9e4b-cc29d4565085","Type":"ContainerStarted","Data":"97041670fab37684bc13f2a2021ada88f811fc42986263db75b20be412ec1dc5"} Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.063617 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.065801 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-7lrth" event={"ID":"0d3e833b-6977-47a1-8ba3-b09d093f303c","Type":"ContainerStarted","Data":"438537dfaf11dca461bd27adde5b7045505da76a3383529b57055a4c09416ca6"} Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.065966 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-7lrth" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.068091 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tw5n8" event={"ID":"af277afe-81a7-4ca7-8e88-dee226ed11a3","Type":"ContainerStarted","Data":"36e6c46420b312b74ce97fb4e627096182cd65c17064c5c0aeda1273c59ef000"} Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.068206 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tw5n8" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.070018 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lgxh8" event={"ID":"4fa9efa6-f9bd-47d1-b27c-701f742537f8","Type":"ContainerStarted","Data":"110ce477e4038c5908727765640b66698155cecf6434d9f2ea5a9e3c29b16079"} Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.072045 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s2xls" event={"ID":"4157fb94-c6ec-4eba-9020-50efd318640f","Type":"ContainerStarted","Data":"419bf1bc14334e6625b539d226265d0aa740e07d668353ac3c074735a83d958b"} Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.072502 4747 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s2xls" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.078236 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8ltc2" event={"ID":"fe32c200-1897-4d84-9eeb-ec5c5e7b2cd4","Type":"ContainerStarted","Data":"47ccc90a8aefb8acd1d9f82d7d2777afc3d42a44c534963928eb919e672d3061"} Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.082154 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kz2bv" event={"ID":"cb5a89d9-1303-4cbc-9641-8be514157ed0","Type":"ContainerStarted","Data":"52cbb32bdc0ed40cf5348e785ce0aef5a48aff550c5256094207002bf0fb05b5"} Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.082204 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kz2bv" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.086090 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8ltc2" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.087142 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f7fp2" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.087937 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ml4vm" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.088169 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7k88v" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.088474 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-llksb" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.088703 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-dvpxs" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.088862 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-8c9z4" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.089042 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lcwxh" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.093230 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lc7bg" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.140023 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" podStartSLOduration=32.332452363 podStartE2EDuration="36.140006878s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:45.060536986 +0000 UTC m=+995.527844474" lastFinishedPulling="2025-12-05 20:58:48.868091501 +0000 UTC m=+999.335398989" observedRunningTime="2025-12-05 20:58:51.113130831 +0000 UTC m=+1001.580438319" 
watchObservedRunningTime="2025-12-05 20:58:51.140006878 +0000 UTC m=+1001.607314366" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.232166 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-7lrth" podStartSLOduration=2.682033883 podStartE2EDuration="36.232144844s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:16.827872408 +0000 UTC m=+967.295179906" lastFinishedPulling="2025-12-05 20:58:50.377983389 +0000 UTC m=+1000.845290867" observedRunningTime="2025-12-05 20:58:51.225478444 +0000 UTC m=+1001.692785932" watchObservedRunningTime="2025-12-05 20:58:51.232144844 +0000 UTC m=+1001.699452332" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.305973 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s2xls" podStartSLOduration=2.633526646 podStartE2EDuration="36.30595538s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:17.034705632 +0000 UTC m=+967.502013120" lastFinishedPulling="2025-12-05 20:58:50.707134366 +0000 UTC m=+1001.174441854" observedRunningTime="2025-12-05 20:58:51.302313372 +0000 UTC m=+1001.769620860" watchObservedRunningTime="2025-12-05 20:58:51.30595538 +0000 UTC m=+1001.773262868" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.323068 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kz2bv" podStartSLOduration=3.102814144 podStartE2EDuration="36.323038351s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:17.157871914 +0000 UTC m=+967.625179402" lastFinishedPulling="2025-12-05 20:58:50.378096131 +0000 UTC m=+1000.845403609" observedRunningTime="2025-12-05 20:58:51.319851374 +0000 UTC m=+1001.787158862" watchObservedRunningTime="2025-12-05 20:58:51.323038351 +0000 UTC m=+1001.790345839" Dec 05 20:58:51 crc kubenswrapper[4747]: I1205 20:58:51.452321 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tw5n8" podStartSLOduration=2.995278978 podStartE2EDuration="36.45230356s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:16.895675338 +0000 UTC m=+967.362982826" lastFinishedPulling="2025-12-05 20:58:50.35269992 +0000 UTC m=+1000.820007408" observedRunningTime="2025-12-05 20:58:51.446837478 +0000 UTC m=+1001.914144966" watchObservedRunningTime="2025-12-05 20:58:51.45230356 +0000 UTC m=+1001.919611048" Dec 05 20:58:52 crc kubenswrapper[4747]: I1205 20:58:52.094842 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lgxh8" event={"ID":"4fa9efa6-f9bd-47d1-b27c-701f742537f8","Type":"ContainerStarted","Data":"36123fd1b5164f40b017dadb8b23b4e7381c8e2197b9607f4da4b44ba36a6cbe"} Dec 05 20:58:52 crc kubenswrapper[4747]: I1205 20:58:52.098419 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hjzcg" Dec 05 20:58:52 crc kubenswrapper[4747]: I1205 20:58:52.103825 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-rsqv9" Dec 05 20:58:52 crc kubenswrapper[4747]: I1205 
20:58:52.119803 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lgxh8" podStartSLOduration=3.150104968 podStartE2EDuration="38.119782565s" podCreationTimestamp="2025-12-05 20:58:14 +0000 UTC" firstStartedPulling="2025-12-05 20:58:16.505412341 +0000 UTC m=+966.972719829" lastFinishedPulling="2025-12-05 20:58:51.475089938 +0000 UTC m=+1001.942397426" observedRunningTime="2025-12-05 20:58:52.113858423 +0000 UTC m=+1002.581165931" watchObservedRunningTime="2025-12-05 20:58:52.119782565 +0000 UTC m=+1002.587090053" Dec 05 20:58:53 crc kubenswrapper[4747]: I1205 20:58:53.112088 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lgxh8" Dec 05 20:58:55 crc kubenswrapper[4747]: I1205 20:58:55.400147 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tw5n8" Dec 05 20:58:55 crc kubenswrapper[4747]: I1205 20:58:55.706288 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s2xls" Dec 05 20:58:55 crc kubenswrapper[4747]: I1205 20:58:55.798065 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-7lrth" Dec 05 20:58:55 crc kubenswrapper[4747]: E1205 20:58:55.841046 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56" podUID="0fbf81b5-1f99-402f-b486-0801f63077e4" Dec 05 20:58:55 crc kubenswrapper[4747]: I1205 20:58:55.941743 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-42pxn" Dec 05 20:58:55 crc kubenswrapper[4747]: I1205 20:58:55.975955 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kz2bv" Dec 05 20:59:01 crc kubenswrapper[4747]: I1205 20:59:01.194614 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-n9bsq" Dec 05 20:59:01 crc kubenswrapper[4747]: I1205 20:59:01.627691 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5dbwbl" Dec 05 20:59:05 crc kubenswrapper[4747]: I1205 20:59:05.322738 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lgxh8" Dec 05 20:59:07 crc kubenswrapper[4747]: I1205 20:59:07.842513 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:59:13 crc kubenswrapper[4747]: I1205 20:59:13.276342 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56" event={"ID":"0fbf81b5-1f99-402f-b486-0801f63077e4","Type":"ContainerStarted","Data":"1cc629f14955ac1b4f61cd180a1d2fec19fb697f8af24fa5a46df54a7a8c254e"} Dec 05 
20:59:13 crc kubenswrapper[4747]: I1205 20:59:13.303049 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7mb56" podStartSLOduration=2.653074987 podStartE2EDuration="58.303027935s" podCreationTimestamp="2025-12-05 20:58:15 +0000 UTC" firstStartedPulling="2025-12-05 20:58:17.218574265 +0000 UTC m=+967.685881753" lastFinishedPulling="2025-12-05 20:59:12.868527193 +0000 UTC m=+1023.335834701" observedRunningTime="2025-12-05 20:59:13.296740163 +0000 UTC m=+1023.764047661" watchObservedRunningTime="2025-12-05 20:59:13.303027935 +0000 UTC m=+1023.770335443" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.381703 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-ts844"] Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.384037 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-ts844" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.390067 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.390326 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.390692 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-r5ng6" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.391670 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.403219 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-ts844"] Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.445574 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b728720-52af-4c9a-b411-e64b9d0b8249-config\") pod \"dnsmasq-dns-5cd484bb89-ts844\" (UID: \"4b728720-52af-4c9a-b411-e64b9d0b8249\") " pod="openstack/dnsmasq-dns-5cd484bb89-ts844" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.445679 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnhww\" (UniqueName: \"kubernetes.io/projected/4b728720-52af-4c9a-b411-e64b9d0b8249-kube-api-access-tnhww\") pod \"dnsmasq-dns-5cd484bb89-ts844\" (UID: \"4b728720-52af-4c9a-b411-e64b9d0b8249\") " pod="openstack/dnsmasq-dns-5cd484bb89-ts844" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.474528 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-567c455747-dxcxj"] Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.475746 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-dxcxj" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.477947 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.496208 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-dxcxj"] Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.546823 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b728720-52af-4c9a-b411-e64b9d0b8249-config\") pod \"dnsmasq-dns-5cd484bb89-ts844\" (UID: \"4b728720-52af-4c9a-b411-e64b9d0b8249\") " pod="openstack/dnsmasq-dns-5cd484bb89-ts844" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.546873 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-config\") pod \"dnsmasq-dns-567c455747-dxcxj\" (UID: \"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61\") " pod="openstack/dnsmasq-dns-567c455747-dxcxj" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.546913 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnhww\" (UniqueName: \"kubernetes.io/projected/4b728720-52af-4c9a-b411-e64b9d0b8249-kube-api-access-tnhww\") pod \"dnsmasq-dns-5cd484bb89-ts844\" (UID: \"4b728720-52af-4c9a-b411-e64b9d0b8249\") " pod="openstack/dnsmasq-dns-5cd484bb89-ts844" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.546940 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgvjq\" (UniqueName: \"kubernetes.io/projected/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-kube-api-access-qgvjq\") pod \"dnsmasq-dns-567c455747-dxcxj\" (UID: \"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61\") " pod="openstack/dnsmasq-dns-567c455747-dxcxj" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.546959 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-dns-svc\") pod \"dnsmasq-dns-567c455747-dxcxj\" (UID: \"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61\") " pod="openstack/dnsmasq-dns-567c455747-dxcxj" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.547841 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b728720-52af-4c9a-b411-e64b9d0b8249-config\") pod \"dnsmasq-dns-5cd484bb89-ts844\" (UID: \"4b728720-52af-4c9a-b411-e64b9d0b8249\") " pod="openstack/dnsmasq-dns-5cd484bb89-ts844" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.571306 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnhww\" (UniqueName: \"kubernetes.io/projected/4b728720-52af-4c9a-b411-e64b9d0b8249-kube-api-access-tnhww\") pod \"dnsmasq-dns-5cd484bb89-ts844\" (UID: \"4b728720-52af-4c9a-b411-e64b9d0b8249\") " pod="openstack/dnsmasq-dns-5cd484bb89-ts844" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.647802 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-config\") pod \"dnsmasq-dns-567c455747-dxcxj\" (UID: \"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61\") " pod="openstack/dnsmasq-dns-567c455747-dxcxj" Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 
Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.647883 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgvjq\" (UniqueName: \"kubernetes.io/projected/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-kube-api-access-qgvjq\") pod \"dnsmasq-dns-567c455747-dxcxj\" (UID: \"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61\") " pod="openstack/dnsmasq-dns-567c455747-dxcxj"
Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.647910 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-dns-svc\") pod \"dnsmasq-dns-567c455747-dxcxj\" (UID: \"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61\") " pod="openstack/dnsmasq-dns-567c455747-dxcxj"
Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.648901 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-config\") pod \"dnsmasq-dns-567c455747-dxcxj\" (UID: \"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61\") " pod="openstack/dnsmasq-dns-567c455747-dxcxj"
Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.648966 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-dns-svc\") pod \"dnsmasq-dns-567c455747-dxcxj\" (UID: \"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61\") " pod="openstack/dnsmasq-dns-567c455747-dxcxj"
Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.678701 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgvjq\" (UniqueName: \"kubernetes.io/projected/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-kube-api-access-qgvjq\") pod \"dnsmasq-dns-567c455747-dxcxj\" (UID: \"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61\") " pod="openstack/dnsmasq-dns-567c455747-dxcxj"
Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.715884 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-ts844"
Dec 05 20:59:28 crc kubenswrapper[4747]: I1205 20:59:28.789650 4747 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-dxcxj" Dec 05 20:59:29 crc kubenswrapper[4747]: I1205 20:59:29.188408 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-ts844"] Dec 05 20:59:29 crc kubenswrapper[4747]: I1205 20:59:29.276148 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-dxcxj"] Dec 05 20:59:29 crc kubenswrapper[4747]: I1205 20:59:29.400738 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-ts844" event={"ID":"4b728720-52af-4c9a-b411-e64b9d0b8249","Type":"ContainerStarted","Data":"0f49f3a454653de1ebf85f24b7e08d87c8b4647e7a1b86ee622f50945c581c29"} Dec 05 20:59:29 crc kubenswrapper[4747]: I1205 20:59:29.402207 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-dxcxj" event={"ID":"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61","Type":"ContainerStarted","Data":"2036b9b808827a4b879b8850829bdde9816935a7d45948926dc423cf1eef5133"} Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.166733 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-dxcxj"] Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.204883 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-859d485f47-h9ph7"] Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.206103 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-859d485f47-h9ph7" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.232219 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-859d485f47-h9ph7"] Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.320917 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c2f197-894b-471b-a5c1-968b7b951427-config\") pod \"dnsmasq-dns-859d485f47-h9ph7\" (UID: \"62c2f197-894b-471b-a5c1-968b7b951427\") " pod="openstack/dnsmasq-dns-859d485f47-h9ph7" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.321139 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t58sx\" (UniqueName: \"kubernetes.io/projected/62c2f197-894b-471b-a5c1-968b7b951427-kube-api-access-t58sx\") pod \"dnsmasq-dns-859d485f47-h9ph7\" (UID: \"62c2f197-894b-471b-a5c1-968b7b951427\") " pod="openstack/dnsmasq-dns-859d485f47-h9ph7" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.321385 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62c2f197-894b-471b-a5c1-968b7b951427-dns-svc\") pod \"dnsmasq-dns-859d485f47-h9ph7\" (UID: \"62c2f197-894b-471b-a5c1-968b7b951427\") " pod="openstack/dnsmasq-dns-859d485f47-h9ph7" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.422417 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62c2f197-894b-471b-a5c1-968b7b951427-dns-svc\") pod \"dnsmasq-dns-859d485f47-h9ph7\" (UID: \"62c2f197-894b-471b-a5c1-968b7b951427\") " pod="openstack/dnsmasq-dns-859d485f47-h9ph7" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.422492 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c2f197-894b-471b-a5c1-968b7b951427-config\") pod 
\"dnsmasq-dns-859d485f47-h9ph7\" (UID: \"62c2f197-894b-471b-a5c1-968b7b951427\") " pod="openstack/dnsmasq-dns-859d485f47-h9ph7" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.422535 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t58sx\" (UniqueName: \"kubernetes.io/projected/62c2f197-894b-471b-a5c1-968b7b951427-kube-api-access-t58sx\") pod \"dnsmasq-dns-859d485f47-h9ph7\" (UID: \"62c2f197-894b-471b-a5c1-968b7b951427\") " pod="openstack/dnsmasq-dns-859d485f47-h9ph7" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.423377 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62c2f197-894b-471b-a5c1-968b7b951427-dns-svc\") pod \"dnsmasq-dns-859d485f47-h9ph7\" (UID: \"62c2f197-894b-471b-a5c1-968b7b951427\") " pod="openstack/dnsmasq-dns-859d485f47-h9ph7" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.425625 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c2f197-894b-471b-a5c1-968b7b951427-config\") pod \"dnsmasq-dns-859d485f47-h9ph7\" (UID: \"62c2f197-894b-471b-a5c1-968b7b951427\") " pod="openstack/dnsmasq-dns-859d485f47-h9ph7" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.454517 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t58sx\" (UniqueName: \"kubernetes.io/projected/62c2f197-894b-471b-a5c1-968b7b951427-kube-api-access-t58sx\") pod \"dnsmasq-dns-859d485f47-h9ph7\" (UID: \"62c2f197-894b-471b-a5c1-968b7b951427\") " pod="openstack/dnsmasq-dns-859d485f47-h9ph7" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.492062 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-ts844"] Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.533315 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-859d485f47-h9ph7" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.539507 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb666b895-kbblh"] Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.541364 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-kbblh" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.545712 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-kbblh"] Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.629421 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe619cc-656e-4967-8a21-aaea4560c00b-config\") pod \"dnsmasq-dns-cb666b895-kbblh\" (UID: \"ffe619cc-656e-4967-8a21-aaea4560c00b\") " pod="openstack/dnsmasq-dns-cb666b895-kbblh" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.629529 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffe619cc-656e-4967-8a21-aaea4560c00b-dns-svc\") pod \"dnsmasq-dns-cb666b895-kbblh\" (UID: \"ffe619cc-656e-4967-8a21-aaea4560c00b\") " pod="openstack/dnsmasq-dns-cb666b895-kbblh" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.629647 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbhkv\" (UniqueName: \"kubernetes.io/projected/ffe619cc-656e-4967-8a21-aaea4560c00b-kube-api-access-mbhkv\") pod \"dnsmasq-dns-cb666b895-kbblh\" (UID: \"ffe619cc-656e-4967-8a21-aaea4560c00b\") " pod="openstack/dnsmasq-dns-cb666b895-kbblh" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.733136 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe619cc-656e-4967-8a21-aaea4560c00b-config\") pod \"dnsmasq-dns-cb666b895-kbblh\" (UID: \"ffe619cc-656e-4967-8a21-aaea4560c00b\") " pod="openstack/dnsmasq-dns-cb666b895-kbblh" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.733625 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffe619cc-656e-4967-8a21-aaea4560c00b-dns-svc\") pod \"dnsmasq-dns-cb666b895-kbblh\" (UID: \"ffe619cc-656e-4967-8a21-aaea4560c00b\") " pod="openstack/dnsmasq-dns-cb666b895-kbblh" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.733723 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbhkv\" (UniqueName: \"kubernetes.io/projected/ffe619cc-656e-4967-8a21-aaea4560c00b-kube-api-access-mbhkv\") pod \"dnsmasq-dns-cb666b895-kbblh\" (UID: \"ffe619cc-656e-4967-8a21-aaea4560c00b\") " pod="openstack/dnsmasq-dns-cb666b895-kbblh" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.734521 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe619cc-656e-4967-8a21-aaea4560c00b-config\") pod \"dnsmasq-dns-cb666b895-kbblh\" (UID: \"ffe619cc-656e-4967-8a21-aaea4560c00b\") " pod="openstack/dnsmasq-dns-cb666b895-kbblh" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.734677 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffe619cc-656e-4967-8a21-aaea4560c00b-dns-svc\") pod \"dnsmasq-dns-cb666b895-kbblh\" (UID: \"ffe619cc-656e-4967-8a21-aaea4560c00b\") " pod="openstack/dnsmasq-dns-cb666b895-kbblh" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.778179 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbhkv\" (UniqueName: 
\"kubernetes.io/projected/ffe619cc-656e-4967-8a21-aaea4560c00b-kube-api-access-mbhkv\") pod \"dnsmasq-dns-cb666b895-kbblh\" (UID: \"ffe619cc-656e-4967-8a21-aaea4560c00b\") " pod="openstack/dnsmasq-dns-cb666b895-kbblh" Dec 05 20:59:31 crc kubenswrapper[4747]: I1205 20:59:31.908054 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-kbblh" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.077322 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-859d485f47-h9ph7"] Dec 05 20:59:32 crc kubenswrapper[4747]: W1205 20:59:32.086693 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62c2f197_894b_471b_a5c1_968b7b951427.slice/crio-51aa775ac39a39850cae5e970ca54ca1584cc57f0f0adfa5e2ee34537c9183c4 WatchSource:0}: Error finding container 51aa775ac39a39850cae5e970ca54ca1584cc57f0f0adfa5e2ee34537c9183c4: Status 404 returned error can't find the container with id 51aa775ac39a39850cae5e970ca54ca1584cc57f0f0adfa5e2ee34537c9183c4 Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.363899 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.365668 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.368283 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-vf7v2" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.369775 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.380126 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.380331 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.380438 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.381099 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.381180 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.382181 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.389869 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-kbblh"] Dec 05 20:59:32 crc kubenswrapper[4747]: W1205 20:59:32.411001 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffe619cc_656e_4967_8a21_aaea4560c00b.slice/crio-11d1809d33ade7b69410b2afb4a8ecc60b43b07ad949e510980dd28e34297002 WatchSource:0}: Error finding container 11d1809d33ade7b69410b2afb4a8ecc60b43b07ad949e510980dd28e34297002: Status 404 returned error can't find the container with id 11d1809d33ade7b69410b2afb4a8ecc60b43b07ad949e510980dd28e34297002 Dec 05 20:59:32 crc 
kubenswrapper[4747]: I1205 20:59:32.446946 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-859d485f47-h9ph7" event={"ID":"62c2f197-894b-471b-a5c1-968b7b951427","Type":"ContainerStarted","Data":"51aa775ac39a39850cae5e970ca54ca1584cc57f0f0adfa5e2ee34537c9183c4"} Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.451544 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.451649 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.451680 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70db507e-cc84-4722-8ac8-fd659c2803b8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.451699 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70db507e-cc84-4722-8ac8-fd659c2803b8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.451745 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.451785 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.451812 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.451832 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.451852 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnhth\" (UniqueName: \"kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-kube-api-access-nnhth\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.451929 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.452067 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.452289 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-kbblh" event={"ID":"ffe619cc-656e-4967-8a21-aaea4560c00b","Type":"ContainerStarted","Data":"11d1809d33ade7b69410b2afb4a8ecc60b43b07ad949e510980dd28e34297002"} Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.558505 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.558569 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70db507e-cc84-4722-8ac8-fd659c2803b8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.558613 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70db507e-cc84-4722-8ac8-fd659c2803b8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.558670 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.558702 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.558738 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.558761 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.558784 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnhth\" (UniqueName: \"kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-kube-api-access-nnhth\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.558807 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.558838 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.558867 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.561297 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.563455 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.564426 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.564767 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.565066 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.566810 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.570779 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70db507e-cc84-4722-8ac8-fd659c2803b8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.574827 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.584450 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.598165 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70db507e-cc84-4722-8ac8-fd659c2803b8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.613388 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnhth\" (UniqueName: \"kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-kube-api-access-nnhth\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.632124 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.686288 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.687533 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.691531 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.691835 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.694914 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.695088 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-j9h67" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.695244 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.695367 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.695734 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.702620 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.709380 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.764845 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.764899 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.764946 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d49bd09d-90af-4f00-a333-0e292c581525-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.765062 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.765111 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62dbq\" (UniqueName: \"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-kube-api-access-62dbq\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.765191 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.765317 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.765340 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.765361 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.765410 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.765454 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d49bd09d-90af-4f00-a333-0e292c581525-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.867193 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.867231 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.867247 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.867264 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.867289 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d49bd09d-90af-4f00-a333-0e292c581525-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.867313 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.867335 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.867373 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d49bd09d-90af-4f00-a333-0e292c581525-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.867389 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.867407 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62dbq\" (UniqueName: \"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-kube-api-access-62dbq\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.867431 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.868160 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.868924 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.869123 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.869922 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.870390 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.872021 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.885978 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d49bd09d-90af-4f00-a333-0e292c581525-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.887312 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d49bd09d-90af-4f00-a333-0e292c581525-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.887818 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.888057 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.896132 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62dbq\" (UniqueName: \"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-kube-api-access-62dbq\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:32 crc kubenswrapper[4747]: I1205 20:59:32.906084 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " pod="openstack/rabbitmq-server-0" Dec 05 20:59:33 
crc kubenswrapper[4747]: I1205 20:59:33.020322 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.295302 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.466001 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70db507e-cc84-4722-8ac8-fd659c2803b8","Type":"ContainerStarted","Data":"204d13c5b120dd8f33840db73ad85ea7769396cfb30a30e69cdda4aab0ccb346"} Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.558074 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 20:59:33 crc kubenswrapper[4747]: W1205 20:59:33.580930 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd49bd09d_90af_4f00_a333_0e292c581525.slice/crio-b9fd05f438e03f7aacb5322db9f9e8ef18159d42646955d6467db61ef333dc34 WatchSource:0}: Error finding container b9fd05f438e03f7aacb5322db9f9e8ef18159d42646955d6467db61ef333dc34: Status 404 returned error can't find the container with id b9fd05f438e03f7aacb5322db9f9e8ef18159d42646955d6467db61ef333dc34 Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.900346 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.902839 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.914494 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.919099 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.919336 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.923024 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-kf8mz" Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.923341 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.943500 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.990793 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.990848 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.990875 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/684a964a-1f18-4648-b8c7-6c8c818e16bd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.990903 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-kolla-config\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.990919 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684a964a-1f18-4648-b8c7-6c8c818e16bd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.990969 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvl2s\" (UniqueName: \"kubernetes.io/projected/684a964a-1f18-4648-b8c7-6c8c818e16bd-kube-api-access-wvl2s\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.991043 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-config-data-default\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:33 crc kubenswrapper[4747]: I1205 20:59:33.991071 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/684a964a-1f18-4648-b8c7-6c8c818e16bd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.094782 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-config-data-default\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.094841 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/684a964a-1f18-4648-b8c7-6c8c818e16bd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.094886 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.094916 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.094943 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/684a964a-1f18-4648-b8c7-6c8c818e16bd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.094973 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-kolla-config\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.094990 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684a964a-1f18-4648-b8c7-6c8c818e16bd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.095023 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvl2s\" (UniqueName: \"kubernetes.io/projected/684a964a-1f18-4648-b8c7-6c8c818e16bd-kube-api-access-wvl2s\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.096706 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-kolla-config\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.097076 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/684a964a-1f18-4648-b8c7-6c8c818e16bd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.097549 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-config-data-default\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.097768 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0" Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.098123 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" 
Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.109356 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/684a964a-1f18-4648-b8c7-6c8c818e16bd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0"
Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.122809 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684a964a-1f18-4648-b8c7-6c8c818e16bd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0"
Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.130905 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvl2s\" (UniqueName: \"kubernetes.io/projected/684a964a-1f18-4648-b8c7-6c8c818e16bd-kube-api-access-wvl2s\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0"
Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.156710 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " pod="openstack/openstack-galera-0"
Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.258965 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 05 20:59:34 crc kubenswrapper[4747]: I1205 20:59:34.482869 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d49bd09d-90af-4f00-a333-0e292c581525","Type":"ContainerStarted","Data":"b9fd05f438e03f7aacb5322db9f9e8ef18159d42646955d6467db61ef333dc34"}
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.149413 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.151135 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.155914 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.155942 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.156373 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-b588j"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.156570 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.157228 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.212715 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.212759 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.212809 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/53f0706f-507e-4360-b83e-9dbfacee3144-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.212836 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq5gw\" (UniqueName: \"kubernetes.io/projected/53f0706f-507e-4360-b83e-9dbfacee3144-kube-api-access-wq5gw\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.212899 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f0706f-507e-4360-b83e-9dbfacee3144-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.212915 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.212934 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f0706f-507e-4360-b83e-9dbfacee3144-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.212956 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.314705 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.314753 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f0706f-507e-4360-b83e-9dbfacee3144-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.314776 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f0706f-507e-4360-b83e-9dbfacee3144-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.314802 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.314839 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.314859 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.314884 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/53f0706f-507e-4360-b83e-9dbfacee3144-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.314911 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq5gw\" (UniqueName: \"kubernetes.io/projected/53f0706f-507e-4360-b83e-9dbfacee3144-kube-api-access-wq5gw\")
pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.316168 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.316896 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.317711 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.318229 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/53f0706f-507e-4360-b83e-9dbfacee3144-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.319829 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.322382 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f0706f-507e-4360-b83e-9dbfacee3144-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.327596 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f0706f-507e-4360-b83e-9dbfacee3144-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.333491 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq5gw\" (UniqueName: \"kubernetes.io/projected/53f0706f-507e-4360-b83e-9dbfacee3144-kube-api-access-wq5gw\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.350180 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " pod="openstack/openstack-cell1-galera-0" Dec 05 20:59:35 crc 
kubenswrapper[4747]: I1205 20:59:35.467490 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.594599 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.596346 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.600410 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.600709 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.608928 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-w79df" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.648575 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e0ca3b-083d-477b-a227-dc70e5204444-combined-ca-bundle\") pod \"memcached-0\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.648630 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7x9q\" (UniqueName: \"kubernetes.io/projected/03e0ca3b-083d-477b-a227-dc70e5204444-kube-api-access-p7x9q\") pod \"memcached-0\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.648685 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03e0ca3b-083d-477b-a227-dc70e5204444-kolla-config\") pod \"memcached-0\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.648699 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03e0ca3b-083d-477b-a227-dc70e5204444-config-data\") pod \"memcached-0\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.648738 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/03e0ca3b-083d-477b-a227-dc70e5204444-memcached-tls-certs\") pod \"memcached-0\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.651359 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.750267 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e0ca3b-083d-477b-a227-dc70e5204444-combined-ca-bundle\") pod \"memcached-0\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.750309 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7x9q\" 
(UniqueName: \"kubernetes.io/projected/03e0ca3b-083d-477b-a227-dc70e5204444-kube-api-access-p7x9q\") pod \"memcached-0\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.750359 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03e0ca3b-083d-477b-a227-dc70e5204444-kolla-config\") pod \"memcached-0\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.750375 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03e0ca3b-083d-477b-a227-dc70e5204444-config-data\") pod \"memcached-0\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.750413 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/03e0ca3b-083d-477b-a227-dc70e5204444-memcached-tls-certs\") pod \"memcached-0\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.751396 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03e0ca3b-083d-477b-a227-dc70e5204444-kolla-config\") pod \"memcached-0\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.751882 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03e0ca3b-083d-477b-a227-dc70e5204444-config-data\") pod \"memcached-0\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.758180 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e0ca3b-083d-477b-a227-dc70e5204444-combined-ca-bundle\") pod \"memcached-0\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.758298 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/03e0ca3b-083d-477b-a227-dc70e5204444-memcached-tls-certs\") pod \"memcached-0\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.767679 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7x9q\" (UniqueName: \"kubernetes.io/projected/03e0ca3b-083d-477b-a227-dc70e5204444-kube-api-access-p7x9q\") pod \"memcached-0\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " pod="openstack/memcached-0" Dec 05 20:59:35 crc kubenswrapper[4747]: I1205 20:59:35.978513 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 20:59:37 crc kubenswrapper[4747]: I1205 20:59:37.095525 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:59:37 crc kubenswrapper[4747]: I1205 20:59:37.096959 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 20:59:37 crc kubenswrapper[4747]: I1205 20:59:37.099007 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-4ccc9" Dec 05 20:59:37 crc kubenswrapper[4747]: I1205 20:59:37.105861 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:59:37 crc kubenswrapper[4747]: I1205 20:59:37.186235 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv789\" (UniqueName: \"kubernetes.io/projected/29db5b68-f488-4071-9794-a6634a0a301f-kube-api-access-nv789\") pod \"kube-state-metrics-0\" (UID: \"29db5b68-f488-4071-9794-a6634a0a301f\") " pod="openstack/kube-state-metrics-0" Dec 05 20:59:37 crc kubenswrapper[4747]: I1205 20:59:37.288023 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv789\" (UniqueName: \"kubernetes.io/projected/29db5b68-f488-4071-9794-a6634a0a301f-kube-api-access-nv789\") pod \"kube-state-metrics-0\" (UID: \"29db5b68-f488-4071-9794-a6634a0a301f\") " pod="openstack/kube-state-metrics-0" Dec 05 20:59:37 crc kubenswrapper[4747]: I1205 20:59:37.316761 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv789\" (UniqueName: \"kubernetes.io/projected/29db5b68-f488-4071-9794-a6634a0a301f-kube-api-access-nv789\") pod \"kube-state-metrics-0\" (UID: \"29db5b68-f488-4071-9794-a6634a0a301f\") " pod="openstack/kube-state-metrics-0" Dec 05 20:59:37 crc kubenswrapper[4747]: I1205 20:59:37.412521 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.778324 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-4d2dp"] Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.780371 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.783495 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-d6285" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.783555 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.786162 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ns2k6"] Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.787218 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.793313 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.814514 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4d2dp"] Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.821198 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ns2k6"] Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.859769 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-log-ovn\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.860056 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-lib\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.860111 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s9tt\" (UniqueName: \"kubernetes.io/projected/b37900c0-058e-4be6-9b81-c67d5f05b7a5-kube-api-access-6s9tt\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.860146 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-run\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.860188 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b37900c0-058e-4be6-9b81-c67d5f05b7a5-scripts\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.860212 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlhm8\" (UniqueName: \"kubernetes.io/projected/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-kube-api-access-nlhm8\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.860352 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-run\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.860482 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-log\") 
pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.860520 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-run-ovn\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.860564 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-scripts\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.860658 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-ovn-controller-tls-certs\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.860763 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-combined-ca-bundle\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.860856 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-etc-ovs\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.962998 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-ovn-controller-tls-certs\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.963083 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-combined-ca-bundle\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.963142 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-etc-ovs\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.963228 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-log-ovn\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " 
pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.963257 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-lib\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.963279 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s9tt\" (UniqueName: \"kubernetes.io/projected/b37900c0-058e-4be6-9b81-c67d5f05b7a5-kube-api-access-6s9tt\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.963327 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-run\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.963363 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b37900c0-058e-4be6-9b81-c67d5f05b7a5-scripts\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.963387 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlhm8\" (UniqueName: \"kubernetes.io/projected/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-kube-api-access-nlhm8\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.963476 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-run\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.963523 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-run-ovn\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.963543 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-scripts\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.963565 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-log\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.963980 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-lib\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.964088 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-log\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.964488 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-log-ovn\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.965371 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-run\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.965660 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-run-ovn\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.968073 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-scripts\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.968148 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-run\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.968456 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b37900c0-058e-4be6-9b81-c67d5f05b7a5-scripts\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.968517 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-etc-ovs\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.970420 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-ovn-controller-tls-certs\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.979520 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-combined-ca-bundle\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.990053 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s9tt\" (UniqueName: \"kubernetes.io/projected/b37900c0-058e-4be6-9b81-c67d5f05b7a5-kube-api-access-6s9tt\") pod \"ovn-controller-ovs-4d2dp\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:40 crc kubenswrapper[4747]: I1205 20:59:40.990055 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlhm8\" (UniqueName: \"kubernetes.io/projected/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-kube-api-access-nlhm8\") pod \"ovn-controller-ns2k6\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.109676 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.120380 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ns2k6" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.304308 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.305781 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.309029 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.309358 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.309497 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.309643 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-2z2mb" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.309750 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.318619 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.473690 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.473735 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 
20:59:41.473765 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d835978-9804-4427-9dd4-48c40ad829c5-config\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.473785 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d835978-9804-4427-9dd4-48c40ad829c5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.473798 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.473867 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d835978-9804-4427-9dd4-48c40ad829c5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.473887 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.473903 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tnbg\" (UniqueName: \"kubernetes.io/projected/6d835978-9804-4427-9dd4-48c40ad829c5-kube-api-access-9tnbg\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.575205 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d835978-9804-4427-9dd4-48c40ad829c5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.575267 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.575290 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tnbg\" (UniqueName: \"kubernetes.io/projected/6d835978-9804-4427-9dd4-48c40ad829c5-kube-api-access-9tnbg\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.575326 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.575357 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.575378 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d835978-9804-4427-9dd4-48c40ad829c5-config\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.575407 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d835978-9804-4427-9dd4-48c40ad829c5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.575424 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.576384 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.576398 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d835978-9804-4427-9dd4-48c40ad829c5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.576610 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d835978-9804-4427-9dd4-48c40ad829c5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.576832 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d835978-9804-4427-9dd4-48c40ad829c5-config\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.579808 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.579893 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.579989 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.594145 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tnbg\" (UniqueName: \"kubernetes.io/projected/6d835978-9804-4427-9dd4-48c40ad829c5-kube-api-access-9tnbg\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.611891 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:41 crc kubenswrapper[4747]: I1205 20:59:41.631875 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.553094 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.555901 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.561857 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-kg7tb" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.562011 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.561859 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.562137 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.567276 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.650889 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.650953 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cc30174-a5ca-454e-b049-59a62d358d8a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.650993 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.651010 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9cc30174-a5ca-454e-b049-59a62d358d8a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.651028 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.651072 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cc30174-a5ca-454e-b049-59a62d358d8a-config\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.651377 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhvtw\" (UniqueName: \"kubernetes.io/projected/9cc30174-a5ca-454e-b049-59a62d358d8a-kube-api-access-lhvtw\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " 
pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.651528 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.753224 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.753276 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cc30174-a5ca-454e-b049-59a62d358d8a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.753302 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.753317 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9cc30174-a5ca-454e-b049-59a62d358d8a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.753338 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.753370 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cc30174-a5ca-454e-b049-59a62d358d8a-config\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.753414 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhvtw\" (UniqueName: \"kubernetes.io/projected/9cc30174-a5ca-454e-b049-59a62d358d8a-kube-api-access-lhvtw\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.753456 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.753882 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.754306 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cc30174-a5ca-454e-b049-59a62d358d8a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.755087 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cc30174-a5ca-454e-b049-59a62d358d8a-config\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.755786 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9cc30174-a5ca-454e-b049-59a62d358d8a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.760364 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.761489 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.762726 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.774709 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhvtw\" (UniqueName: \"kubernetes.io/projected/9cc30174-a5ca-454e-b049-59a62d358d8a-kube-api-access-lhvtw\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.792367 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:44 crc kubenswrapper[4747]: I1205 20:59:44.876847 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 20:59:50 crc kubenswrapper[4747]: I1205 20:59:50.915785 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 20:59:52 crc kubenswrapper[4747]: E1205 20:59:52.254800 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 05 20:59:52 crc kubenswrapper[4747]: E1205 20:59:52.255267 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qgvjq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-567c455747-dxcxj_openstack(4b8b86f0-0f9d-45cc-9a6d-73b2704afe61): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:59:52 crc kubenswrapper[4747]: E1205 20:59:52.257122 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-567c455747-dxcxj" podUID="4b8b86f0-0f9d-45cc-9a6d-73b2704afe61" Dec 05 20:59:52 crc kubenswrapper[4747]: E1205 20:59:52.263131 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context 
canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 05 20:59:52 crc kubenswrapper[4747]: E1205 20:59:52.263298 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tnhww,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5cd484bb89-ts844_openstack(4b728720-52af-4c9a-b411-e64b9d0b8249): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:59:52 crc kubenswrapper[4747]: E1205 20:59:52.264645 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5cd484bb89-ts844" podUID="4b728720-52af-4c9a-b411-e64b9d0b8249" Dec 05 20:59:52 crc kubenswrapper[4747]: E1205 20:59:52.285637 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 05 20:59:52 crc kubenswrapper[4747]: E1205 20:59:52.285777 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts 
--keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbhkv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-cb666b895-kbblh_openstack(ffe619cc-656e-4967-8a21-aaea4560c00b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:59:52 crc kubenswrapper[4747]: E1205 20:59:52.286989 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-cb666b895-kbblh" podUID="ffe619cc-656e-4967-8a21-aaea4560c00b" Dec 05 20:59:52 crc kubenswrapper[4747]: E1205 20:59:52.306883 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 05 20:59:52 crc kubenswrapper[4747]: E1205 20:59:52.307052 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t58sx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-859d485f47-h9ph7_openstack(62c2f197-894b-471b-a5c1-968b7b951427): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:59:52 crc kubenswrapper[4747]: E1205 20:59:52.308310 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-859d485f47-h9ph7" podUID="62c2f197-894b-471b-a5c1-968b7b951427" Dec 05 20:59:52 crc kubenswrapper[4747]: I1205 20:59:52.637810 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"53f0706f-507e-4360-b83e-9dbfacee3144","Type":"ContainerStarted","Data":"9352a6f64d89b762559f344694b4b0d367d9deef68a621a56dcc9ba2bb269208"} Dec 05 20:59:52 crc kubenswrapper[4747]: E1205 20:59:52.639867 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792\\\"\"" pod="openstack/dnsmasq-dns-cb666b895-kbblh" podUID="ffe619cc-656e-4967-8a21-aaea4560c00b" Dec 05 20:59:52 crc kubenswrapper[4747]: E1205 20:59:52.639864 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792\\\"\"" pod="openstack/dnsmasq-dns-859d485f47-h9ph7" podUID="62c2f197-894b-471b-a5c1-968b7b951427" Dec 05 20:59:52 crc kubenswrapper[4747]: I1205 20:59:52.753038 4747 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ovn-controller-ns2k6"] Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.047711 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.054837 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.096865 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.179919 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 20:59:53 crc kubenswrapper[4747]: W1205 20:59:53.190322 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod684a964a_1f18_4648_b8c7_6c8c818e16bd.slice/crio-ab962482f76935b4502ee2790ca852a89f44724a1702720ed7d26e2bddab3487 WatchSource:0}: Error finding container ab962482f76935b4502ee2790ca852a89f44724a1702720ed7d26e2bddab3487: Status 404 returned error can't find the container with id ab962482f76935b4502ee2790ca852a89f44724a1702720ed7d26e2bddab3487 Dec 05 20:59:53 crc kubenswrapper[4747]: W1205 20:59:53.194882 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03e0ca3b_083d_477b_a227_dc70e5204444.slice/crio-2b2ea54234df653259ae3f1cf08da1e167803b15d41d767e0a0595025d6d609c WatchSource:0}: Error finding container 2b2ea54234df653259ae3f1cf08da1e167803b15d41d767e0a0595025d6d609c: Status 404 returned error can't find the container with id 2b2ea54234df653259ae3f1cf08da1e167803b15d41d767e0a0595025d6d609c Dec 05 20:59:53 crc kubenswrapper[4747]: W1205 20:59:53.196476 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d835978_9804_4427_9dd4_48c40ad829c5.slice/crio-bd28baee269e8d9a60f943abf8a78a62b371dd3c28051c376c73adbefc3a406d WatchSource:0}: Error finding container bd28baee269e8d9a60f943abf8a78a62b371dd3c28051c376c73adbefc3a406d: Status 404 returned error can't find the container with id bd28baee269e8d9a60f943abf8a78a62b371dd3c28051c376c73adbefc3a406d Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.282622 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4d2dp"] Dec 05 20:59:53 crc kubenswrapper[4747]: W1205 20:59:53.285465 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb37900c0_058e_4be6_9b81_c67d5f05b7a5.slice/crio-f900224ff5a45a57a4549589617ac9b7019147b59989ea6afbc7ae5b729b3a2c WatchSource:0}: Error finding container f900224ff5a45a57a4549589617ac9b7019147b59989ea6afbc7ae5b729b3a2c: Status 404 returned error can't find the container with id f900224ff5a45a57a4549589617ac9b7019147b59989ea6afbc7ae5b729b3a2c Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.457127 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-dxcxj" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.465060 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-ts844" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.517862 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgvjq\" (UniqueName: \"kubernetes.io/projected/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-kube-api-access-qgvjq\") pod \"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61\" (UID: \"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61\") " Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.517989 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-dns-svc\") pod \"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61\" (UID: \"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61\") " Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.518238 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-config\") pod \"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61\" (UID: \"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61\") " Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.518745 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4b8b86f0-0f9d-45cc-9a6d-73b2704afe61" (UID: "4b8b86f0-0f9d-45cc-9a6d-73b2704afe61"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.518754 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-config" (OuterVolumeSpecName: "config") pod "4b8b86f0-0f9d-45cc-9a6d-73b2704afe61" (UID: "4b8b86f0-0f9d-45cc-9a6d-73b2704afe61"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.580195 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-kube-api-access-qgvjq" (OuterVolumeSpecName: "kube-api-access-qgvjq") pod "4b8b86f0-0f9d-45cc-9a6d-73b2704afe61" (UID: "4b8b86f0-0f9d-45cc-9a6d-73b2704afe61"). InnerVolumeSpecName "kube-api-access-qgvjq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.619356 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnhww\" (UniqueName: \"kubernetes.io/projected/4b728720-52af-4c9a-b411-e64b9d0b8249-kube-api-access-tnhww\") pod \"4b728720-52af-4c9a-b411-e64b9d0b8249\" (UID: \"4b728720-52af-4c9a-b411-e64b9d0b8249\") " Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.619474 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b728720-52af-4c9a-b411-e64b9d0b8249-config\") pod \"4b728720-52af-4c9a-b411-e64b9d0b8249\" (UID: \"4b728720-52af-4c9a-b411-e64b9d0b8249\") " Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.619891 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgvjq\" (UniqueName: \"kubernetes.io/projected/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-kube-api-access-qgvjq\") on node \"crc\" DevicePath \"\"" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.619910 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.619921 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.620602 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b728720-52af-4c9a-b411-e64b9d0b8249-config" (OuterVolumeSpecName: "config") pod "4b728720-52af-4c9a-b411-e64b9d0b8249" (UID: "4b728720-52af-4c9a-b411-e64b9d0b8249"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.645300 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b728720-52af-4c9a-b411-e64b9d0b8249-kube-api-access-tnhww" (OuterVolumeSpecName: "kube-api-access-tnhww") pod "4b728720-52af-4c9a-b411-e64b9d0b8249" (UID: "4b728720-52af-4c9a-b411-e64b9d0b8249"). InnerVolumeSpecName "kube-api-access-tnhww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.645798 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-ts844" event={"ID":"4b728720-52af-4c9a-b411-e64b9d0b8249","Type":"ContainerDied","Data":"0f49f3a454653de1ebf85f24b7e08d87c8b4647e7a1b86ee622f50945c581c29"} Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.645844 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-ts844" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.647652 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-dxcxj" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.647668 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-dxcxj" event={"ID":"4b8b86f0-0f9d-45cc-9a6d-73b2704afe61","Type":"ContainerDied","Data":"2036b9b808827a4b879b8850829bdde9816935a7d45948926dc423cf1eef5133"} Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.648936 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6d835978-9804-4427-9dd4-48c40ad829c5","Type":"ContainerStarted","Data":"bd28baee269e8d9a60f943abf8a78a62b371dd3c28051c376c73adbefc3a406d"} Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.649811 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"684a964a-1f18-4648-b8c7-6c8c818e16bd","Type":"ContainerStarted","Data":"ab962482f76935b4502ee2790ca852a89f44724a1702720ed7d26e2bddab3487"} Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.650991 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4d2dp" event={"ID":"b37900c0-058e-4be6-9b81-c67d5f05b7a5","Type":"ContainerStarted","Data":"f900224ff5a45a57a4549589617ac9b7019147b59989ea6afbc7ae5b729b3a2c"} Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.652543 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"29db5b68-f488-4071-9794-a6634a0a301f","Type":"ContainerStarted","Data":"e1510622689c7f0e4277052337b1437057484928c8596d03b7e27789695a709c"} Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.654924 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ns2k6" event={"ID":"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4","Type":"ContainerStarted","Data":"d54e16885314b407a47de012ca616caef1edcd1f5b8b60a9f6e5eb197deb1d7a"} Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.656801 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70db507e-cc84-4722-8ac8-fd659c2803b8","Type":"ContainerStarted","Data":"c547e1ccdcbdea64d31b3fa0d8e66a77f1b2d50aee1f27b3167cf9f22a5e018f"} Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.658873 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"03e0ca3b-083d-477b-a227-dc70e5204444","Type":"ContainerStarted","Data":"2b2ea54234df653259ae3f1cf08da1e167803b15d41d767e0a0595025d6d609c"} Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.724997 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnhww\" (UniqueName: \"kubernetes.io/projected/4b728720-52af-4c9a-b411-e64b9d0b8249-kube-api-access-tnhww\") on node \"crc\" DevicePath \"\"" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.726328 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b728720-52af-4c9a-b411-e64b9d0b8249-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.729066 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-ts844"] Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.734414 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-ts844"] Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.760283 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-dxcxj"] Dec 
05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.765568 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-567c455747-dxcxj"] Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.851210 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b728720-52af-4c9a-b411-e64b9d0b8249" path="/var/lib/kubelet/pods/4b728720-52af-4c9a-b411-e64b9d0b8249/volumes" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.851548 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b8b86f0-0f9d-45cc-9a6d-73b2704afe61" path="/var/lib/kubelet/pods/4b8b86f0-0f9d-45cc-9a6d-73b2704afe61/volumes" Dec 05 20:59:53 crc kubenswrapper[4747]: I1205 20:59:53.921244 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 20:59:54 crc kubenswrapper[4747]: I1205 20:59:54.675086 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d49bd09d-90af-4f00-a333-0e292c581525","Type":"ContainerStarted","Data":"2a95ab7afe7b5d3fb966c635f9f0ead6c5e7cacb8f8ffa5421f5a4bb80ec9ec3"} Dec 05 20:59:54 crc kubenswrapper[4747]: I1205 20:59:54.676380 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9cc30174-a5ca-454e-b049-59a62d358d8a","Type":"ContainerStarted","Data":"0b7f872b85ae4681314c080e585d2933ebe9cf9bf437dd3d7c82642455a65b7b"} Dec 05 20:59:59 crc kubenswrapper[4747]: I1205 20:59:59.718109 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"53f0706f-507e-4360-b83e-9dbfacee3144","Type":"ContainerStarted","Data":"a88a3625221a5fdb83dd406f3af85bff0a37692424ce8b21edd35f8c6077f232"} Dec 05 20:59:59 crc kubenswrapper[4747]: I1205 20:59:59.723992 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9cc30174-a5ca-454e-b049-59a62d358d8a","Type":"ContainerStarted","Data":"0e2fa2cf0521e9e0a9ee440e9496bbd60e565938c9bcb968eb76b83f8a5212ad"} Dec 05 20:59:59 crc kubenswrapper[4747]: I1205 20:59:59.725234 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"03e0ca3b-083d-477b-a227-dc70e5204444","Type":"ContainerStarted","Data":"2cb80132fac67dd10f92c4e06ec0d58fbd16b1a14243d5a1da9696abdd3d83b7"} Dec 05 20:59:59 crc kubenswrapper[4747]: I1205 20:59:59.725704 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 05 20:59:59 crc kubenswrapper[4747]: I1205 20:59:59.726575 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6d835978-9804-4427-9dd4-48c40ad829c5","Type":"ContainerStarted","Data":"57da84d46bdfb0f78fea3ada748a543f9123b6b1b9676bdaef82d4fc2586a305"} Dec 05 20:59:59 crc kubenswrapper[4747]: I1205 20:59:59.730634 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"684a964a-1f18-4648-b8c7-6c8c818e16bd","Type":"ContainerStarted","Data":"cee35f79d0be8b610559ca6a26cbab40df575f856346a5dfaf22f8145550fdc3"} Dec 05 20:59:59 crc kubenswrapper[4747]: I1205 20:59:59.736553 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4d2dp" event={"ID":"b37900c0-058e-4be6-9b81-c67d5f05b7a5","Type":"ContainerStarted","Data":"a49cc62b0001176f106cecb1bb2d14cea929220be3afa1a039e1dab6d767683e"} Dec 05 20:59:59 crc kubenswrapper[4747]: I1205 20:59:59.741945 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"29db5b68-f488-4071-9794-a6634a0a301f","Type":"ContainerStarted","Data":"193b092b60f5335687151a89c75b1069ddaf792e8a326c4e3a9527fe7792c6fd"} Dec 05 20:59:59 crc kubenswrapper[4747]: I1205 20:59:59.742548 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 20:59:59 crc kubenswrapper[4747]: I1205 20:59:59.782845 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.233754929 podStartE2EDuration="24.782830447s" podCreationTimestamp="2025-12-05 20:59:35 +0000 UTC" firstStartedPulling="2025-12-05 20:59:53.197485604 +0000 UTC m=+1063.664793092" lastFinishedPulling="2025-12-05 20:59:58.746561122 +0000 UTC m=+1069.213868610" observedRunningTime="2025-12-05 20:59:59.777204597 +0000 UTC m=+1070.244512085" watchObservedRunningTime="2025-12-05 20:59:59.782830447 +0000 UTC m=+1070.250137935" Dec 05 20:59:59 crc kubenswrapper[4747]: I1205 20:59:59.814172 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.557351494 podStartE2EDuration="22.814155505s" podCreationTimestamp="2025-12-05 20:59:37 +0000 UTC" firstStartedPulling="2025-12-05 20:59:53.187906966 +0000 UTC m=+1063.655214464" lastFinishedPulling="2025-12-05 20:59:59.444710987 +0000 UTC m=+1069.912018475" observedRunningTime="2025-12-05 20:59:59.812308779 +0000 UTC m=+1070.279616267" watchObservedRunningTime="2025-12-05 20:59:59.814155505 +0000 UTC m=+1070.281462993" Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.163733 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4"] Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.165362 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.169287 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.169609 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.176339 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4"] Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.255440 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hknjh\" (UniqueName: \"kubernetes.io/projected/a040d587-9981-428f-baed-4ad3d3b2ee55-kube-api-access-hknjh\") pod \"collect-profiles-29416140-bksd4\" (UID: \"a040d587-9981-428f-baed-4ad3d3b2ee55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.255549 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a040d587-9981-428f-baed-4ad3d3b2ee55-secret-volume\") pod \"collect-profiles-29416140-bksd4\" (UID: \"a040d587-9981-428f-baed-4ad3d3b2ee55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.255591 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a040d587-9981-428f-baed-4ad3d3b2ee55-config-volume\") pod \"collect-profiles-29416140-bksd4\" (UID: \"a040d587-9981-428f-baed-4ad3d3b2ee55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.357445 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a040d587-9981-428f-baed-4ad3d3b2ee55-secret-volume\") pod \"collect-profiles-29416140-bksd4\" (UID: \"a040d587-9981-428f-baed-4ad3d3b2ee55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.357493 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a040d587-9981-428f-baed-4ad3d3b2ee55-config-volume\") pod \"collect-profiles-29416140-bksd4\" (UID: \"a040d587-9981-428f-baed-4ad3d3b2ee55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.357572 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hknjh\" (UniqueName: \"kubernetes.io/projected/a040d587-9981-428f-baed-4ad3d3b2ee55-kube-api-access-hknjh\") pod \"collect-profiles-29416140-bksd4\" (UID: \"a040d587-9981-428f-baed-4ad3d3b2ee55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.358572 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a040d587-9981-428f-baed-4ad3d3b2ee55-config-volume\") pod 
\"collect-profiles-29416140-bksd4\" (UID: \"a040d587-9981-428f-baed-4ad3d3b2ee55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.367395 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a040d587-9981-428f-baed-4ad3d3b2ee55-secret-volume\") pod \"collect-profiles-29416140-bksd4\" (UID: \"a040d587-9981-428f-baed-4ad3d3b2ee55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.374964 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hknjh\" (UniqueName: \"kubernetes.io/projected/a040d587-9981-428f-baed-4ad3d3b2ee55-kube-api-access-hknjh\") pod \"collect-profiles-29416140-bksd4\" (UID: \"a040d587-9981-428f-baed-4ad3d3b2ee55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.491766 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.753448 4747 generic.go:334] "Generic (PLEG): container finished" podID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerID="a49cc62b0001176f106cecb1bb2d14cea929220be3afa1a039e1dab6d767683e" exitCode=0 Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.753608 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4d2dp" event={"ID":"b37900c0-058e-4be6-9b81-c67d5f05b7a5","Type":"ContainerDied","Data":"a49cc62b0001176f106cecb1bb2d14cea929220be3afa1a039e1dab6d767683e"} Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.759857 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ns2k6" event={"ID":"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4","Type":"ContainerStarted","Data":"0e6d2d436da5eb58728480f027b6f9382e5a4e2d75ab50a311d5115475bf0827"} Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.795822 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ns2k6" podStartSLOduration=14.887170281 podStartE2EDuration="20.795804342s" podCreationTimestamp="2025-12-05 20:59:40 +0000 UTC" firstStartedPulling="2025-12-05 20:59:52.759645777 +0000 UTC m=+1063.226953265" lastFinishedPulling="2025-12-05 20:59:58.668279838 +0000 UTC m=+1069.135587326" observedRunningTime="2025-12-05 21:00:00.790168292 +0000 UTC m=+1071.257475770" watchObservedRunningTime="2025-12-05 21:00:00.795804342 +0000 UTC m=+1071.263111830" Dec 05 21:00:00 crc kubenswrapper[4747]: I1205 21:00:00.934670 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4"] Dec 05 21:00:01 crc kubenswrapper[4747]: I1205 21:00:01.120716 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ns2k6" Dec 05 21:00:01 crc kubenswrapper[4747]: I1205 21:00:01.770519 4747 generic.go:334] "Generic (PLEG): container finished" podID="a040d587-9981-428f-baed-4ad3d3b2ee55" containerID="1b5b4f35f9829a20a7db089517598a56b6654ddd04f79022b9d16e99739c9335" exitCode=0 Dec 05 21:00:01 crc kubenswrapper[4747]: I1205 21:00:01.771013 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" 
event={"ID":"a040d587-9981-428f-baed-4ad3d3b2ee55","Type":"ContainerDied","Data":"1b5b4f35f9829a20a7db089517598a56b6654ddd04f79022b9d16e99739c9335"} Dec 05 21:00:01 crc kubenswrapper[4747]: I1205 21:00:01.771043 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" event={"ID":"a040d587-9981-428f-baed-4ad3d3b2ee55","Type":"ContainerStarted","Data":"383f0b882678eca3c4f8921ad609cc3fe5d18b8f6d4dfcfa71db8fdd276acbe9"} Dec 05 21:00:01 crc kubenswrapper[4747]: I1205 21:00:01.774371 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4d2dp" event={"ID":"b37900c0-058e-4be6-9b81-c67d5f05b7a5","Type":"ContainerStarted","Data":"5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6"} Dec 05 21:00:02 crc kubenswrapper[4747]: I1205 21:00:02.786412 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9cc30174-a5ca-454e-b049-59a62d358d8a","Type":"ContainerStarted","Data":"c2b5ad7c71ac902cf882c6a823d0b022b018b3f0655877bb3292fa03d68e3bae"} Dec 05 21:00:02 crc kubenswrapper[4747]: I1205 21:00:02.789522 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6d835978-9804-4427-9dd4-48c40ad829c5","Type":"ContainerStarted","Data":"63ad63b88143db69d914307efba21c320e1a83fe77f4e07c2a03bf733e63c430"} Dec 05 21:00:02 crc kubenswrapper[4747]: I1205 21:00:02.816946 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.461203056 podStartE2EDuration="22.816926104s" podCreationTimestamp="2025-12-05 20:59:40 +0000 UTC" firstStartedPulling="2025-12-05 20:59:53.20094593 +0000 UTC m=+1063.668253418" lastFinishedPulling="2025-12-05 21:00:02.556668978 +0000 UTC m=+1073.023976466" observedRunningTime="2025-12-05 21:00:02.809204932 +0000 UTC m=+1073.276512420" watchObservedRunningTime="2025-12-05 21:00:02.816926104 +0000 UTC m=+1073.284233592" Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.099307 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.211641 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a040d587-9981-428f-baed-4ad3d3b2ee55-config-volume\") pod \"a040d587-9981-428f-baed-4ad3d3b2ee55\" (UID: \"a040d587-9981-428f-baed-4ad3d3b2ee55\") " Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.211743 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a040d587-9981-428f-baed-4ad3d3b2ee55-secret-volume\") pod \"a040d587-9981-428f-baed-4ad3d3b2ee55\" (UID: \"a040d587-9981-428f-baed-4ad3d3b2ee55\") " Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.211764 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hknjh\" (UniqueName: \"kubernetes.io/projected/a040d587-9981-428f-baed-4ad3d3b2ee55-kube-api-access-hknjh\") pod \"a040d587-9981-428f-baed-4ad3d3b2ee55\" (UID: \"a040d587-9981-428f-baed-4ad3d3b2ee55\") " Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.212526 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a040d587-9981-428f-baed-4ad3d3b2ee55-config-volume" (OuterVolumeSpecName: "config-volume") pod "a040d587-9981-428f-baed-4ad3d3b2ee55" (UID: "a040d587-9981-428f-baed-4ad3d3b2ee55"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.216761 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a040d587-9981-428f-baed-4ad3d3b2ee55-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a040d587-9981-428f-baed-4ad3d3b2ee55" (UID: "a040d587-9981-428f-baed-4ad3d3b2ee55"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.216902 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a040d587-9981-428f-baed-4ad3d3b2ee55-kube-api-access-hknjh" (OuterVolumeSpecName: "kube-api-access-hknjh") pod "a040d587-9981-428f-baed-4ad3d3b2ee55" (UID: "a040d587-9981-428f-baed-4ad3d3b2ee55"). InnerVolumeSpecName "kube-api-access-hknjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.313981 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a040d587-9981-428f-baed-4ad3d3b2ee55-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.314015 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hknjh\" (UniqueName: \"kubernetes.io/projected/a040d587-9981-428f-baed-4ad3d3b2ee55-kube-api-access-hknjh\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.314024 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a040d587-9981-428f-baed-4ad3d3b2ee55-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.804007 4747 generic.go:334] "Generic (PLEG): container finished" podID="684a964a-1f18-4648-b8c7-6c8c818e16bd" containerID="cee35f79d0be8b610559ca6a26cbab40df575f856346a5dfaf22f8145550fdc3" exitCode=0 Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.804137 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"684a964a-1f18-4648-b8c7-6c8c818e16bd","Type":"ContainerDied","Data":"cee35f79d0be8b610559ca6a26cbab40df575f856346a5dfaf22f8145550fdc3"} Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.808151 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4d2dp" event={"ID":"b37900c0-058e-4be6-9b81-c67d5f05b7a5","Type":"ContainerStarted","Data":"ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8"} Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.808244 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.808325 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.809973 4747 generic.go:334] "Generic (PLEG): container finished" podID="53f0706f-507e-4360-b83e-9dbfacee3144" containerID="a88a3625221a5fdb83dd406f3af85bff0a37692424ce8b21edd35f8c6077f232" exitCode=0 Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.810037 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"53f0706f-507e-4360-b83e-9dbfacee3144","Type":"ContainerDied","Data":"a88a3625221a5fdb83dd406f3af85bff0a37692424ce8b21edd35f8c6077f232"} Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.811935 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" event={"ID":"a040d587-9981-428f-baed-4ad3d3b2ee55","Type":"ContainerDied","Data":"383f0b882678eca3c4f8921ad609cc3fe5d18b8f6d4dfcfa71db8fdd276acbe9"} Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.811998 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="383f0b882678eca3c4f8921ad609cc3fe5d18b8f6d4dfcfa71db8fdd276acbe9" Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.811953 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.870488 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.320805274 podStartE2EDuration="20.870471127s" podCreationTimestamp="2025-12-05 20:59:43 +0000 UTC" firstStartedPulling="2025-12-05 20:59:53.988857134 +0000 UTC m=+1064.456164622" lastFinishedPulling="2025-12-05 21:00:02.538522987 +0000 UTC m=+1073.005830475" observedRunningTime="2025-12-05 21:00:03.846395319 +0000 UTC m=+1074.313702807" watchObservedRunningTime="2025-12-05 21:00:03.870471127 +0000 UTC m=+1074.337778615" Dec 05 21:00:03 crc kubenswrapper[4747]: I1205 21:00:03.895900 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-4d2dp" podStartSLOduration=18.578345283 podStartE2EDuration="23.895857108s" podCreationTimestamp="2025-12-05 20:59:40 +0000 UTC" firstStartedPulling="2025-12-05 20:59:53.28786915 +0000 UTC m=+1063.755176638" lastFinishedPulling="2025-12-05 20:59:58.605380965 +0000 UTC m=+1069.072688463" observedRunningTime="2025-12-05 21:00:03.891394487 +0000 UTC m=+1074.358701995" watchObservedRunningTime="2025-12-05 21:00:03.895857108 +0000 UTC m=+1074.363164596" Dec 05 21:00:04 crc kubenswrapper[4747]: I1205 21:00:04.824493 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"53f0706f-507e-4360-b83e-9dbfacee3144","Type":"ContainerStarted","Data":"911dc2af5532c982dccb86d6364ac90a2fcc902c0d91f743b41d97c4cf784943"} Dec 05 21:00:04 crc kubenswrapper[4747]: I1205 21:00:04.828087 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"684a964a-1f18-4648-b8c7-6c8c818e16bd","Type":"ContainerStarted","Data":"49c9b5afbbe1d53dba130d5c1b62256f8de83335c25de43c8361be8a01f7d421"} Dec 05 21:00:04 crc kubenswrapper[4747]: I1205 21:00:04.850076 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.36694157 podStartE2EDuration="30.850055243s" podCreationTimestamp="2025-12-05 20:59:34 +0000 UTC" firstStartedPulling="2025-12-05 20:59:52.263511971 +0000 UTC m=+1062.730819449" lastFinishedPulling="2025-12-05 20:59:58.746625634 +0000 UTC m=+1069.213933122" observedRunningTime="2025-12-05 21:00:04.846497855 +0000 UTC m=+1075.313805363" watchObservedRunningTime="2025-12-05 21:00:04.850055243 +0000 UTC m=+1075.317362751" Dec 05 21:00:04 crc kubenswrapper[4747]: I1205 21:00:04.873658 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.714423893 podStartE2EDuration="32.873636809s" podCreationTimestamp="2025-12-05 20:59:32 +0000 UTC" firstStartedPulling="2025-12-05 20:59:53.192008428 +0000 UTC m=+1063.659315916" lastFinishedPulling="2025-12-05 20:59:59.351221354 +0000 UTC m=+1069.818528832" observedRunningTime="2025-12-05 21:00:04.868241695 +0000 UTC m=+1075.335549193" watchObservedRunningTime="2025-12-05 21:00:04.873636809 +0000 UTC m=+1075.340944307" Dec 05 21:00:04 crc kubenswrapper[4747]: I1205 21:00:04.878141 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 05 21:00:05 crc kubenswrapper[4747]: I1205 21:00:05.468967 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 05 
21:00:05 crc kubenswrapper[4747]: I1205 21:00:05.469016 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 05 21:00:05 crc kubenswrapper[4747]: I1205 21:00:05.633004 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 05 21:00:05 crc kubenswrapper[4747]: I1205 21:00:05.698724 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 05 21:00:05 crc kubenswrapper[4747]: I1205 21:00:05.838125 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 05 21:00:05 crc kubenswrapper[4747]: I1205 21:00:05.877734 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 05 21:00:05 crc kubenswrapper[4747]: I1205 21:00:05.893610 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 05 21:00:05 crc kubenswrapper[4747]: I1205 21:00:05.932234 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 05 21:00:05 crc kubenswrapper[4747]: I1205 21:00:05.979738 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.097266 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-859d485f47-h9ph7"] Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.139655 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-ns7n8"] Dec 05 21:00:06 crc kubenswrapper[4747]: E1205 21:00:06.140310 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a040d587-9981-428f-baed-4ad3d3b2ee55" containerName="collect-profiles" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.140432 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a040d587-9981-428f-baed-4ad3d3b2ee55" containerName="collect-profiles" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.140729 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a040d587-9981-428f-baed-4ad3d3b2ee55" containerName="collect-profiles" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.141842 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.145043 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.161701 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-ns7n8"] Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.215412 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-5jh4k"] Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.216570 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.220541 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.224784 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5jh4k"] Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.275988 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-ovsdbserver-nb\") pod \"dnsmasq-dns-57db9b5bc9-ns7n8\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.276047 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dzmm\" (UniqueName: \"kubernetes.io/projected/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-kube-api-access-5dzmm\") pod \"dnsmasq-dns-57db9b5bc9-ns7n8\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.276155 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-dns-svc\") pod \"dnsmasq-dns-57db9b5bc9-ns7n8\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.276228 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-config\") pod \"dnsmasq-dns-57db9b5bc9-ns7n8\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.378012 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-ovn-rundir\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.378333 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnzrm\" (UniqueName: \"kubernetes.io/projected/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-kube-api-access-rnzrm\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.378373 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-config\") pod \"dnsmasq-dns-57db9b5bc9-ns7n8\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.378392 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-ovs-rundir\") pod \"ovn-controller-metrics-5jh4k\" (UID: 
\"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.378423 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-combined-ca-bundle\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.378447 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-ovsdbserver-nb\") pod \"dnsmasq-dns-57db9b5bc9-ns7n8\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.378466 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-config\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.378486 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dzmm\" (UniqueName: \"kubernetes.io/projected/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-kube-api-access-5dzmm\") pod \"dnsmasq-dns-57db9b5bc9-ns7n8\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.378501 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.378560 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-dns-svc\") pod \"dnsmasq-dns-57db9b5bc9-ns7n8\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.379349 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-dns-svc\") pod \"dnsmasq-dns-57db9b5bc9-ns7n8\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.379467 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-config\") pod \"dnsmasq-dns-57db9b5bc9-ns7n8\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.379921 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-ovsdbserver-nb\") pod \"dnsmasq-dns-57db9b5bc9-ns7n8\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " 
pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.410100 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dzmm\" (UniqueName: \"kubernetes.io/projected/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-kube-api-access-5dzmm\") pod \"dnsmasq-dns-57db9b5bc9-ns7n8\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.468852 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.480373 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-ovn-rundir\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.480424 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnzrm\" (UniqueName: \"kubernetes.io/projected/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-kube-api-access-rnzrm\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.480459 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-ovs-rundir\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.480490 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-combined-ca-bundle\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.480513 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-config\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.480534 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.480956 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-ovn-rundir\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.481008 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-ovs-rundir\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.481444 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-config\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.484333 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.487932 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-combined-ca-bundle\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.503160 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnzrm\" (UniqueName: \"kubernetes.io/projected/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-kube-api-access-rnzrm\") pod \"ovn-controller-metrics-5jh4k\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.532348 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.566564 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-kbblh"] Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.602432 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-fdjgh"] Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.611992 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.619698 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-fdjgh"] Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.619936 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.788367 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-ovsdbserver-sb\") pod \"dnsmasq-dns-db7757ddc-fdjgh\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.788446 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-config\") pod \"dnsmasq-dns-db7757ddc-fdjgh\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.788484 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-dns-svc\") pod \"dnsmasq-dns-db7757ddc-fdjgh\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.788501 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzzlr\" (UniqueName: \"kubernetes.io/projected/54d24deb-7268-4bb5-818e-8cf514cbe5b1-kube-api-access-mzzlr\") pod \"dnsmasq-dns-db7757ddc-fdjgh\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.788532 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-ovsdbserver-nb\") pod \"dnsmasq-dns-db7757ddc-fdjgh\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.849516 4747 generic.go:334] "Generic (PLEG): container finished" podID="62c2f197-894b-471b-a5c1-968b7b951427" containerID="215c96cffea4155a12f096a5fe2ca80e316fd9ba43904fa2725bf11432c97fa5" exitCode=0 Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.850631 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-859d485f47-h9ph7" event={"ID":"62c2f197-894b-471b-a5c1-968b7b951427","Type":"ContainerDied","Data":"215c96cffea4155a12f096a5fe2ca80e316fd9ba43904fa2725bf11432c97fa5"} Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.890403 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-ovsdbserver-sb\") pod \"dnsmasq-dns-db7757ddc-fdjgh\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.890488 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-config\") pod \"dnsmasq-dns-db7757ddc-fdjgh\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.890531 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-dns-svc\") pod \"dnsmasq-dns-db7757ddc-fdjgh\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.890556 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzzlr\" (UniqueName: \"kubernetes.io/projected/54d24deb-7268-4bb5-818e-8cf514cbe5b1-kube-api-access-mzzlr\") pod \"dnsmasq-dns-db7757ddc-fdjgh\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.890610 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-ovsdbserver-nb\") pod \"dnsmasq-dns-db7757ddc-fdjgh\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.891200 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-ovsdbserver-sb\") pod \"dnsmasq-dns-db7757ddc-fdjgh\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.891636 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-ovsdbserver-nb\") pod \"dnsmasq-dns-db7757ddc-fdjgh\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.891726 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-dns-svc\") pod \"dnsmasq-dns-db7757ddc-fdjgh\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.892243 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-config\") pod \"dnsmasq-dns-db7757ddc-fdjgh\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.895341 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.910699 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzzlr\" (UniqueName: \"kubernetes.io/projected/54d24deb-7268-4bb5-818e-8cf514cbe5b1-kube-api-access-mzzlr\") pod \"dnsmasq-dns-db7757ddc-fdjgh\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.941098 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-kbblh" Dec 05 21:00:06 crc kubenswrapper[4747]: I1205 21:00:06.946504 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.067626 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.083005 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.097910 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffe619cc-656e-4967-8a21-aaea4560c00b-dns-svc\") pod \"ffe619cc-656e-4967-8a21-aaea4560c00b\" (UID: \"ffe619cc-656e-4967-8a21-aaea4560c00b\") " Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.098096 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe619cc-656e-4967-8a21-aaea4560c00b-config\") pod \"ffe619cc-656e-4967-8a21-aaea4560c00b\" (UID: \"ffe619cc-656e-4967-8a21-aaea4560c00b\") " Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.098182 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbhkv\" (UniqueName: \"kubernetes.io/projected/ffe619cc-656e-4967-8a21-aaea4560c00b-kube-api-access-mbhkv\") pod \"ffe619cc-656e-4967-8a21-aaea4560c00b\" (UID: \"ffe619cc-656e-4967-8a21-aaea4560c00b\") " Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.104402 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-ns7n8"] Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.106294 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.106550 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffe619cc-656e-4967-8a21-aaea4560c00b-config" (OuterVolumeSpecName: "config") pod "ffe619cc-656e-4967-8a21-aaea4560c00b" (UID: "ffe619cc-656e-4967-8a21-aaea4560c00b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.106733 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.106840 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.106995 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-g27xf" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.108922 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffe619cc-656e-4967-8a21-aaea4560c00b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ffe619cc-656e-4967-8a21-aaea4560c00b" (UID: "ffe619cc-656e-4967-8a21-aaea4560c00b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.134975 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffe619cc-656e-4967-8a21-aaea4560c00b-kube-api-access-mbhkv" (OuterVolumeSpecName: "kube-api-access-mbhkv") pod "ffe619cc-656e-4967-8a21-aaea4560c00b" (UID: "ffe619cc-656e-4967-8a21-aaea4560c00b"). InnerVolumeSpecName "kube-api-access-mbhkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.148694 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.200566 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.200628 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.200647 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.200674 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-config\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.200720 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ppsc\" (UniqueName: \"kubernetes.io/projected/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-kube-api-access-8ppsc\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.200756 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-scripts\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.200775 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.200830 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffe619cc-656e-4967-8a21-aaea4560c00b-dns-svc\") on node \"crc\" DevicePath 
\"\"" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.200841 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe619cc-656e-4967-8a21-aaea4560c00b-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.200849 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbhkv\" (UniqueName: \"kubernetes.io/projected/ffe619cc-656e-4967-8a21-aaea4560c00b-kube-api-access-mbhkv\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.240218 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5jh4k"] Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.302425 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.302674 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.302697 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.302726 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-config\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.302773 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ppsc\" (UniqueName: \"kubernetes.io/projected/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-kube-api-access-8ppsc\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.302808 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-scripts\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.302824 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.307861 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 
05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.308014 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-config\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.308691 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.312606 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.314447 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-scripts\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.318879 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.325183 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-859d485f47-h9ph7" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.340392 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ppsc\" (UniqueName: \"kubernetes.io/projected/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-kube-api-access-8ppsc\") pod \"ovn-northd-0\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.409924 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c2f197-894b-471b-a5c1-968b7b951427-config\") pod \"62c2f197-894b-471b-a5c1-968b7b951427\" (UID: \"62c2f197-894b-471b-a5c1-968b7b951427\") " Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.410007 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t58sx\" (UniqueName: \"kubernetes.io/projected/62c2f197-894b-471b-a5c1-968b7b951427-kube-api-access-t58sx\") pod \"62c2f197-894b-471b-a5c1-968b7b951427\" (UID: \"62c2f197-894b-471b-a5c1-968b7b951427\") " Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.410028 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62c2f197-894b-471b-a5c1-968b7b951427-dns-svc\") pod \"62c2f197-894b-471b-a5c1-968b7b951427\" (UID: \"62c2f197-894b-471b-a5c1-968b7b951427\") " Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.445752 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c2f197-894b-471b-a5c1-968b7b951427-kube-api-access-t58sx" (OuterVolumeSpecName: "kube-api-access-t58sx") pod "62c2f197-894b-471b-a5c1-968b7b951427" (UID: "62c2f197-894b-471b-a5c1-968b7b951427"). InnerVolumeSpecName "kube-api-access-t58sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.449599 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.450773 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.452715 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c2f197-894b-471b-a5c1-968b7b951427-config" (OuterVolumeSpecName: "config") pod "62c2f197-894b-471b-a5c1-968b7b951427" (UID: "62c2f197-894b-471b-a5c1-968b7b951427"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.485865 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-ns7n8"] Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.535736 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c2f197-894b-471b-a5c1-968b7b951427-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.535774 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t58sx\" (UniqueName: \"kubernetes.io/projected/62c2f197-894b-471b-a5c1-968b7b951427-kube-api-access-t58sx\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.538100 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c2f197-894b-471b-a5c1-968b7b951427-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62c2f197-894b-471b-a5c1-968b7b951427" (UID: "62c2f197-894b-471b-a5c1-968b7b951427"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.561654 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-4bczm"] Dec 05 21:00:07 crc kubenswrapper[4747]: E1205 21:00:07.562047 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c2f197-894b-471b-a5c1-968b7b951427" containerName="init" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.562062 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c2f197-894b-471b-a5c1-968b7b951427" containerName="init" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.562483 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c2f197-894b-471b-a5c1-968b7b951427" containerName="init" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.563570 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.597236 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-4bczm"] Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.623252 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-fdjgh"] Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.637769 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62c2f197-894b-471b-a5c1-968b7b951427-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.738806 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-config\") pod \"dnsmasq-dns-59d5fbdd8c-4bczm\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.739134 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5fbdd8c-4bczm\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.739179 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5fbdd8c-4bczm\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.739247 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrmlw\" (UniqueName: \"kubernetes.io/projected/500ba135-ea9d-42ba-adb5-487f1646368f-kube-api-access-nrmlw\") pod \"dnsmasq-dns-59d5fbdd8c-4bczm\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.739355 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-dns-svc\") pod \"dnsmasq-dns-59d5fbdd8c-4bczm\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.840605 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-dns-svc\") pod \"dnsmasq-dns-59d5fbdd8c-4bczm\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.840664 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-config\") pod \"dnsmasq-dns-59d5fbdd8c-4bczm\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.840701 4747 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5fbdd8c-4bczm\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.840732 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5fbdd8c-4bczm\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.840759 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrmlw\" (UniqueName: \"kubernetes.io/projected/500ba135-ea9d-42ba-adb5-487f1646368f-kube-api-access-nrmlw\") pod \"dnsmasq-dns-59d5fbdd8c-4bczm\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.841519 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-dns-svc\") pod \"dnsmasq-dns-59d5fbdd8c-4bczm\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.841836 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-config\") pod \"dnsmasq-dns-59d5fbdd8c-4bczm\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.842128 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5fbdd8c-4bczm\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.842850 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5fbdd8c-4bczm\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.859369 4747 generic.go:334] "Generic (PLEG): container finished" podID="54d24deb-7268-4bb5-818e-8cf514cbe5b1" containerID="8444f3eb98b20ac89c552bc1d8ed3e9fd5333f52557336fcbe7d37fe2218377b" exitCode=0 Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.859443 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" event={"ID":"54d24deb-7268-4bb5-818e-8cf514cbe5b1","Type":"ContainerDied","Data":"8444f3eb98b20ac89c552bc1d8ed3e9fd5333f52557336fcbe7d37fe2218377b"} Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.859469 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" event={"ID":"54d24deb-7268-4bb5-818e-8cf514cbe5b1","Type":"ContainerStarted","Data":"7fd20db884d6d20840dbe627b71bd4c4b18f26b36ce48eb097197236f287687c"} Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.861280 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-nrmlw\" (UniqueName: \"kubernetes.io/projected/500ba135-ea9d-42ba-adb5-487f1646368f-kube-api-access-nrmlw\") pod \"dnsmasq-dns-59d5fbdd8c-4bczm\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.861893 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-859d485f47-h9ph7" event={"ID":"62c2f197-894b-471b-a5c1-968b7b951427","Type":"ContainerDied","Data":"51aa775ac39a39850cae5e970ca54ca1584cc57f0f0adfa5e2ee34537c9183c4"} Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.861937 4747 scope.go:117] "RemoveContainer" containerID="215c96cffea4155a12f096a5fe2ca80e316fd9ba43904fa2725bf11432c97fa5" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.862053 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-859d485f47-h9ph7" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.872827 4747 generic.go:334] "Generic (PLEG): container finished" podID="a5c9492d-3f1b-47b9-b70f-fe5db9393c72" containerID="22e8067a16c8496d664fad360e982b918421661a4c84d61472845362951169c1" exitCode=0 Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.872920 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" event={"ID":"a5c9492d-3f1b-47b9-b70f-fe5db9393c72","Type":"ContainerDied","Data":"22e8067a16c8496d664fad360e982b918421661a4c84d61472845362951169c1"} Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.872945 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" event={"ID":"a5c9492d-3f1b-47b9-b70f-fe5db9393c72","Type":"ContainerStarted","Data":"eaec9cb05f0d45f50feb4a7aaacfcf3eeb200a5350c7e83fe894b6b5845f264c"} Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.880019 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-kbblh" event={"ID":"ffe619cc-656e-4967-8a21-aaea4560c00b","Type":"ContainerDied","Data":"11d1809d33ade7b69410b2afb4a8ecc60b43b07ad949e510980dd28e34297002"} Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.880134 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-kbblh" Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.885805 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5jh4k" event={"ID":"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5","Type":"ContainerStarted","Data":"5e62b0a320297c30e0688db6c5b330d66aadcb57092284da879891e956764e37"} Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.885866 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5jh4k" event={"ID":"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5","Type":"ContainerStarted","Data":"088d5f09d3868f9a5e726a677de7de1455e6e908ad3a39b66769c1544d4587df"} Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.957790 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-859d485f47-h9ph7"] Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.964055 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-859d485f47-h9ph7"] Dec 05 21:00:07 crc kubenswrapper[4747]: I1205 21:00:07.992611 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-kbblh"] Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.001989 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-kbblh"] Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.013695 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.013655 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-5jh4k" podStartSLOduration=2.013631168 podStartE2EDuration="2.013631168s" podCreationTimestamp="2025-12-05 21:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:00:07.969971673 +0000 UTC m=+1078.437279161" watchObservedRunningTime="2025-12-05 21:00:08.013631168 +0000 UTC m=+1078.480938676" Dec 05 21:00:08 crc kubenswrapper[4747]: W1205 21:00:08.029381 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7528af7d_8d58_4f20_ac03_b9b62e14c9e2.slice/crio-92f9a0595c63e82902393256176a9846afa062eec48ddb12c721354934118909 WatchSource:0}: Error finding container 92f9a0595c63e82902393256176a9846afa062eec48ddb12c721354934118909: Status 404 returned error can't find the container with id 92f9a0595c63e82902393256176a9846afa062eec48ddb12c721354934118909 Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.031425 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 21:00:08 crc kubenswrapper[4747]: E1205 21:00:08.129782 4747 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 05 21:00:08 crc kubenswrapper[4747]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/54d24deb-7268-4bb5-818e-8cf514cbe5b1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 05 21:00:08 crc kubenswrapper[4747]: > podSandboxID="7fd20db884d6d20840dbe627b71bd4c4b18f26b36ce48eb097197236f287687c" Dec 05 21:00:08 crc kubenswrapper[4747]: E1205 21:00:08.129935 4747 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 05 21:00:08 crc kubenswrapper[4747]: container 
&Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h5cbh7ch5d4h66fh676hdbh546h95h88h5ffh55ch7fhch57ch687hddhc7h5fdh57dh674h56fh64ch98h9bh557h55dh646h54ch54fh5c4h597q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mzzlr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-db7757ddc-fdjgh_openstack(54d24deb-7268-4bb5-818e-8cf514cbe5b1): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/54d24deb-7268-4bb5-818e-8cf514cbe5b1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 05 21:00:08 crc kubenswrapper[4747]: > logger="UnhandledError" Dec 05 21:00:08 crc kubenswrapper[4747]: E1205 21:00:08.131107 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create 
failed: mount `/var/lib/kubelet/pods/54d24deb-7268-4bb5-818e-8cf514cbe5b1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" podUID="54d24deb-7268-4bb5-818e-8cf514cbe5b1" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.201370 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.353279 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dzmm\" (UniqueName: \"kubernetes.io/projected/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-kube-api-access-5dzmm\") pod \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.353546 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-config\") pod \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.353739 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-ovsdbserver-nb\") pod \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.353786 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-dns-svc\") pod \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\" (UID: \"a5c9492d-3f1b-47b9-b70f-fe5db9393c72\") " Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.359485 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-kube-api-access-5dzmm" (OuterVolumeSpecName: "kube-api-access-5dzmm") pod "a5c9492d-3f1b-47b9-b70f-fe5db9393c72" (UID: "a5c9492d-3f1b-47b9-b70f-fe5db9393c72"). InnerVolumeSpecName "kube-api-access-5dzmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.379850 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a5c9492d-3f1b-47b9-b70f-fe5db9393c72" (UID: "a5c9492d-3f1b-47b9-b70f-fe5db9393c72"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.380877 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-config" (OuterVolumeSpecName: "config") pod "a5c9492d-3f1b-47b9-b70f-fe5db9393c72" (UID: "a5c9492d-3f1b-47b9-b70f-fe5db9393c72"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.391944 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a5c9492d-3f1b-47b9-b70f-fe5db9393c72" (UID: "a5c9492d-3f1b-47b9-b70f-fe5db9393c72"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.455616 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.455647 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.455657 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dzmm\" (UniqueName: \"kubernetes.io/projected/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-kube-api-access-5dzmm\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.455665 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5c9492d-3f1b-47b9-b70f-fe5db9393c72-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.513990 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-4bczm"] Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.636503 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 05 21:00:08 crc kubenswrapper[4747]: E1205 21:00:08.637280 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c9492d-3f1b-47b9-b70f-fe5db9393c72" containerName="init" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.637303 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c9492d-3f1b-47b9-b70f-fe5db9393c72" containerName="init" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.637744 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c9492d-3f1b-47b9-b70f-fe5db9393c72" containerName="init" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.654399 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.659657 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.659851 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.659940 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.659853 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2qmzv" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.662514 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.759880 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.759940 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.759973 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsrqr\" (UniqueName: \"kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-kube-api-access-bsrqr\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.759993 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5f06375c-a008-4d1f-b2ae-516549bcd438-lock\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.760028 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5f06375c-a008-4d1f-b2ae-516549bcd438-cache\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.862153 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsrqr\" (UniqueName: \"kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-kube-api-access-bsrqr\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.862198 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5f06375c-a008-4d1f-b2ae-516549bcd438-lock\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.862241 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5f06375c-a008-4d1f-b2ae-516549bcd438-cache\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.862317 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.862348 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.862699 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.862977 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5f06375c-a008-4d1f-b2ae-516549bcd438-cache\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.863204 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5f06375c-a008-4d1f-b2ae-516549bcd438-lock\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: E1205 21:00:08.863381 4747 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 21:00:08 crc kubenswrapper[4747]: E1205 21:00:08.863409 4747 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 21:00:08 crc kubenswrapper[4747]: E1205 21:00:08.863479 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift podName:5f06375c-a008-4d1f-b2ae-516549bcd438 nodeName:}" failed. No retries permitted until 2025-12-05 21:00:09.36345672 +0000 UTC m=+1079.830764228 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift") pod "swift-storage-0" (UID: "5f06375c-a008-4d1f-b2ae-516549bcd438") : configmap "swift-ring-files" not found Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.878638 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsrqr\" (UniqueName: \"kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-kube-api-access-bsrqr\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.886567 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.893492 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7528af7d-8d58-4f20-ac03-b9b62e14c9e2","Type":"ContainerStarted","Data":"92f9a0595c63e82902393256176a9846afa062eec48ddb12c721354934118909"} Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.896890 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.897696 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db9b5bc9-ns7n8" event={"ID":"a5c9492d-3f1b-47b9-b70f-fe5db9393c72","Type":"ContainerDied","Data":"eaec9cb05f0d45f50feb4a7aaacfcf3eeb200a5350c7e83fe894b6b5845f264c"} Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.897757 4747 scope.go:117] "RemoveContainer" containerID="22e8067a16c8496d664fad360e982b918421661a4c84d61472845362951169c1" Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.900664 4747 generic.go:334] "Generic (PLEG): container finished" podID="500ba135-ea9d-42ba-adb5-487f1646368f" containerID="b7b95c8125fcfad858bf386829e6afe787b7e305d060a024769623fea4c00b5e" exitCode=0 Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.900745 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" event={"ID":"500ba135-ea9d-42ba-adb5-487f1646368f","Type":"ContainerDied","Data":"b7b95c8125fcfad858bf386829e6afe787b7e305d060a024769623fea4c00b5e"} Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.900812 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" event={"ID":"500ba135-ea9d-42ba-adb5-487f1646368f","Type":"ContainerStarted","Data":"710db5b63b4d3ef43e6c81d44b51ea939a0b48c9f2519e871f3119dc5f8a501a"} Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.983934 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-ns7n8"] Dec 05 21:00:08 crc kubenswrapper[4747]: I1205 21:00:08.990989 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-ns7n8"] Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.115140 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2ns9l"] Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.117686 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.119655 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.120823 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.124180 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2ns9l"] Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.126830 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.270769 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/749c2c93-a078-47ab-b6f9-673cef723e20-ring-data-devices\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.271140 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/749c2c93-a078-47ab-b6f9-673cef723e20-etc-swift\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.271223 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-swiftconf\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.271266 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-combined-ca-bundle\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.271299 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/749c2c93-a078-47ab-b6f9-673cef723e20-scripts\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.271368 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-dispersionconf\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.271398 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b67q\" (UniqueName: \"kubernetes.io/projected/749c2c93-a078-47ab-b6f9-673cef723e20-kube-api-access-7b67q\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 
21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.372481 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/749c2c93-a078-47ab-b6f9-673cef723e20-ring-data-devices\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.372653 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/749c2c93-a078-47ab-b6f9-673cef723e20-etc-swift\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.372683 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-swiftconf\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.372722 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-combined-ca-bundle\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.372758 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/749c2c93-a078-47ab-b6f9-673cef723e20-scripts\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.372791 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.372817 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-dispersionconf\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.372837 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b67q\" (UniqueName: \"kubernetes.io/projected/749c2c93-a078-47ab-b6f9-673cef723e20-kube-api-access-7b67q\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: E1205 21:00:09.373107 4747 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 21:00:09 crc kubenswrapper[4747]: E1205 21:00:09.373149 4747 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 21:00:09 crc kubenswrapper[4747]: E1205 21:00:09.373218 4747 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift podName:5f06375c-a008-4d1f-b2ae-516549bcd438 nodeName:}" failed. No retries permitted until 2025-12-05 21:00:10.373197444 +0000 UTC m=+1080.840504942 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift") pod "swift-storage-0" (UID: "5f06375c-a008-4d1f-b2ae-516549bcd438") : configmap "swift-ring-files" not found Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.373485 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/749c2c93-a078-47ab-b6f9-673cef723e20-etc-swift\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.373681 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/749c2c93-a078-47ab-b6f9-673cef723e20-ring-data-devices\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.373992 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/749c2c93-a078-47ab-b6f9-673cef723e20-scripts\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.376670 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-combined-ca-bundle\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.377162 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-dispersionconf\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.377454 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-swiftconf\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.391020 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b67q\" (UniqueName: \"kubernetes.io/projected/749c2c93-a078-47ab-b6f9-673cef723e20-kube-api-access-7b67q\") pod \"swift-ring-rebalance-2ns9l\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") " pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.444177 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.736900 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2ns9l"] Dec 05 21:00:09 crc kubenswrapper[4747]: W1205 21:00:09.743571 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod749c2c93_a078_47ab_b6f9_673cef723e20.slice/crio-9cc1e5f653402fb4251e26fb2e6845b4bf02bba118648f68a6000bc1fbab6852 WatchSource:0}: Error finding container 9cc1e5f653402fb4251e26fb2e6845b4bf02bba118648f68a6000bc1fbab6852: Status 404 returned error can't find the container with id 9cc1e5f653402fb4251e26fb2e6845b4bf02bba118648f68a6000bc1fbab6852 Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.853995 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62c2f197-894b-471b-a5c1-968b7b951427" path="/var/lib/kubelet/pods/62c2f197-894b-471b-a5c1-968b7b951427/volumes" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.854596 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5c9492d-3f1b-47b9-b70f-fe5db9393c72" path="/var/lib/kubelet/pods/a5c9492d-3f1b-47b9-b70f-fe5db9393c72/volumes" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.855218 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffe619cc-656e-4967-8a21-aaea4560c00b" path="/var/lib/kubelet/pods/ffe619cc-656e-4967-8a21-aaea4560c00b/volumes" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.922296 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7528af7d-8d58-4f20-ac03-b9b62e14c9e2","Type":"ContainerStarted","Data":"a9dcbb54d496caf2f805d60df04f610d71d46e34c98b40703cbee95f6436ad6a"} Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.922483 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7528af7d-8d58-4f20-ac03-b9b62e14c9e2","Type":"ContainerStarted","Data":"264a34a52b2bcb52aa5db288abbbf43e65fd0f92091e50d839d9c4ffaab07034"} Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.924058 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.927090 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" event={"ID":"54d24deb-7268-4bb5-818e-8cf514cbe5b1","Type":"ContainerStarted","Data":"fbe8d9b2e12a656fdf1e77379a4cf10ac4ae28c4f98de00f4fcb73ba1ed99476"} Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.927611 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.935646 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" event={"ID":"500ba135-ea9d-42ba-adb5-487f1646368f","Type":"ContainerStarted","Data":"b62eac0a04a7a445d0be766123c85952bbfb528b8bd03cb53d992450a3a8ac53"} Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.935805 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.937143 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2ns9l" 
event={"ID":"749c2c93-a078-47ab-b6f9-673cef723e20","Type":"ContainerStarted","Data":"9cc1e5f653402fb4251e26fb2e6845b4bf02bba118648f68a6000bc1fbab6852"} Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.947570 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.799687546 podStartE2EDuration="2.947546963s" podCreationTimestamp="2025-12-05 21:00:07 +0000 UTC" firstStartedPulling="2025-12-05 21:00:08.040233568 +0000 UTC m=+1078.507541056" lastFinishedPulling="2025-12-05 21:00:09.188092985 +0000 UTC m=+1079.655400473" observedRunningTime="2025-12-05 21:00:09.941069282 +0000 UTC m=+1080.408376770" watchObservedRunningTime="2025-12-05 21:00:09.947546963 +0000 UTC m=+1080.414854451" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.963791 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" podStartSLOduration=3.963776526 podStartE2EDuration="3.963776526s" podCreationTimestamp="2025-12-05 21:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:00:09.959225793 +0000 UTC m=+1080.426533291" watchObservedRunningTime="2025-12-05 21:00:09.963776526 +0000 UTC m=+1080.431084014" Dec 05 21:00:09 crc kubenswrapper[4747]: I1205 21:00:09.980186 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" podStartSLOduration=2.980165623 podStartE2EDuration="2.980165623s" podCreationTimestamp="2025-12-05 21:00:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:00:09.97478268 +0000 UTC m=+1080.442090168" watchObservedRunningTime="2025-12-05 21:00:09.980165623 +0000 UTC m=+1080.447473121" Dec 05 21:00:10 crc kubenswrapper[4747]: I1205 21:00:10.401903 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:10 crc kubenswrapper[4747]: E1205 21:00:10.402160 4747 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 21:00:10 crc kubenswrapper[4747]: E1205 21:00:10.402196 4747 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 21:00:10 crc kubenswrapper[4747]: E1205 21:00:10.402267 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift podName:5f06375c-a008-4d1f-b2ae-516549bcd438 nodeName:}" failed. No retries permitted until 2025-12-05 21:00:12.402243999 +0000 UTC m=+1082.869551487 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift") pod "swift-storage-0" (UID: "5f06375c-a008-4d1f-b2ae-516549bcd438") : configmap "swift-ring-files" not found Dec 05 21:00:11 crc kubenswrapper[4747]: I1205 21:00:11.601299 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 05 21:00:11 crc kubenswrapper[4747]: I1205 21:00:11.686121 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 05 21:00:12 crc kubenswrapper[4747]: I1205 21:00:12.442148 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:12 crc kubenswrapper[4747]: E1205 21:00:12.442344 4747 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 21:00:12 crc kubenswrapper[4747]: E1205 21:00:12.442674 4747 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 21:00:12 crc kubenswrapper[4747]: E1205 21:00:12.442740 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift podName:5f06375c-a008-4d1f-b2ae-516549bcd438 nodeName:}" failed. No retries permitted until 2025-12-05 21:00:16.442719241 +0000 UTC m=+1086.910026739 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift") pod "swift-storage-0" (UID: "5f06375c-a008-4d1f-b2ae-516549bcd438") : configmap "swift-ring-files" not found Dec 05 21:00:13 crc kubenswrapper[4747]: I1205 21:00:13.996971 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2ns9l" event={"ID":"749c2c93-a078-47ab-b6f9-673cef723e20","Type":"ContainerStarted","Data":"3e2e501bf6db7407c56844788f8bbd62d9e8a7a9e7605f4be76cc81c55d45534"} Dec 05 21:00:14 crc kubenswrapper[4747]: I1205 21:00:14.017170 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-2ns9l" podStartSLOduration=1.23924987 podStartE2EDuration="5.017141706s" podCreationTimestamp="2025-12-05 21:00:09 +0000 UTC" firstStartedPulling="2025-12-05 21:00:09.746045297 +0000 UTC m=+1080.213352785" lastFinishedPulling="2025-12-05 21:00:13.523937133 +0000 UTC m=+1083.991244621" observedRunningTime="2025-12-05 21:00:14.014161522 +0000 UTC m=+1084.481469040" watchObservedRunningTime="2025-12-05 21:00:14.017141706 +0000 UTC m=+1084.484449224" Dec 05 21:00:14 crc kubenswrapper[4747]: I1205 21:00:14.260214 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 05 21:00:14 crc kubenswrapper[4747]: I1205 21:00:14.260296 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 05 21:00:14 crc kubenswrapper[4747]: I1205 21:00:14.362072 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.079734 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/openstack-galera-0" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.527734 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7650-account-create-update-qkm7g"] Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.528969 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7650-account-create-update-qkm7g" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.532363 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.539831 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7650-account-create-update-qkm7g"] Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.571805 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-j2b4b"] Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.573944 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-j2b4b" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.584356 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-j2b4b"] Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.621623 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5f0bf6d-ce1c-48c5-8866-a65a30022ca3-operator-scripts\") pod \"keystone-7650-account-create-update-qkm7g\" (UID: \"f5f0bf6d-ce1c-48c5-8866-a65a30022ca3\") " pod="openstack/keystone-7650-account-create-update-qkm7g" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.621739 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfxgr\" (UniqueName: \"kubernetes.io/projected/f5f0bf6d-ce1c-48c5-8866-a65a30022ca3-kube-api-access-nfxgr\") pod \"keystone-7650-account-create-update-qkm7g\" (UID: \"f5f0bf6d-ce1c-48c5-8866-a65a30022ca3\") " pod="openstack/keystone-7650-account-create-update-qkm7g" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.723873 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2-operator-scripts\") pod \"keystone-db-create-j2b4b\" (UID: \"46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2\") " pod="openstack/keystone-db-create-j2b4b" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.723998 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfxgr\" (UniqueName: \"kubernetes.io/projected/f5f0bf6d-ce1c-48c5-8866-a65a30022ca3-kube-api-access-nfxgr\") pod \"keystone-7650-account-create-update-qkm7g\" (UID: \"f5f0bf6d-ce1c-48c5-8866-a65a30022ca3\") " pod="openstack/keystone-7650-account-create-update-qkm7g" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.724164 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5f0bf6d-ce1c-48c5-8866-a65a30022ca3-operator-scripts\") pod \"keystone-7650-account-create-update-qkm7g\" (UID: \"f5f0bf6d-ce1c-48c5-8866-a65a30022ca3\") " pod="openstack/keystone-7650-account-create-update-qkm7g" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.724203 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-42r8s\" (UniqueName: \"kubernetes.io/projected/46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2-kube-api-access-42r8s\") pod \"keystone-db-create-j2b4b\" (UID: \"46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2\") " pod="openstack/keystone-db-create-j2b4b" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.725132 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5f0bf6d-ce1c-48c5-8866-a65a30022ca3-operator-scripts\") pod \"keystone-7650-account-create-update-qkm7g\" (UID: \"f5f0bf6d-ce1c-48c5-8866-a65a30022ca3\") " pod="openstack/keystone-7650-account-create-update-qkm7g" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.760796 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfxgr\" (UniqueName: \"kubernetes.io/projected/f5f0bf6d-ce1c-48c5-8866-a65a30022ca3-kube-api-access-nfxgr\") pod \"keystone-7650-account-create-update-qkm7g\" (UID: \"f5f0bf6d-ce1c-48c5-8866-a65a30022ca3\") " pod="openstack/keystone-7650-account-create-update-qkm7g" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.810403 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6w6g9"] Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.813660 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6w6g9" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.824751 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6w6g9"] Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.825694 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2-operator-scripts\") pod \"keystone-db-create-j2b4b\" (UID: \"46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2\") " pod="openstack/keystone-db-create-j2b4b" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.825805 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42r8s\" (UniqueName: \"kubernetes.io/projected/46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2-kube-api-access-42r8s\") pod \"keystone-db-create-j2b4b\" (UID: \"46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2\") " pod="openstack/keystone-db-create-j2b4b" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.829355 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2-operator-scripts\") pod \"keystone-db-create-j2b4b\" (UID: \"46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2\") " pod="openstack/keystone-db-create-j2b4b" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.842453 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42r8s\" (UniqueName: \"kubernetes.io/projected/46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2-kube-api-access-42r8s\") pod \"keystone-db-create-j2b4b\" (UID: \"46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2\") " pod="openstack/keystone-db-create-j2b4b" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.867880 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7650-account-create-update-qkm7g" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.898012 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8da4-account-create-update-llwr4"] Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.899216 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8da4-account-create-update-llwr4" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.899224 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-j2b4b" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.901796 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.912069 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8da4-account-create-update-llwr4"] Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.930388 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjd4\" (UniqueName: \"kubernetes.io/projected/9363606d-f1e1-4ba6-aca3-155bb6b57473-kube-api-access-9jjd4\") pod \"placement-db-create-6w6g9\" (UID: \"9363606d-f1e1-4ba6-aca3-155bb6b57473\") " pod="openstack/placement-db-create-6w6g9" Dec 05 21:00:15 crc kubenswrapper[4747]: I1205 21:00:15.930432 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9363606d-f1e1-4ba6-aca3-155bb6b57473-operator-scripts\") pod \"placement-db-create-6w6g9\" (UID: \"9363606d-f1e1-4ba6-aca3-155bb6b57473\") " pod="openstack/placement-db-create-6w6g9" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.031877 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjd4\" (UniqueName: \"kubernetes.io/projected/9363606d-f1e1-4ba6-aca3-155bb6b57473-kube-api-access-9jjd4\") pod \"placement-db-create-6w6g9\" (UID: \"9363606d-f1e1-4ba6-aca3-155bb6b57473\") " pod="openstack/placement-db-create-6w6g9" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.032527 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q5xf\" (UniqueName: \"kubernetes.io/projected/84eeb352-63cc-4fbf-b328-0911c3f67abf-kube-api-access-2q5xf\") pod \"placement-8da4-account-create-update-llwr4\" (UID: \"84eeb352-63cc-4fbf-b328-0911c3f67abf\") " pod="openstack/placement-8da4-account-create-update-llwr4" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.032555 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9363606d-f1e1-4ba6-aca3-155bb6b57473-operator-scripts\") pod \"placement-db-create-6w6g9\" (UID: \"9363606d-f1e1-4ba6-aca3-155bb6b57473\") " pod="openstack/placement-db-create-6w6g9" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.032688 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84eeb352-63cc-4fbf-b328-0911c3f67abf-operator-scripts\") pod \"placement-8da4-account-create-update-llwr4\" (UID: \"84eeb352-63cc-4fbf-b328-0911c3f67abf\") " pod="openstack/placement-8da4-account-create-update-llwr4" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.033724 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9363606d-f1e1-4ba6-aca3-155bb6b57473-operator-scripts\") pod \"placement-db-create-6w6g9\" (UID: \"9363606d-f1e1-4ba6-aca3-155bb6b57473\") " pod="openstack/placement-db-create-6w6g9" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.057200 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjd4\" (UniqueName: \"kubernetes.io/projected/9363606d-f1e1-4ba6-aca3-155bb6b57473-kube-api-access-9jjd4\") pod \"placement-db-create-6w6g9\" (UID: \"9363606d-f1e1-4ba6-aca3-155bb6b57473\") " pod="openstack/placement-db-create-6w6g9" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.131826 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6w6g9" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.134051 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84eeb352-63cc-4fbf-b328-0911c3f67abf-operator-scripts\") pod \"placement-8da4-account-create-update-llwr4\" (UID: \"84eeb352-63cc-4fbf-b328-0911c3f67abf\") " pod="openstack/placement-8da4-account-create-update-llwr4" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.134245 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q5xf\" (UniqueName: \"kubernetes.io/projected/84eeb352-63cc-4fbf-b328-0911c3f67abf-kube-api-access-2q5xf\") pod \"placement-8da4-account-create-update-llwr4\" (UID: \"84eeb352-63cc-4fbf-b328-0911c3f67abf\") " pod="openstack/placement-8da4-account-create-update-llwr4" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.135046 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-mcpnx"] Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.136704 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-mcpnx" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.136988 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84eeb352-63cc-4fbf-b328-0911c3f67abf-operator-scripts\") pod \"placement-8da4-account-create-update-llwr4\" (UID: \"84eeb352-63cc-4fbf-b328-0911c3f67abf\") " pod="openstack/placement-8da4-account-create-update-llwr4" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.155244 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mcpnx"] Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.157643 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q5xf\" (UniqueName: \"kubernetes.io/projected/84eeb352-63cc-4fbf-b328-0911c3f67abf-kube-api-access-2q5xf\") pod \"placement-8da4-account-create-update-llwr4\" (UID: \"84eeb352-63cc-4fbf-b328-0911c3f67abf\") " pod="openstack/placement-8da4-account-create-update-llwr4" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.237073 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tm6z\" (UniqueName: \"kubernetes.io/projected/b68d9420-28ed-4cf0-a8c7-774600d5436e-kube-api-access-8tm6z\") pod \"glance-db-create-mcpnx\" (UID: \"b68d9420-28ed-4cf0-a8c7-774600d5436e\") " pod="openstack/glance-db-create-mcpnx" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.237269 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b68d9420-28ed-4cf0-a8c7-774600d5436e-operator-scripts\") pod \"glance-db-create-mcpnx\" (UID: \"b68d9420-28ed-4cf0-a8c7-774600d5436e\") " pod="openstack/glance-db-create-mcpnx" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.263727 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-e01d-account-create-update-jw9bz"] Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.265116 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e01d-account-create-update-jw9bz" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.269984 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.274597 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e01d-account-create-update-jw9bz"] Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.309752 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8da4-account-create-update-llwr4" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.339014 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b68d9420-28ed-4cf0-a8c7-774600d5436e-operator-scripts\") pod \"glance-db-create-mcpnx\" (UID: \"b68d9420-28ed-4cf0-a8c7-774600d5436e\") " pod="openstack/glance-db-create-mcpnx" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.339162 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tm6z\" (UniqueName: \"kubernetes.io/projected/b68d9420-28ed-4cf0-a8c7-774600d5436e-kube-api-access-8tm6z\") pod \"glance-db-create-mcpnx\" (UID: \"b68d9420-28ed-4cf0-a8c7-774600d5436e\") " pod="openstack/glance-db-create-mcpnx" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.339214 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f2a59e8-2466-4c41-93c5-d88075dfbddf-operator-scripts\") pod \"glance-e01d-account-create-update-jw9bz\" (UID: \"7f2a59e8-2466-4c41-93c5-d88075dfbddf\") " pod="openstack/glance-e01d-account-create-update-jw9bz" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.339264 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz7cl\" (UniqueName: \"kubernetes.io/projected/7f2a59e8-2466-4c41-93c5-d88075dfbddf-kube-api-access-cz7cl\") pod \"glance-e01d-account-create-update-jw9bz\" (UID: \"7f2a59e8-2466-4c41-93c5-d88075dfbddf\") " pod="openstack/glance-e01d-account-create-update-jw9bz" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.340138 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b68d9420-28ed-4cf0-a8c7-774600d5436e-operator-scripts\") pod \"glance-db-create-mcpnx\" (UID: \"b68d9420-28ed-4cf0-a8c7-774600d5436e\") " pod="openstack/glance-db-create-mcpnx" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.371918 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tm6z\" (UniqueName: \"kubernetes.io/projected/b68d9420-28ed-4cf0-a8c7-774600d5436e-kube-api-access-8tm6z\") pod \"glance-db-create-mcpnx\" (UID: \"b68d9420-28ed-4cf0-a8c7-774600d5436e\") " pod="openstack/glance-db-create-mcpnx" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.374139 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-j2b4b"] Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.385472 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7650-account-create-update-qkm7g"] Dec 05 21:00:16 crc kubenswrapper[4747]: W1205 21:00:16.392896 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5f0bf6d_ce1c_48c5_8866_a65a30022ca3.slice/crio-1536f63450ae4da5cf4d66ddff5bf21decd686364807593bb384957a0a5c9502 WatchSource:0}: Error finding container 1536f63450ae4da5cf4d66ddff5bf21decd686364807593bb384957a0a5c9502: Status 404 returned error can't find the container with id 1536f63450ae4da5cf4d66ddff5bf21decd686364807593bb384957a0a5c9502 Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.400706 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6w6g9"] Dec 05 21:00:16 crc 
kubenswrapper[4747]: W1205 21:00:16.439518 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9363606d_f1e1_4ba6_aca3_155bb6b57473.slice/crio-053129ccdce3b7c922362930d985a4abba4eee91cbb94bb80a139857fee1fec3 WatchSource:0}: Error finding container 053129ccdce3b7c922362930d985a4abba4eee91cbb94bb80a139857fee1fec3: Status 404 returned error can't find the container with id 053129ccdce3b7c922362930d985a4abba4eee91cbb94bb80a139857fee1fec3 Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.443852 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f2a59e8-2466-4c41-93c5-d88075dfbddf-operator-scripts\") pod \"glance-e01d-account-create-update-jw9bz\" (UID: \"7f2a59e8-2466-4c41-93c5-d88075dfbddf\") " pod="openstack/glance-e01d-account-create-update-jw9bz" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.446241 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz7cl\" (UniqueName: \"kubernetes.io/projected/7f2a59e8-2466-4c41-93c5-d88075dfbddf-kube-api-access-cz7cl\") pod \"glance-e01d-account-create-update-jw9bz\" (UID: \"7f2a59e8-2466-4c41-93c5-d88075dfbddf\") " pod="openstack/glance-e01d-account-create-update-jw9bz" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.446280 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:16 crc kubenswrapper[4747]: E1205 21:00:16.446474 4747 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 21:00:16 crc kubenswrapper[4747]: E1205 21:00:16.446489 4747 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 21:00:16 crc kubenswrapper[4747]: E1205 21:00:16.446526 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift podName:5f06375c-a008-4d1f-b2ae-516549bcd438 nodeName:}" failed. No retries permitted until 2025-12-05 21:00:24.446512909 +0000 UTC m=+1094.913820397 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift") pod "swift-storage-0" (UID: "5f06375c-a008-4d1f-b2ae-516549bcd438") : configmap "swift-ring-files" not found Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.444979 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f2a59e8-2466-4c41-93c5-d88075dfbddf-operator-scripts\") pod \"glance-e01d-account-create-update-jw9bz\" (UID: \"7f2a59e8-2466-4c41-93c5-d88075dfbddf\") " pod="openstack/glance-e01d-account-create-update-jw9bz" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.463926 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-mcpnx" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.464398 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz7cl\" (UniqueName: \"kubernetes.io/projected/7f2a59e8-2466-4c41-93c5-d88075dfbddf-kube-api-access-cz7cl\") pod \"glance-e01d-account-create-update-jw9bz\" (UID: \"7f2a59e8-2466-4c41-93c5-d88075dfbddf\") " pod="openstack/glance-e01d-account-create-update-jw9bz" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.582715 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e01d-account-create-update-jw9bz" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.786281 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8da4-account-create-update-llwr4"] Dec 05 21:00:16 crc kubenswrapper[4747]: W1205 21:00:16.796195 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84eeb352_63cc_4fbf_b328_0911c3f67abf.slice/crio-71641f02a168f506aa989cc4808c6f3a83f772c69e0089b1ba57fc8c118c1820 WatchSource:0}: Error finding container 71641f02a168f506aa989cc4808c6f3a83f772c69e0089b1ba57fc8c118c1820: Status 404 returned error can't find the container with id 71641f02a168f506aa989cc4808c6f3a83f772c69e0089b1ba57fc8c118c1820 Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.950285 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:16 crc kubenswrapper[4747]: I1205 21:00:16.999049 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mcpnx"] Dec 05 21:00:17 crc kubenswrapper[4747]: I1205 21:00:17.035688 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j2b4b" event={"ID":"46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2","Type":"ContainerStarted","Data":"944cd2f6d197a485132b4ea6e7af1c6dac4b71d8e0b0b803a63fad34e04abb6f"} Dec 05 21:00:17 crc kubenswrapper[4747]: I1205 21:00:17.037103 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8da4-account-create-update-llwr4" event={"ID":"84eeb352-63cc-4fbf-b328-0911c3f67abf","Type":"ContainerStarted","Data":"71641f02a168f506aa989cc4808c6f3a83f772c69e0089b1ba57fc8c118c1820"} Dec 05 21:00:17 crc kubenswrapper[4747]: I1205 21:00:17.039178 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7650-account-create-update-qkm7g" event={"ID":"f5f0bf6d-ce1c-48c5-8866-a65a30022ca3","Type":"ContainerStarted","Data":"1536f63450ae4da5cf4d66ddff5bf21decd686364807593bb384957a0a5c9502"} Dec 05 21:00:17 crc kubenswrapper[4747]: I1205 21:00:17.039965 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6w6g9" event={"ID":"9363606d-f1e1-4ba6-aca3-155bb6b57473","Type":"ContainerStarted","Data":"053129ccdce3b7c922362930d985a4abba4eee91cbb94bb80a139857fee1fec3"} Dec 05 21:00:17 crc kubenswrapper[4747]: I1205 21:00:17.040652 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mcpnx" event={"ID":"b68d9420-28ed-4cf0-a8c7-774600d5436e","Type":"ContainerStarted","Data":"039ef65e61e7246d1f4960346b30f41f95293e3a0ad5955a6524ce635cddda98"} Dec 05 21:00:17 crc kubenswrapper[4747]: I1205 21:00:17.102165 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e01d-account-create-update-jw9bz"] Dec 05 21:00:18 crc kubenswrapper[4747]: I1205 
21:00:18.014744 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:00:18 crc kubenswrapper[4747]: I1205 21:00:18.066938 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e01d-account-create-update-jw9bz" event={"ID":"7f2a59e8-2466-4c41-93c5-d88075dfbddf","Type":"ContainerStarted","Data":"861d5df6819e28e8834f8a2c19508f1baaf5973b994be45b6496544059217258"} Dec 05 21:00:18 crc kubenswrapper[4747]: I1205 21:00:18.096072 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-fdjgh"] Dec 05 21:00:18 crc kubenswrapper[4747]: I1205 21:00:18.096379 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" podUID="54d24deb-7268-4bb5-818e-8cf514cbe5b1" containerName="dnsmasq-dns" containerID="cri-o://fbe8d9b2e12a656fdf1e77379a4cf10ac4ae28c4f98de00f4fcb73ba1ed99476" gracePeriod=10 Dec 05 21:00:18 crc kubenswrapper[4747]: I1205 21:00:18.833847 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:18 crc kubenswrapper[4747]: I1205 21:00:18.909287 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-ovsdbserver-nb\") pod \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " Dec 05 21:00:18 crc kubenswrapper[4747]: I1205 21:00:18.909369 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-dns-svc\") pod \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " Dec 05 21:00:18 crc kubenswrapper[4747]: I1205 21:00:18.909439 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-config\") pod \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " Dec 05 21:00:18 crc kubenswrapper[4747]: I1205 21:00:18.909560 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-ovsdbserver-sb\") pod \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " Dec 05 21:00:18 crc kubenswrapper[4747]: I1205 21:00:18.909621 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzzlr\" (UniqueName: \"kubernetes.io/projected/54d24deb-7268-4bb5-818e-8cf514cbe5b1-kube-api-access-mzzlr\") pod \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\" (UID: \"54d24deb-7268-4bb5-818e-8cf514cbe5b1\") " Dec 05 21:00:18 crc kubenswrapper[4747]: I1205 21:00:18.923846 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d24deb-7268-4bb5-818e-8cf514cbe5b1-kube-api-access-mzzlr" (OuterVolumeSpecName: "kube-api-access-mzzlr") pod "54d24deb-7268-4bb5-818e-8cf514cbe5b1" (UID: "54d24deb-7268-4bb5-818e-8cf514cbe5b1"). InnerVolumeSpecName "kube-api-access-mzzlr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:18 crc kubenswrapper[4747]: I1205 21:00:18.947067 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "54d24deb-7268-4bb5-818e-8cf514cbe5b1" (UID: "54d24deb-7268-4bb5-818e-8cf514cbe5b1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:18 crc kubenswrapper[4747]: I1205 21:00:18.952748 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "54d24deb-7268-4bb5-818e-8cf514cbe5b1" (UID: "54d24deb-7268-4bb5-818e-8cf514cbe5b1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:18 crc kubenswrapper[4747]: I1205 21:00:18.961545 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-config" (OuterVolumeSpecName: "config") pod "54d24deb-7268-4bb5-818e-8cf514cbe5b1" (UID: "54d24deb-7268-4bb5-818e-8cf514cbe5b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:18 crc kubenswrapper[4747]: I1205 21:00:18.964264 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "54d24deb-7268-4bb5-818e-8cf514cbe5b1" (UID: "54d24deb-7268-4bb5-818e-8cf514cbe5b1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.011638 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.011670 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.011685 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.011698 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzzlr\" (UniqueName: \"kubernetes.io/projected/54d24deb-7268-4bb5-818e-8cf514cbe5b1-kube-api-access-mzzlr\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.011711 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54d24deb-7268-4bb5-818e-8cf514cbe5b1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.077473 4747 generic.go:334] "Generic (PLEG): container finished" podID="7f2a59e8-2466-4c41-93c5-d88075dfbddf" containerID="115068073f7e5ab9a48b744498913c1f4b59e9db0badfecc60d328bacfbf5cce" exitCode=0 Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.077538 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e01d-account-create-update-jw9bz" 
event={"ID":"7f2a59e8-2466-4c41-93c5-d88075dfbddf","Type":"ContainerDied","Data":"115068073f7e5ab9a48b744498913c1f4b59e9db0badfecc60d328bacfbf5cce"} Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.078887 4747 generic.go:334] "Generic (PLEG): container finished" podID="f5f0bf6d-ce1c-48c5-8866-a65a30022ca3" containerID="22f61abdae53ffc00d95bab85332c82fb57990e50c5f80a9440f9d18e16c1e05" exitCode=0 Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.078976 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7650-account-create-update-qkm7g" event={"ID":"f5f0bf6d-ce1c-48c5-8866-a65a30022ca3","Type":"ContainerDied","Data":"22f61abdae53ffc00d95bab85332c82fb57990e50c5f80a9440f9d18e16c1e05"} Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.080508 4747 generic.go:334] "Generic (PLEG): container finished" podID="84eeb352-63cc-4fbf-b328-0911c3f67abf" containerID="3969d964903947e872ffe0ce65d01ff896775d4ceda5953edf46cb6db636dad2" exitCode=0 Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.080554 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8da4-account-create-update-llwr4" event={"ID":"84eeb352-63cc-4fbf-b328-0911c3f67abf","Type":"ContainerDied","Data":"3969d964903947e872ffe0ce65d01ff896775d4ceda5953edf46cb6db636dad2"} Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.081957 4747 generic.go:334] "Generic (PLEG): container finished" podID="9363606d-f1e1-4ba6-aca3-155bb6b57473" containerID="685822c02c1e6e5a3b73c537d25e8ed5863590f2bb1545fc0be7b2bc201035b1" exitCode=0 Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.082006 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6w6g9" event={"ID":"9363606d-f1e1-4ba6-aca3-155bb6b57473","Type":"ContainerDied","Data":"685822c02c1e6e5a3b73c537d25e8ed5863590f2bb1545fc0be7b2bc201035b1"} Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.083360 4747 generic.go:334] "Generic (PLEG): container finished" podID="54d24deb-7268-4bb5-818e-8cf514cbe5b1" containerID="fbe8d9b2e12a656fdf1e77379a4cf10ac4ae28c4f98de00f4fcb73ba1ed99476" exitCode=0 Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.083399 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" event={"ID":"54d24deb-7268-4bb5-818e-8cf514cbe5b1","Type":"ContainerDied","Data":"fbe8d9b2e12a656fdf1e77379a4cf10ac4ae28c4f98de00f4fcb73ba1ed99476"} Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.083414 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" event={"ID":"54d24deb-7268-4bb5-818e-8cf514cbe5b1","Type":"ContainerDied","Data":"7fd20db884d6d20840dbe627b71bd4c4b18f26b36ce48eb097197236f287687c"} Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.083429 4747 scope.go:117] "RemoveContainer" containerID="fbe8d9b2e12a656fdf1e77379a4cf10ac4ae28c4f98de00f4fcb73ba1ed99476" Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.083530 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db7757ddc-fdjgh" Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.088513 4747 generic.go:334] "Generic (PLEG): container finished" podID="b68d9420-28ed-4cf0-a8c7-774600d5436e" containerID="96d9fe9346adceba236ef7d628d470fe843b2a68dfb35a74d51aca199aa5b567" exitCode=0 Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.088575 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mcpnx" event={"ID":"b68d9420-28ed-4cf0-a8c7-774600d5436e","Type":"ContainerDied","Data":"96d9fe9346adceba236ef7d628d470fe843b2a68dfb35a74d51aca199aa5b567"} Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.090832 4747 generic.go:334] "Generic (PLEG): container finished" podID="46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2" containerID="3e8f9888f877f5cb02adb80d4dc39e181f6555ae1d98bbb6455a49dab7ae1f6d" exitCode=0 Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.090869 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j2b4b" event={"ID":"46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2","Type":"ContainerDied","Data":"3e8f9888f877f5cb02adb80d4dc39e181f6555ae1d98bbb6455a49dab7ae1f6d"} Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.116026 4747 scope.go:117] "RemoveContainer" containerID="8444f3eb98b20ac89c552bc1d8ed3e9fd5333f52557336fcbe7d37fe2218377b" Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.149678 4747 scope.go:117] "RemoveContainer" containerID="fbe8d9b2e12a656fdf1e77379a4cf10ac4ae28c4f98de00f4fcb73ba1ed99476" Dec 05 21:00:19 crc kubenswrapper[4747]: E1205 21:00:19.150433 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbe8d9b2e12a656fdf1e77379a4cf10ac4ae28c4f98de00f4fcb73ba1ed99476\": container with ID starting with fbe8d9b2e12a656fdf1e77379a4cf10ac4ae28c4f98de00f4fcb73ba1ed99476 not found: ID does not exist" containerID="fbe8d9b2e12a656fdf1e77379a4cf10ac4ae28c4f98de00f4fcb73ba1ed99476" Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.150474 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe8d9b2e12a656fdf1e77379a4cf10ac4ae28c4f98de00f4fcb73ba1ed99476"} err="failed to get container status \"fbe8d9b2e12a656fdf1e77379a4cf10ac4ae28c4f98de00f4fcb73ba1ed99476\": rpc error: code = NotFound desc = could not find container \"fbe8d9b2e12a656fdf1e77379a4cf10ac4ae28c4f98de00f4fcb73ba1ed99476\": container with ID starting with fbe8d9b2e12a656fdf1e77379a4cf10ac4ae28c4f98de00f4fcb73ba1ed99476 not found: ID does not exist" Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.150502 4747 scope.go:117] "RemoveContainer" containerID="8444f3eb98b20ac89c552bc1d8ed3e9fd5333f52557336fcbe7d37fe2218377b" Dec 05 21:00:19 crc kubenswrapper[4747]: E1205 21:00:19.150962 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8444f3eb98b20ac89c552bc1d8ed3e9fd5333f52557336fcbe7d37fe2218377b\": container with ID starting with 8444f3eb98b20ac89c552bc1d8ed3e9fd5333f52557336fcbe7d37fe2218377b not found: ID does not exist" containerID="8444f3eb98b20ac89c552bc1d8ed3e9fd5333f52557336fcbe7d37fe2218377b" Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.151007 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8444f3eb98b20ac89c552bc1d8ed3e9fd5333f52557336fcbe7d37fe2218377b"} err="failed to get container status 
\"8444f3eb98b20ac89c552bc1d8ed3e9fd5333f52557336fcbe7d37fe2218377b\": rpc error: code = NotFound desc = could not find container \"8444f3eb98b20ac89c552bc1d8ed3e9fd5333f52557336fcbe7d37fe2218377b\": container with ID starting with 8444f3eb98b20ac89c552bc1d8ed3e9fd5333f52557336fcbe7d37fe2218377b not found: ID does not exist" Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.194263 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-fdjgh"] Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.201333 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-fdjgh"] Dec 05 21:00:19 crc kubenswrapper[4747]: I1205 21:00:19.866384 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54d24deb-7268-4bb5-818e-8cf514cbe5b1" path="/var/lib/kubelet/pods/54d24deb-7268-4bb5-818e-8cf514cbe5b1/volumes" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.515556 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8da4-account-create-update-llwr4" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.639984 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q5xf\" (UniqueName: \"kubernetes.io/projected/84eeb352-63cc-4fbf-b328-0911c3f67abf-kube-api-access-2q5xf\") pod \"84eeb352-63cc-4fbf-b328-0911c3f67abf\" (UID: \"84eeb352-63cc-4fbf-b328-0911c3f67abf\") " Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.640330 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84eeb352-63cc-4fbf-b328-0911c3f67abf-operator-scripts\") pod \"84eeb352-63cc-4fbf-b328-0911c3f67abf\" (UID: \"84eeb352-63cc-4fbf-b328-0911c3f67abf\") " Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.640075 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mcpnx" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.641252 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84eeb352-63cc-4fbf-b328-0911c3f67abf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84eeb352-63cc-4fbf-b328-0911c3f67abf" (UID: "84eeb352-63cc-4fbf-b328-0911c3f67abf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.646089 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84eeb352-63cc-4fbf-b328-0911c3f67abf-kube-api-access-2q5xf" (OuterVolumeSpecName: "kube-api-access-2q5xf") pod "84eeb352-63cc-4fbf-b328-0911c3f67abf" (UID: "84eeb352-63cc-4fbf-b328-0911c3f67abf"). InnerVolumeSpecName "kube-api-access-2q5xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.647787 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-j2b4b" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.688434 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7650-account-create-update-qkm7g" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.706687 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e01d-account-create-update-jw9bz" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.714247 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6w6g9" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.742475 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tm6z\" (UniqueName: \"kubernetes.io/projected/b68d9420-28ed-4cf0-a8c7-774600d5436e-kube-api-access-8tm6z\") pod \"b68d9420-28ed-4cf0-a8c7-774600d5436e\" (UID: \"b68d9420-28ed-4cf0-a8c7-774600d5436e\") " Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.742600 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfxgr\" (UniqueName: \"kubernetes.io/projected/f5f0bf6d-ce1c-48c5-8866-a65a30022ca3-kube-api-access-nfxgr\") pod \"f5f0bf6d-ce1c-48c5-8866-a65a30022ca3\" (UID: \"f5f0bf6d-ce1c-48c5-8866-a65a30022ca3\") " Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.742631 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5f0bf6d-ce1c-48c5-8866-a65a30022ca3-operator-scripts\") pod \"f5f0bf6d-ce1c-48c5-8866-a65a30022ca3\" (UID: \"f5f0bf6d-ce1c-48c5-8866-a65a30022ca3\") " Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.742654 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2-operator-scripts\") pod \"46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2\" (UID: \"46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2\") " Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.742789 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b68d9420-28ed-4cf0-a8c7-774600d5436e-operator-scripts\") pod \"b68d9420-28ed-4cf0-a8c7-774600d5436e\" (UID: \"b68d9420-28ed-4cf0-a8c7-774600d5436e\") " Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.742883 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42r8s\" (UniqueName: \"kubernetes.io/projected/46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2-kube-api-access-42r8s\") pod \"46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2\" (UID: \"46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2\") " Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.743290 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84eeb352-63cc-4fbf-b328-0911c3f67abf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.743315 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q5xf\" (UniqueName: \"kubernetes.io/projected/84eeb352-63cc-4fbf-b328-0911c3f67abf-kube-api-access-2q5xf\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.743479 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2" (UID: "46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.743538 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b68d9420-28ed-4cf0-a8c7-774600d5436e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b68d9420-28ed-4cf0-a8c7-774600d5436e" (UID: "b68d9420-28ed-4cf0-a8c7-774600d5436e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.743764 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5f0bf6d-ce1c-48c5-8866-a65a30022ca3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f5f0bf6d-ce1c-48c5-8866-a65a30022ca3" (UID: "f5f0bf6d-ce1c-48c5-8866-a65a30022ca3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.746774 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2-kube-api-access-42r8s" (OuterVolumeSpecName: "kube-api-access-42r8s") pod "46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2" (UID: "46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2"). InnerVolumeSpecName "kube-api-access-42r8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.746907 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5f0bf6d-ce1c-48c5-8866-a65a30022ca3-kube-api-access-nfxgr" (OuterVolumeSpecName: "kube-api-access-nfxgr") pod "f5f0bf6d-ce1c-48c5-8866-a65a30022ca3" (UID: "f5f0bf6d-ce1c-48c5-8866-a65a30022ca3"). InnerVolumeSpecName "kube-api-access-nfxgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.747397 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b68d9420-28ed-4cf0-a8c7-774600d5436e-kube-api-access-8tm6z" (OuterVolumeSpecName: "kube-api-access-8tm6z") pod "b68d9420-28ed-4cf0-a8c7-774600d5436e" (UID: "b68d9420-28ed-4cf0-a8c7-774600d5436e"). InnerVolumeSpecName "kube-api-access-8tm6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.845289 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f2a59e8-2466-4c41-93c5-d88075dfbddf-operator-scripts\") pod \"7f2a59e8-2466-4c41-93c5-d88075dfbddf\" (UID: \"7f2a59e8-2466-4c41-93c5-d88075dfbddf\") " Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.845488 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9363606d-f1e1-4ba6-aca3-155bb6b57473-operator-scripts\") pod \"9363606d-f1e1-4ba6-aca3-155bb6b57473\" (UID: \"9363606d-f1e1-4ba6-aca3-155bb6b57473\") " Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.845760 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f2a59e8-2466-4c41-93c5-d88075dfbddf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f2a59e8-2466-4c41-93c5-d88075dfbddf" (UID: "7f2a59e8-2466-4c41-93c5-d88075dfbddf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.845824 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jjd4\" (UniqueName: \"kubernetes.io/projected/9363606d-f1e1-4ba6-aca3-155bb6b57473-kube-api-access-9jjd4\") pod \"9363606d-f1e1-4ba6-aca3-155bb6b57473\" (UID: \"9363606d-f1e1-4ba6-aca3-155bb6b57473\") " Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.845863 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz7cl\" (UniqueName: \"kubernetes.io/projected/7f2a59e8-2466-4c41-93c5-d88075dfbddf-kube-api-access-cz7cl\") pod \"7f2a59e8-2466-4c41-93c5-d88075dfbddf\" (UID: \"7f2a59e8-2466-4c41-93c5-d88075dfbddf\") " Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.846126 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9363606d-f1e1-4ba6-aca3-155bb6b57473-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9363606d-f1e1-4ba6-aca3-155bb6b57473" (UID: "9363606d-f1e1-4ba6-aca3-155bb6b57473"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.846421 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tm6z\" (UniqueName: \"kubernetes.io/projected/b68d9420-28ed-4cf0-a8c7-774600d5436e-kube-api-access-8tm6z\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.846451 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfxgr\" (UniqueName: \"kubernetes.io/projected/f5f0bf6d-ce1c-48c5-8866-a65a30022ca3-kube-api-access-nfxgr\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.846469 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5f0bf6d-ce1c-48c5-8866-a65a30022ca3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.846484 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.846496 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f2a59e8-2466-4c41-93c5-d88075dfbddf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.846509 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b68d9420-28ed-4cf0-a8c7-774600d5436e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.846521 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9363606d-f1e1-4ba6-aca3-155bb6b57473-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.846534 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42r8s\" (UniqueName: \"kubernetes.io/projected/46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2-kube-api-access-42r8s\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.849375 4747 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f2a59e8-2466-4c41-93c5-d88075dfbddf-kube-api-access-cz7cl" (OuterVolumeSpecName: "kube-api-access-cz7cl") pod "7f2a59e8-2466-4c41-93c5-d88075dfbddf" (UID: "7f2a59e8-2466-4c41-93c5-d88075dfbddf"). InnerVolumeSpecName "kube-api-access-cz7cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.849540 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9363606d-f1e1-4ba6-aca3-155bb6b57473-kube-api-access-9jjd4" (OuterVolumeSpecName: "kube-api-access-9jjd4") pod "9363606d-f1e1-4ba6-aca3-155bb6b57473" (UID: "9363606d-f1e1-4ba6-aca3-155bb6b57473"). InnerVolumeSpecName "kube-api-access-9jjd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.948802 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jjd4\" (UniqueName: \"kubernetes.io/projected/9363606d-f1e1-4ba6-aca3-155bb6b57473-kube-api-access-9jjd4\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:20 crc kubenswrapper[4747]: I1205 21:00:20.948842 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz7cl\" (UniqueName: \"kubernetes.io/projected/7f2a59e8-2466-4c41-93c5-d88075dfbddf-kube-api-access-cz7cl\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.112508 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7650-account-create-update-qkm7g" Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.112469 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7650-account-create-update-qkm7g" event={"ID":"f5f0bf6d-ce1c-48c5-8866-a65a30022ca3","Type":"ContainerDied","Data":"1536f63450ae4da5cf4d66ddff5bf21decd686364807593bb384957a0a5c9502"} Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.112755 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1536f63450ae4da5cf4d66ddff5bf21decd686364807593bb384957a0a5c9502" Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.114253 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6w6g9" event={"ID":"9363606d-f1e1-4ba6-aca3-155bb6b57473","Type":"ContainerDied","Data":"053129ccdce3b7c922362930d985a4abba4eee91cbb94bb80a139857fee1fec3"} Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.114276 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="053129ccdce3b7c922362930d985a4abba4eee91cbb94bb80a139857fee1fec3" Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.114278 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6w6g9" Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.116057 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mcpnx" event={"ID":"b68d9420-28ed-4cf0-a8c7-774600d5436e","Type":"ContainerDied","Data":"039ef65e61e7246d1f4960346b30f41f95293e3a0ad5955a6524ce635cddda98"} Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.116083 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="039ef65e61e7246d1f4960346b30f41f95293e3a0ad5955a6524ce635cddda98" Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.116111 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-mcpnx" Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.117443 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-j2b4b" Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.117515 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j2b4b" event={"ID":"46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2","Type":"ContainerDied","Data":"944cd2f6d197a485132b4ea6e7af1c6dac4b71d8e0b0b803a63fad34e04abb6f"} Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.117563 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="944cd2f6d197a485132b4ea6e7af1c6dac4b71d8e0b0b803a63fad34e04abb6f" Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.119366 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e01d-account-create-update-jw9bz" Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.119374 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e01d-account-create-update-jw9bz" event={"ID":"7f2a59e8-2466-4c41-93c5-d88075dfbddf","Type":"ContainerDied","Data":"861d5df6819e28e8834f8a2c19508f1baaf5973b994be45b6496544059217258"} Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.119425 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="861d5df6819e28e8834f8a2c19508f1baaf5973b994be45b6496544059217258" Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.120765 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8da4-account-create-update-llwr4" event={"ID":"84eeb352-63cc-4fbf-b328-0911c3f67abf","Type":"ContainerDied","Data":"71641f02a168f506aa989cc4808c6f3a83f772c69e0089b1ba57fc8c118c1820"} Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.120796 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71641f02a168f506aa989cc4808c6f3a83f772c69e0089b1ba57fc8c118c1820" Dec 05 21:00:21 crc kubenswrapper[4747]: I1205 21:00:21.120824 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8da4-account-create-update-llwr4" Dec 05 21:00:22 crc kubenswrapper[4747]: I1205 21:00:22.131107 4747 generic.go:334] "Generic (PLEG): container finished" podID="749c2c93-a078-47ab-b6f9-673cef723e20" containerID="3e2e501bf6db7407c56844788f8bbd62d9e8a7a9e7605f4be76cc81c55d45534" exitCode=0 Dec 05 21:00:22 crc kubenswrapper[4747]: I1205 21:00:22.131181 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2ns9l" event={"ID":"749c2c93-a078-47ab-b6f9-673cef723e20","Type":"ContainerDied","Data":"3e2e501bf6db7407c56844788f8bbd62d9e8a7a9e7605f4be76cc81c55d45534"} Dec 05 21:00:22 crc kubenswrapper[4747]: I1205 21:00:22.530553 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.448000 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2ns9l"
Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.495269 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-combined-ca-bundle\") pod \"749c2c93-a078-47ab-b6f9-673cef723e20\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") "
Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.495432 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-dispersionconf\") pod \"749c2c93-a078-47ab-b6f9-673cef723e20\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") "
Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.495476 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b67q\" (UniqueName: \"kubernetes.io/projected/749c2c93-a078-47ab-b6f9-673cef723e20-kube-api-access-7b67q\") pod \"749c2c93-a078-47ab-b6f9-673cef723e20\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") "
Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.495547 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/749c2c93-a078-47ab-b6f9-673cef723e20-etc-swift\") pod \"749c2c93-a078-47ab-b6f9-673cef723e20\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") "
Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.495637 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/749c2c93-a078-47ab-b6f9-673cef723e20-ring-data-devices\") pod \"749c2c93-a078-47ab-b6f9-673cef723e20\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") "
Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.495717 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/749c2c93-a078-47ab-b6f9-673cef723e20-scripts\") pod \"749c2c93-a078-47ab-b6f9-673cef723e20\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") "
Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.495748 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-swiftconf\") pod \"749c2c93-a078-47ab-b6f9-673cef723e20\" (UID: \"749c2c93-a078-47ab-b6f9-673cef723e20\") "
Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.496426 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/749c2c93-a078-47ab-b6f9-673cef723e20-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "749c2c93-a078-47ab-b6f9-673cef723e20" (UID: "749c2c93-a078-47ab-b6f9-673cef723e20"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.496804 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/749c2c93-a078-47ab-b6f9-673cef723e20-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "749c2c93-a078-47ab-b6f9-673cef723e20" (UID: "749c2c93-a078-47ab-b6f9-673cef723e20"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
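These "UnmountVolume started" / "TearDown succeeded" entries (continuing below, each later confirmed by a "Volume detached" line) are the volume manager reconciling actual state against desired state once the swift-ring-rebalance pod has exited: anything still mounted that no pod wants any more gets torn down, then marked detached. A condensed sketch under hypothetical names:

```go
package main

import "fmt"

// reconcile unmounts every volume that is still mounted (actual state)
// but no longer wanted by any pod (desired state).
func reconcile(desired map[string]bool, mounted []string, tearDown func(string) error) {
	for _, vol := range mounted {
		if desired[vol] {
			continue // still in use by some pod
		}
		fmt.Printf("operationExecutor.UnmountVolume started for volume %q\n", vol)
		if err := tearDown(vol); err != nil {
			fmt.Printf("UnmountVolume.TearDown failed for %q: %v\n", vol, err)
			continue // retried on the next reconciliation pass
		}
		fmt.Printf("Volume detached for volume %q\n", vol)
	}
}

func main() {
	mounted := []string{"etc-swift", "ring-data-devices", "scripts", "swiftconf"}
	reconcile(map[string]bool{}, mounted, func(string) error { return nil })
}
```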
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.503243 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/749c2c93-a078-47ab-b6f9-673cef723e20-kube-api-access-7b67q" (OuterVolumeSpecName: "kube-api-access-7b67q") pod "749c2c93-a078-47ab-b6f9-673cef723e20" (UID: "749c2c93-a078-47ab-b6f9-673cef723e20"). InnerVolumeSpecName "kube-api-access-7b67q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.506232 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "749c2c93-a078-47ab-b6f9-673cef723e20" (UID: "749c2c93-a078-47ab-b6f9-673cef723e20"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.519301 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/749c2c93-a078-47ab-b6f9-673cef723e20-scripts" (OuterVolumeSpecName: "scripts") pod "749c2c93-a078-47ab-b6f9-673cef723e20" (UID: "749c2c93-a078-47ab-b6f9-673cef723e20"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.521044 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "749c2c93-a078-47ab-b6f9-673cef723e20" (UID: "749c2c93-a078-47ab-b6f9-673cef723e20"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.521531 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "749c2c93-a078-47ab-b6f9-673cef723e20" (UID: "749c2c93-a078-47ab-b6f9-673cef723e20"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.597803 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.597899 4747 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.597918 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b67q\" (UniqueName: \"kubernetes.io/projected/749c2c93-a078-47ab-b6f9-673cef723e20-kube-api-access-7b67q\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.597937 4747 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/749c2c93-a078-47ab-b6f9-673cef723e20-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.597953 4747 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/749c2c93-a078-47ab-b6f9-673cef723e20-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.597970 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/749c2c93-a078-47ab-b6f9-673cef723e20-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:23 crc kubenswrapper[4747]: I1205 21:00:23.597986 4747 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/749c2c93-a078-47ab-b6f9-673cef723e20-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:24 crc kubenswrapper[4747]: I1205 21:00:24.151826 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2ns9l" event={"ID":"749c2c93-a078-47ab-b6f9-673cef723e20","Type":"ContainerDied","Data":"9cc1e5f653402fb4251e26fb2e6845b4bf02bba118648f68a6000bc1fbab6852"} Dec 05 21:00:24 crc kubenswrapper[4747]: I1205 21:00:24.152084 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cc1e5f653402fb4251e26fb2e6845b4bf02bba118648f68a6000bc1fbab6852" Dec 05 21:00:24 crc kubenswrapper[4747]: I1205 21:00:24.151986 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2ns9l" Dec 05 21:00:24 crc kubenswrapper[4747]: I1205 21:00:24.512448 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:24 crc kubenswrapper[4747]: I1205 21:00:24.523658 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift\") pod \"swift-storage-0\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " pod="openstack/swift-storage-0" Dec 05 21:00:24 crc kubenswrapper[4747]: I1205 21:00:24.598867 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0"
Dec 05 21:00:25 crc kubenswrapper[4747]: I1205 21:00:25.186734 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.167274 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"d5868a55bac4c9ebdd99869e4b36d69926980b1cb161056b6e40c4973fc1bb3e"}
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.477494 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-5jxck"]
Dec 05 21:00:26 crc kubenswrapper[4747]: E1205 21:00:26.477876 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2a59e8-2466-4c41-93c5-d88075dfbddf" containerName="mariadb-account-create-update"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.477897 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2a59e8-2466-4c41-93c5-d88075dfbddf" containerName="mariadb-account-create-update"
Dec 05 21:00:26 crc kubenswrapper[4747]: E1205 21:00:26.477929 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d24deb-7268-4bb5-818e-8cf514cbe5b1" containerName="init"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.477937 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d24deb-7268-4bb5-818e-8cf514cbe5b1" containerName="init"
Dec 05 21:00:26 crc kubenswrapper[4747]: E1205 21:00:26.477953 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d24deb-7268-4bb5-818e-8cf514cbe5b1" containerName="dnsmasq-dns"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.477959 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d24deb-7268-4bb5-818e-8cf514cbe5b1" containerName="dnsmasq-dns"
Dec 05 21:00:26 crc kubenswrapper[4747]: E1205 21:00:26.477965 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84eeb352-63cc-4fbf-b328-0911c3f67abf" containerName="mariadb-account-create-update"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.477972 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="84eeb352-63cc-4fbf-b328-0911c3f67abf" containerName="mariadb-account-create-update"
Dec 05 21:00:26 crc kubenswrapper[4747]: E1205 21:00:26.477987 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5f0bf6d-ce1c-48c5-8866-a65a30022ca3" containerName="mariadb-account-create-update"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.477993 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f0bf6d-ce1c-48c5-8866-a65a30022ca3" containerName="mariadb-account-create-update"
Dec 05 21:00:26 crc kubenswrapper[4747]: E1205 21:00:26.478005 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b68d9420-28ed-4cf0-a8c7-774600d5436e" containerName="mariadb-database-create"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.478010 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b68d9420-28ed-4cf0-a8c7-774600d5436e" containerName="mariadb-database-create"
Dec 05 21:00:26 crc kubenswrapper[4747]: E1205 21:00:26.478020 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="749c2c93-a078-47ab-b6f9-673cef723e20" containerName="swift-ring-rebalance"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.478026 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="749c2c93-a078-47ab-b6f9-673cef723e20" containerName="swift-ring-rebalance"
Dec 05 21:00:26 crc kubenswrapper[4747]: E1205 21:00:26.478039 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9363606d-f1e1-4ba6-aca3-155bb6b57473" containerName="mariadb-database-create"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.478045 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9363606d-f1e1-4ba6-aca3-155bb6b57473" containerName="mariadb-database-create"
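The RemoveStaleState burst above (and continuing below) is triggered by admitting the new glance-db-sync-5jxck pod: before handing out resources, the CPU and memory managers purge pinning state recorded for containers whose pods no longer exist, so the newcomer cannot inherit stale assignments. A sketch of the purge with illustrative types:

```go
package main

import "fmt"

type key struct{ podUID, container string }

// removeStaleState drops assignments whose owning pod is no longer active,
// in the spirit of the cpu_manager.go / memory_manager.go entries above.
func removeStaleState(assignments map[key]string, activePods map[string]bool) {
	for k := range assignments {
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container pod=%s name=%s\n", k.podUID, k.container)
			delete(assignments, k) // deleting during range is safe in Go
		}
	}
}

func main() {
	assignments := map[key]string{
		{"54d24deb", "dnsmasq-dns"}:    "cpus 0-1",
		{"7264db97", "glance-db-sync"}: "cpus 2-3",
	}
	removeStaleState(assignments, map[string]bool{"7264db97": true})
	fmt.Println("remaining assignments:", len(assignments))
}
```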
Dec 05 21:00:26 crc kubenswrapper[4747]: E1205 21:00:26.478059 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2" containerName="mariadb-database-create"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.478064 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2" containerName="mariadb-database-create"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.478209 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2" containerName="mariadb-database-create"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.478218 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b68d9420-28ed-4cf0-a8c7-774600d5436e" containerName="mariadb-database-create"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.478230 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="749c2c93-a078-47ab-b6f9-673cef723e20" containerName="swift-ring-rebalance"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.478236 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="84eeb352-63cc-4fbf-b328-0911c3f67abf" containerName="mariadb-account-create-update"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.478244 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f2a59e8-2466-4c41-93c5-d88075dfbddf" containerName="mariadb-account-create-update"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.478252 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9363606d-f1e1-4ba6-aca3-155bb6b57473" containerName="mariadb-database-create"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.478261 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5f0bf6d-ce1c-48c5-8866-a65a30022ca3" containerName="mariadb-account-create-update"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.478272 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d24deb-7268-4bb5-818e-8cf514cbe5b1" containerName="dnsmasq-dns"
Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.478811 4747 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-db-sync-5jxck" Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.481670 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tcfb4" Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.481737 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.492083 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-5jxck"] Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.568574 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-db-sync-config-data\") pod \"glance-db-sync-5jxck\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " pod="openstack/glance-db-sync-5jxck" Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.568774 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-config-data\") pod \"glance-db-sync-5jxck\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " pod="openstack/glance-db-sync-5jxck" Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.568924 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-combined-ca-bundle\") pod \"glance-db-sync-5jxck\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " pod="openstack/glance-db-sync-5jxck" Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.569019 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xhhl\" (UniqueName: \"kubernetes.io/projected/7264db97-0a43-4984-94a2-1805f3aec313-kube-api-access-5xhhl\") pod \"glance-db-sync-5jxck\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " pod="openstack/glance-db-sync-5jxck" Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.670288 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-db-sync-config-data\") pod \"glance-db-sync-5jxck\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " pod="openstack/glance-db-sync-5jxck" Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.670359 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-config-data\") pod \"glance-db-sync-5jxck\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " pod="openstack/glance-db-sync-5jxck" Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.670413 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-combined-ca-bundle\") pod \"glance-db-sync-5jxck\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " pod="openstack/glance-db-sync-5jxck" Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.670447 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xhhl\" (UniqueName: \"kubernetes.io/projected/7264db97-0a43-4984-94a2-1805f3aec313-kube-api-access-5xhhl\") pod 
\"glance-db-sync-5jxck\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " pod="openstack/glance-db-sync-5jxck" Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.682312 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-db-sync-config-data\") pod \"glance-db-sync-5jxck\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " pod="openstack/glance-db-sync-5jxck" Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.682321 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-config-data\") pod \"glance-db-sync-5jxck\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " pod="openstack/glance-db-sync-5jxck" Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.682360 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-combined-ca-bundle\") pod \"glance-db-sync-5jxck\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " pod="openstack/glance-db-sync-5jxck" Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.690852 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xhhl\" (UniqueName: \"kubernetes.io/projected/7264db97-0a43-4984-94a2-1805f3aec313-kube-api-access-5xhhl\") pod \"glance-db-sync-5jxck\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " pod="openstack/glance-db-sync-5jxck" Dec 05 21:00:26 crc kubenswrapper[4747]: I1205 21:00:26.796638 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-5jxck" Dec 05 21:00:27 crc kubenswrapper[4747]: I1205 21:00:27.175045 4747 generic.go:334] "Generic (PLEG): container finished" podID="70db507e-cc84-4722-8ac8-fd659c2803b8" containerID="c547e1ccdcbdea64d31b3fa0d8e66a77f1b2d50aee1f27b3167cf9f22a5e018f" exitCode=0 Dec 05 21:00:27 crc kubenswrapper[4747]: I1205 21:00:27.175136 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70db507e-cc84-4722-8ac8-fd659c2803b8","Type":"ContainerDied","Data":"c547e1ccdcbdea64d31b3fa0d8e66a77f1b2d50aee1f27b3167cf9f22a5e018f"} Dec 05 21:00:27 crc kubenswrapper[4747]: I1205 21:00:27.177147 4747 generic.go:334] "Generic (PLEG): container finished" podID="d49bd09d-90af-4f00-a333-0e292c581525" containerID="2a95ab7afe7b5d3fb966c635f9f0ead6c5e7cacb8f8ffa5421f5a4bb80ec9ec3" exitCode=0 Dec 05 21:00:27 crc kubenswrapper[4747]: I1205 21:00:27.177215 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d49bd09d-90af-4f00-a333-0e292c581525","Type":"ContainerDied","Data":"2a95ab7afe7b5d3fb966c635f9f0ead6c5e7cacb8f8ffa5421f5a4bb80ec9ec3"} Dec 05 21:00:27 crc kubenswrapper[4747]: I1205 21:00:27.187149 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"dc4ee1b50dd1745ed998bec26be98d427e85f0f14c52da0a5a910ef4a7470be0"} Dec 05 21:00:27 crc kubenswrapper[4747]: I1205 21:00:27.420518 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-5jxck"] Dec 05 21:00:27 crc kubenswrapper[4747]: W1205 21:00:27.425236 4747 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7264db97_0a43_4984_94a2_1805f3aec313.slice/crio-282b8e91c40bc1391af70c35b22b340d649344744cc49a3a75ba69e55a8a03b5 WatchSource:0}: Error finding container 282b8e91c40bc1391af70c35b22b340d649344744cc49a3a75ba69e55a8a03b5: Status 404 returned error can't find the container with id 282b8e91c40bc1391af70c35b22b340d649344744cc49a3a75ba69e55a8a03b5
Dec 05 21:00:28 crc kubenswrapper[4747]: I1205 21:00:28.196880 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d49bd09d-90af-4f00-a333-0e292c581525","Type":"ContainerStarted","Data":"a1384543807536a88f9af6a29ae8e6d52ea9ff61702daf36b45ab2099a6bce9c"}
Dec 05 21:00:28 crc kubenswrapper[4747]: I1205 21:00:28.197896 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Dec 05 21:00:28 crc kubenswrapper[4747]: I1205 21:00:28.199227 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5jxck" event={"ID":"7264db97-0a43-4984-94a2-1805f3aec313","Type":"ContainerStarted","Data":"282b8e91c40bc1391af70c35b22b340d649344744cc49a3a75ba69e55a8a03b5"}
Dec 05 21:00:28 crc kubenswrapper[4747]: I1205 21:00:28.201882 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"e06576b6cbbf857bd731def8848ccf981cbbccf87721760f51553e2b820943bf"}
Dec 05 21:00:28 crc kubenswrapper[4747]: I1205 21:00:28.201935 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"b1cda5f48990643299a63d3ae2da74d21f7ef5b4082ffa29af733c5ee782a4e4"}
Dec 05 21:00:28 crc kubenswrapper[4747]: I1205 21:00:28.201968 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"a606768279b9932799614e2c0267efce661411007308c7fc421cef56fc7ab98d"}
Dec 05 21:00:28 crc kubenswrapper[4747]: I1205 21:00:28.204076 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70db507e-cc84-4722-8ac8-fd659c2803b8","Type":"ContainerStarted","Data":"591c966cbd67e65bbd7f9df16799bddb9e1c1aeba658fa805c2349bf30d4dd79"}
Dec 05 21:00:28 crc kubenswrapper[4747]: I1205 21:00:28.204349 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Dec 05 21:00:28 crc kubenswrapper[4747]: I1205 21:00:28.226164 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.484054378 podStartE2EDuration="57.226142115s" podCreationTimestamp="2025-12-05 20:59:31 +0000 UTC" firstStartedPulling="2025-12-05 20:59:33.583419624 +0000 UTC m=+1044.050727112" lastFinishedPulling="2025-12-05 20:59:52.325507361 +0000 UTC m=+1062.792814849" observedRunningTime="2025-12-05 21:00:28.222487515 +0000 UTC m=+1098.689795003" watchObservedRunningTime="2025-12-05 21:00:28.226142115 +0000 UTC m=+1098.693449603"
Dec 05 21:00:28 crc kubenswrapper[4747]: I1205 21:00:28.255127 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.274648776 podStartE2EDuration="57.255100015s" podCreationTimestamp="2025-12-05 20:59:31 +0000 UTC" firstStartedPulling="2025-12-05 20:59:33.333064324 +0000 UTC m=+1043.800371812" lastFinishedPulling="2025-12-05 20:59:52.313515563 +0000 UTC m=+1062.780823051" observedRunningTime="2025-12-05 21:00:28.245296771 +0000 UTC m=+1098.712604269" watchObservedRunningTime="2025-12-05 21:00:28.255100015 +0000 UTC m=+1098.722407543"
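The two "Observed pod startup duration" entries above are internally consistent: podStartSLOduration is the end-to-end duration minus the image-pull window, i.e. (observedRunningTime - podCreationTimestamp) - (lastFinishedPulling - firstStartedPulling). For rabbitmq-server-0 that is 57.226142115s - 18.742087737s = 38.484054378s, exactly as logged; rabbitmq-cell1-server-0 works out the same way (57.255100015s - 18.980451239s = 38.274648776s). The arithmetic, using the monotonic m=+ offsets copied from the entries above:

```go
package main

import "fmt"

func main() {
	// monotonic m=+ offsets and E2E duration from the rabbitmq-server-0 entry
	firstStartedPulling := 1044.050727112
	lastFinishedPulling := 1062.792814849
	podStartE2E := 57.226142115

	pull := lastFinishedPulling - firstStartedPulling
	slo := podStartE2E - pull
	fmt.Printf("image pull: %.9fs, podStartSLOduration: %.9fs\n", pull, slo)
	// image pull: 18.742087737s, podStartSLOduration: 38.484054378s
}
```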
m=+1043.800371812" lastFinishedPulling="2025-12-05 20:59:52.313515563 +0000 UTC m=+1062.780823051" observedRunningTime="2025-12-05 21:00:28.245296771 +0000 UTC m=+1098.712604269" watchObservedRunningTime="2025-12-05 21:00:28.255100015 +0000 UTC m=+1098.722407543" Dec 05 21:00:31 crc kubenswrapper[4747]: I1205 21:00:31.158903 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ns2k6" podUID="c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" containerName="ovn-controller" probeResult="failure" output=< Dec 05 21:00:31 crc kubenswrapper[4747]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 21:00:31 crc kubenswrapper[4747]: > Dec 05 21:00:31 crc kubenswrapper[4747]: I1205 21:00:31.159268 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 21:00:32 crc kubenswrapper[4747]: I1205 21:00:32.243880 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"d7cf0106124fcf9a17a54c633a63891b2f79f3e809e2de080af793585eed32c2"} Dec 05 21:00:32 crc kubenswrapper[4747]: I1205 21:00:32.244308 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"a9b9bd53a229a7fb0fd49713a723ca5f20b761cdb3553b0052642a04bcd05a2b"} Dec 05 21:00:33 crc kubenswrapper[4747]: I1205 21:00:33.254790 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"9d583a1e65e3cc8bb6a32ad2fd1c4bf96b0bd47ed6506d82064a31bc2655c7b0"} Dec 05 21:00:33 crc kubenswrapper[4747]: I1205 21:00:33.255326 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"50395557a436afbba90e0822ef360ef212729acd9b1d1016e656ed537b583273"} Dec 05 21:00:33 crc kubenswrapper[4747]: I1205 21:00:33.945902 4747 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","poda040d587-9981-428f-baed-4ad3d3b2ee55"] err="unable to destroy cgroup paths for cgroup [kubepods burstable poda040d587-9981-428f-baed-4ad3d3b2ee55] : Timed out while waiting for systemd to remove kubepods-burstable-poda040d587_9981_428f_baed_4ad3d3b2ee55.slice" Dec 05 21:00:33 crc kubenswrapper[4747]: E1205 21:00:33.945976 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable poda040d587-9981-428f-baed-4ad3d3b2ee55] : unable to destroy cgroup paths for cgroup [kubepods burstable poda040d587-9981-428f-baed-4ad3d3b2ee55] : Timed out while waiting for systemd to remove kubepods-burstable-poda040d587_9981_428f_baed_4ad3d3b2ee55.slice" pod="openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4" podUID="a040d587-9981-428f-baed-4ad3d3b2ee55" Dec 05 21:00:34 crc kubenswrapper[4747]: I1205 21:00:34.261753 4747 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.162188 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4d2dp"
Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.171575 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ns2k6" podUID="c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" containerName="ovn-controller" probeResult="failure" output=<
Dec 05 21:00:36 crc kubenswrapper[4747]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Dec 05 21:00:36 crc kubenswrapper[4747]: >
Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.222372 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.222432 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.381438 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ns2k6-config-jh272"]
Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.383648 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ns2k6-config-jh272"
Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.386789 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.404906 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ns2k6-config-jh272"]
Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.549603 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-log-ovn\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272"
Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.549646 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z4j7\" (UniqueName: \"kubernetes.io/projected/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-kube-api-access-6z4j7\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272"
Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.549696 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-run-ovn\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272"
Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.549719 4747 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-run\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.549775 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-additional-scripts\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.549793 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-scripts\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.651603 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-run\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.651729 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-additional-scripts\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.651762 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-scripts\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.651859 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-log-ovn\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.651889 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z4j7\" (UniqueName: \"kubernetes.io/projected/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-kube-api-access-6z4j7\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.651925 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-run-ovn\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:36 crc 
kubenswrapper[4747]: I1205 21:00:36.652072 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-run\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.652093 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-log-ovn\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.652121 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-run-ovn\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.652568 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-additional-scripts\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.653909 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-scripts\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.670941 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z4j7\" (UniqueName: \"kubernetes.io/projected/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-kube-api-access-6z4j7\") pod \"ovn-controller-ns2k6-config-jh272\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:36 crc kubenswrapper[4747]: I1205 21:00:36.704751 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:41 crc kubenswrapper[4747]: I1205 21:00:41.164263 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ns2k6" podUID="c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" containerName="ovn-controller" probeResult="failure" output=< Dec 05 21:00:41 crc kubenswrapper[4747]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 21:00:41 crc kubenswrapper[4747]: > Dec 05 21:00:42 crc kubenswrapper[4747]: I1205 21:00:42.699766 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 21:00:43 crc kubenswrapper[4747]: I1205 21:00:43.025233 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.454377 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-k58mr"] Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.456746 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-k58mr" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.467130 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-k58mr"] Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.557964 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-sx5d9"] Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.559322 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sx5d9" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.571941 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-067d-account-create-update-jhm5h"] Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.572936 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-067d-account-create-update-jhm5h" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.574760 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.582620 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sx5d9"] Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.626277 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/281b36e4-63e9-473f-a0bf-0ad62a307577-operator-scripts\") pod \"cinder-db-create-k58mr\" (UID: \"281b36e4-63e9-473f-a0bf-0ad62a307577\") " pod="openstack/cinder-db-create-k58mr" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.626361 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzlxt\" (UniqueName: \"kubernetes.io/projected/281b36e4-63e9-473f-a0bf-0ad62a307577-kube-api-access-rzlxt\") pod \"cinder-db-create-k58mr\" (UID: \"281b36e4-63e9-473f-a0bf-0ad62a307577\") " pod="openstack/cinder-db-create-k58mr" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.630532 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-067d-account-create-update-jhm5h"] Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.671544 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-cd71-account-create-update-f4fch"] Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.673228 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-cd71-account-create-update-f4fch" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.675558 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.688725 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cd71-account-create-update-f4fch"] Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.731749 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzlxt\" (UniqueName: \"kubernetes.io/projected/281b36e4-63e9-473f-a0bf-0ad62a307577-kube-api-access-rzlxt\") pod \"cinder-db-create-k58mr\" (UID: \"281b36e4-63e9-473f-a0bf-0ad62a307577\") " pod="openstack/cinder-db-create-k58mr" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.732156 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2htk6\" (UniqueName: \"kubernetes.io/projected/c0ac9306-6af1-4745-ba77-046356cdd7fd-kube-api-access-2htk6\") pod \"cinder-067d-account-create-update-jhm5h\" (UID: \"c0ac9306-6af1-4745-ba77-046356cdd7fd\") " pod="openstack/cinder-067d-account-create-update-jhm5h" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.732269 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ac9306-6af1-4745-ba77-046356cdd7fd-operator-scripts\") pod \"cinder-067d-account-create-update-jhm5h\" (UID: \"c0ac9306-6af1-4745-ba77-046356cdd7fd\") " pod="openstack/cinder-067d-account-create-update-jhm5h" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.754240 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx6xg\" (UniqueName: \"kubernetes.io/projected/4cb214ab-734f-4671-9959-6f424a9f6ade-kube-api-access-xx6xg\") pod \"barbican-db-create-sx5d9\" (UID: \"4cb214ab-734f-4671-9959-6f424a9f6ade\") " pod="openstack/barbican-db-create-sx5d9" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.754372 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/281b36e4-63e9-473f-a0bf-0ad62a307577-operator-scripts\") pod \"cinder-db-create-k58mr\" (UID: \"281b36e4-63e9-473f-a0bf-0ad62a307577\") " pod="openstack/cinder-db-create-k58mr" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.754404 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb214ab-734f-4671-9959-6f424a9f6ade-operator-scripts\") pod \"barbican-db-create-sx5d9\" (UID: \"4cb214ab-734f-4671-9959-6f424a9f6ade\") " pod="openstack/barbican-db-create-sx5d9" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.755463 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/281b36e4-63e9-473f-a0bf-0ad62a307577-operator-scripts\") pod \"cinder-db-create-k58mr\" (UID: \"281b36e4-63e9-473f-a0bf-0ad62a307577\") " pod="openstack/cinder-db-create-k58mr" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.783707 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzlxt\" (UniqueName: \"kubernetes.io/projected/281b36e4-63e9-473f-a0bf-0ad62a307577-kube-api-access-rzlxt\") pod 
\"cinder-db-create-k58mr\" (UID: \"281b36e4-63e9-473f-a0bf-0ad62a307577\") " pod="openstack/cinder-db-create-k58mr" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.784370 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-rhvjl"] Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.785458 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rhvjl" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.796002 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rhvjl"] Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.828701 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-tj4z5"] Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.829994 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tj4z5" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.831890 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.832076 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-75t4l" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.832248 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.832675 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.836372 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-tj4z5"] Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.856200 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2htk6\" (UniqueName: \"kubernetes.io/projected/c0ac9306-6af1-4745-ba77-046356cdd7fd-kube-api-access-2htk6\") pod \"cinder-067d-account-create-update-jhm5h\" (UID: \"c0ac9306-6af1-4745-ba77-046356cdd7fd\") " pod="openstack/cinder-067d-account-create-update-jhm5h" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.856250 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5da23b71-6d79-4345-87e9-3528332fe50d-operator-scripts\") pod \"barbican-cd71-account-create-update-f4fch\" (UID: \"5da23b71-6d79-4345-87e9-3528332fe50d\") " pod="openstack/barbican-cd71-account-create-update-f4fch" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.856297 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ac9306-6af1-4745-ba77-046356cdd7fd-operator-scripts\") pod \"cinder-067d-account-create-update-jhm5h\" (UID: \"c0ac9306-6af1-4745-ba77-046356cdd7fd\") " pod="openstack/cinder-067d-account-create-update-jhm5h" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.856360 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx6xg\" (UniqueName: \"kubernetes.io/projected/4cb214ab-734f-4671-9959-6f424a9f6ade-kube-api-access-xx6xg\") pod \"barbican-db-create-sx5d9\" (UID: \"4cb214ab-734f-4671-9959-6f424a9f6ade\") " pod="openstack/barbican-db-create-sx5d9" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.856456 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb214ab-734f-4671-9959-6f424a9f6ade-operator-scripts\") pod \"barbican-db-create-sx5d9\" (UID: \"4cb214ab-734f-4671-9959-6f424a9f6ade\") " pod="openstack/barbican-db-create-sx5d9" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.856512 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcwlf\" (UniqueName: \"kubernetes.io/projected/5da23b71-6d79-4345-87e9-3528332fe50d-kube-api-access-pcwlf\") pod \"barbican-cd71-account-create-update-f4fch\" (UID: \"5da23b71-6d79-4345-87e9-3528332fe50d\") " pod="openstack/barbican-cd71-account-create-update-f4fch" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.857450 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ac9306-6af1-4745-ba77-046356cdd7fd-operator-scripts\") pod \"cinder-067d-account-create-update-jhm5h\" (UID: \"c0ac9306-6af1-4745-ba77-046356cdd7fd\") " pod="openstack/cinder-067d-account-create-update-jhm5h" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.857864 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb214ab-734f-4671-9959-6f424a9f6ade-operator-scripts\") pod \"barbican-db-create-sx5d9\" (UID: \"4cb214ab-734f-4671-9959-6f424a9f6ade\") " pod="openstack/barbican-db-create-sx5d9" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.874404 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2htk6\" (UniqueName: \"kubernetes.io/projected/c0ac9306-6af1-4745-ba77-046356cdd7fd-kube-api-access-2htk6\") pod \"cinder-067d-account-create-update-jhm5h\" (UID: \"c0ac9306-6af1-4745-ba77-046356cdd7fd\") " pod="openstack/cinder-067d-account-create-update-jhm5h" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.876739 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx6xg\" (UniqueName: \"kubernetes.io/projected/4cb214ab-734f-4671-9959-6f424a9f6ade-kube-api-access-xx6xg\") pod \"barbican-db-create-sx5d9\" (UID: \"4cb214ab-734f-4671-9959-6f424a9f6ade\") " pod="openstack/barbican-db-create-sx5d9" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.887634 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-067d-account-create-update-jhm5h" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.957683 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5da23b71-6d79-4345-87e9-3528332fe50d-operator-scripts\") pod \"barbican-cd71-account-create-update-f4fch\" (UID: \"5da23b71-6d79-4345-87e9-3528332fe50d\") " pod="openstack/barbican-cd71-account-create-update-f4fch" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.957794 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/996767c6-2a35-4b93-a779-8803b52eff5c-config-data\") pod \"keystone-db-sync-tj4z5\" (UID: \"996767c6-2a35-4b93-a779-8803b52eff5c\") " pod="openstack/keystone-db-sync-tj4z5" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.957825 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4krb\" (UniqueName: \"kubernetes.io/projected/bafb6071-80cb-4a67-bb33-bc45f069937c-kube-api-access-g4krb\") pod \"neutron-db-create-rhvjl\" (UID: \"bafb6071-80cb-4a67-bb33-bc45f069937c\") " pod="openstack/neutron-db-create-rhvjl" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.957880 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgxp9\" (UniqueName: \"kubernetes.io/projected/996767c6-2a35-4b93-a779-8803b52eff5c-kube-api-access-pgxp9\") pod \"keystone-db-sync-tj4z5\" (UID: \"996767c6-2a35-4b93-a779-8803b52eff5c\") " pod="openstack/keystone-db-sync-tj4z5" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.957908 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcwlf\" (UniqueName: \"kubernetes.io/projected/5da23b71-6d79-4345-87e9-3528332fe50d-kube-api-access-pcwlf\") pod \"barbican-cd71-account-create-update-f4fch\" (UID: \"5da23b71-6d79-4345-87e9-3528332fe50d\") " pod="openstack/barbican-cd71-account-create-update-f4fch" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.957941 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafb6071-80cb-4a67-bb33-bc45f069937c-operator-scripts\") pod \"neutron-db-create-rhvjl\" (UID: \"bafb6071-80cb-4a67-bb33-bc45f069937c\") " pod="openstack/neutron-db-create-rhvjl" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.957970 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/996767c6-2a35-4b93-a779-8803b52eff5c-combined-ca-bundle\") pod \"keystone-db-sync-tj4z5\" (UID: \"996767c6-2a35-4b93-a779-8803b52eff5c\") " pod="openstack/keystone-db-sync-tj4z5" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.958509 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5da23b71-6d79-4345-87e9-3528332fe50d-operator-scripts\") pod \"barbican-cd71-account-create-update-f4fch\" (UID: \"5da23b71-6d79-4345-87e9-3528332fe50d\") " pod="openstack/barbican-cd71-account-create-update-f4fch" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.981294 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7680-account-create-update-2vfwq"] Dec 05 21:00:44 crc 
kubenswrapper[4747]: I1205 21:00:44.982720 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7680-account-create-update-2vfwq" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.984030 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcwlf\" (UniqueName: \"kubernetes.io/projected/5da23b71-6d79-4345-87e9-3528332fe50d-kube-api-access-pcwlf\") pod \"barbican-cd71-account-create-update-f4fch\" (UID: \"5da23b71-6d79-4345-87e9-3528332fe50d\") " pod="openstack/barbican-cd71-account-create-update-f4fch" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.984874 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 05 21:00:44 crc kubenswrapper[4747]: I1205 21:00:44.999607 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7680-account-create-update-2vfwq"] Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.059174 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/996767c6-2a35-4b93-a779-8803b52eff5c-config-data\") pod \"keystone-db-sync-tj4z5\" (UID: \"996767c6-2a35-4b93-a779-8803b52eff5c\") " pod="openstack/keystone-db-sync-tj4z5" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.059217 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4krb\" (UniqueName: \"kubernetes.io/projected/bafb6071-80cb-4a67-bb33-bc45f069937c-kube-api-access-g4krb\") pod \"neutron-db-create-rhvjl\" (UID: \"bafb6071-80cb-4a67-bb33-bc45f069937c\") " pod="openstack/neutron-db-create-rhvjl" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.059261 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgxp9\" (UniqueName: \"kubernetes.io/projected/996767c6-2a35-4b93-a779-8803b52eff5c-kube-api-access-pgxp9\") pod \"keystone-db-sync-tj4z5\" (UID: \"996767c6-2a35-4b93-a779-8803b52eff5c\") " pod="openstack/keystone-db-sync-tj4z5" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.059284 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafb6071-80cb-4a67-bb33-bc45f069937c-operator-scripts\") pod \"neutron-db-create-rhvjl\" (UID: \"bafb6071-80cb-4a67-bb33-bc45f069937c\") " pod="openstack/neutron-db-create-rhvjl" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.059305 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/996767c6-2a35-4b93-a779-8803b52eff5c-combined-ca-bundle\") pod \"keystone-db-sync-tj4z5\" (UID: \"996767c6-2a35-4b93-a779-8803b52eff5c\") " pod="openstack/keystone-db-sync-tj4z5" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.060259 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafb6071-80cb-4a67-bb33-bc45f069937c-operator-scripts\") pod \"neutron-db-create-rhvjl\" (UID: \"bafb6071-80cb-4a67-bb33-bc45f069937c\") " pod="openstack/neutron-db-create-rhvjl" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.076147 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-cd71-account-create-update-f4fch" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.076359 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/996767c6-2a35-4b93-a779-8803b52eff5c-combined-ca-bundle\") pod \"keystone-db-sync-tj4z5\" (UID: \"996767c6-2a35-4b93-a779-8803b52eff5c\") " pod="openstack/keystone-db-sync-tj4z5" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.077159 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k58mr" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.080111 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgxp9\" (UniqueName: \"kubernetes.io/projected/996767c6-2a35-4b93-a779-8803b52eff5c-kube-api-access-pgxp9\") pod \"keystone-db-sync-tj4z5\" (UID: \"996767c6-2a35-4b93-a779-8803b52eff5c\") " pod="openstack/keystone-db-sync-tj4z5" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.081122 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/996767c6-2a35-4b93-a779-8803b52eff5c-config-data\") pod \"keystone-db-sync-tj4z5\" (UID: \"996767c6-2a35-4b93-a779-8803b52eff5c\") " pod="openstack/keystone-db-sync-tj4z5" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.084927 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4krb\" (UniqueName: \"kubernetes.io/projected/bafb6071-80cb-4a67-bb33-bc45f069937c-kube-api-access-g4krb\") pod \"neutron-db-create-rhvjl\" (UID: \"bafb6071-80cb-4a67-bb33-bc45f069937c\") " pod="openstack/neutron-db-create-rhvjl" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.126500 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rhvjl" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.158933 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tj4z5" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.159983 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8b85\" (UniqueName: \"kubernetes.io/projected/843a1ea2-9e25-4649-9375-9b41f492055c-kube-api-access-t8b85\") pod \"neutron-7680-account-create-update-2vfwq\" (UID: \"843a1ea2-9e25-4649-9375-9b41f492055c\") " pod="openstack/neutron-7680-account-create-update-2vfwq" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.160100 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/843a1ea2-9e25-4649-9375-9b41f492055c-operator-scripts\") pod \"neutron-7680-account-create-update-2vfwq\" (UID: \"843a1ea2-9e25-4649-9375-9b41f492055c\") " pod="openstack/neutron-7680-account-create-update-2vfwq" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.173038 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-sx5d9" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.264387 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/843a1ea2-9e25-4649-9375-9b41f492055c-operator-scripts\") pod \"neutron-7680-account-create-update-2vfwq\" (UID: \"843a1ea2-9e25-4649-9375-9b41f492055c\") " pod="openstack/neutron-7680-account-create-update-2vfwq" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.264444 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8b85\" (UniqueName: \"kubernetes.io/projected/843a1ea2-9e25-4649-9375-9b41f492055c-kube-api-access-t8b85\") pod \"neutron-7680-account-create-update-2vfwq\" (UID: \"843a1ea2-9e25-4649-9375-9b41f492055c\") " pod="openstack/neutron-7680-account-create-update-2vfwq" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.265383 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/843a1ea2-9e25-4649-9375-9b41f492055c-operator-scripts\") pod \"neutron-7680-account-create-update-2vfwq\" (UID: \"843a1ea2-9e25-4649-9375-9b41f492055c\") " pod="openstack/neutron-7680-account-create-update-2vfwq" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.293133 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8b85\" (UniqueName: \"kubernetes.io/projected/843a1ea2-9e25-4649-9375-9b41f492055c-kube-api-access-t8b85\") pod \"neutron-7680-account-create-update-2vfwq\" (UID: \"843a1ea2-9e25-4649-9375-9b41f492055c\") " pod="openstack/neutron-7680-account-create-update-2vfwq" Dec 05 21:00:45 crc kubenswrapper[4747]: I1205 21:00:45.349945 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7680-account-create-update-2vfwq" Dec 05 21:00:46 crc kubenswrapper[4747]: I1205 21:00:46.164930 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ns2k6" podUID="c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" containerName="ovn-controller" probeResult="failure" output=< Dec 05 21:00:46 crc kubenswrapper[4747]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 21:00:46 crc kubenswrapper[4747]: > Dec 05 21:00:46 crc kubenswrapper[4747]: E1205 21:00:46.306890 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63" Dec 05 21:00:46 crc kubenswrapper[4747]: E1205 21:00:46.307255 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5xhhl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-5jxck_openstack(7264db97-0a43-4984-94a2-1805f3aec313): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 21:00:46 crc kubenswrapper[4747]: E1205 21:00:46.308470 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-5jxck" 
podUID="7264db97-0a43-4984-94a2-1805f3aec313" Dec 05 21:00:46 crc kubenswrapper[4747]: E1205 21:00:46.386882 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63\\\"\"" pod="openstack/glance-db-sync-5jxck" podUID="7264db97-0a43-4984-94a2-1805f3aec313" Dec 05 21:00:46 crc kubenswrapper[4747]: I1205 21:00:46.966284 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cd71-account-create-update-f4fch"] Dec 05 21:00:46 crc kubenswrapper[4747]: I1205 21:00:46.972618 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ns2k6-config-jh272"] Dec 05 21:00:46 crc kubenswrapper[4747]: W1205 21:00:46.978178 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90d5b5dc_0e02_4e8c_833b_9ee0c2fc1a92.slice/crio-c3c9a88c9776fada1e2b3c80d413275c02f485502117abbb3098d3ab6b24dcc0 WatchSource:0}: Error finding container c3c9a88c9776fada1e2b3c80d413275c02f485502117abbb3098d3ab6b24dcc0: Status 404 returned error can't find the container with id c3c9a88c9776fada1e2b3c80d413275c02f485502117abbb3098d3ab6b24dcc0 Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.098470 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rhvjl"] Dec 05 21:00:47 crc kubenswrapper[4747]: W1205 21:00:47.105268 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbafb6071_80cb_4a67_bb33_bc45f069937c.slice/crio-c9b4c70c465e8cb905ac3265ff7c114a54eb341dfadb94bec4210808e85bf306 WatchSource:0}: Error finding container c9b4c70c465e8cb905ac3265ff7c114a54eb341dfadb94bec4210808e85bf306: Status 404 returned error can't find the container with id c9b4c70c465e8cb905ac3265ff7c114a54eb341dfadb94bec4210808e85bf306 Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.122939 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-tj4z5"] Dec 05 21:00:47 crc kubenswrapper[4747]: W1205 21:00:47.132626 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod996767c6_2a35_4b93_a779_8803b52eff5c.slice/crio-1d7a0ca2c6c30038fe3b25f796cc040a53ea03884584b8783459e635679333f1 WatchSource:0}: Error finding container 1d7a0ca2c6c30038fe3b25f796cc040a53ea03884584b8783459e635679333f1: Status 404 returned error can't find the container with id 1d7a0ca2c6c30038fe3b25f796cc040a53ea03884584b8783459e635679333f1 Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.187802 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sx5d9"] Dec 05 21:00:47 crc kubenswrapper[4747]: W1205 21:00:47.207786 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0ac9306_6af1_4745_ba77_046356cdd7fd.slice/crio-5e9705188efb00c17d582a3478adc61383031f94810e05dc7b84b200ca89142a WatchSource:0}: Error finding container 5e9705188efb00c17d582a3478adc61383031f94810e05dc7b84b200ca89142a: Status 404 returned error can't find the container with id 5e9705188efb00c17d582a3478adc61383031f94810e05dc7b84b200ca89142a Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.208667 4747 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-067d-account-create-update-jhm5h"] Dec 05 21:00:47 crc kubenswrapper[4747]: W1205 21:00:47.215498 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cb214ab_734f_4671_9959_6f424a9f6ade.slice/crio-f8189aba2ca3ad5aa9915463ebc3eab2196d5843205965c18ebdb18087309b6e WatchSource:0}: Error finding container f8189aba2ca3ad5aa9915463ebc3eab2196d5843205965c18ebdb18087309b6e: Status 404 returned error can't find the container with id f8189aba2ca3ad5aa9915463ebc3eab2196d5843205965c18ebdb18087309b6e Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.220000 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-k58mr"] Dec 05 21:00:47 crc kubenswrapper[4747]: W1205 21:00:47.229071 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod281b36e4_63e9_473f_a0bf_0ad62a307577.slice/crio-e9e14a7124862d7a0d58b0b8d25e03431bfadb8836acba9c6fbfe2873e63fe3f WatchSource:0}: Error finding container e9e14a7124862d7a0d58b0b8d25e03431bfadb8836acba9c6fbfe2873e63fe3f: Status 404 returned error can't find the container with id e9e14a7124862d7a0d58b0b8d25e03431bfadb8836acba9c6fbfe2873e63fe3f Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.229517 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7680-account-create-update-2vfwq"] Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.403267 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7680-account-create-update-2vfwq" event={"ID":"843a1ea2-9e25-4649-9375-9b41f492055c","Type":"ContainerStarted","Data":"e5fa8f14c317a4b0d9ffbe06b47e367dfd93abf3e4ba788023ecef39a10f6498"} Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.408488 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"c58946c3b64e9f0075c91ec12003bcf8230437fdc679cb562ed1bcfde6656f17"} Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.408532 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"8df9c3dc8ff772526cc2df3f4cf0b95ab3c61db6ed0714a109c7c21fc522f7ca"} Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.408543 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"5e1fecd6cc1598b9e147bbd014d5d212dbafa891089d5772bdb6d3d1d8de6c53"} Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.410190 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rhvjl" event={"ID":"bafb6071-80cb-4a67-bb33-bc45f069937c","Type":"ContainerStarted","Data":"9ba405dde9a6a2b26ae6aef73ad9fe3e643c2c781adfc8e4cf982e48219da231"} Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.410240 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rhvjl" event={"ID":"bafb6071-80cb-4a67-bb33-bc45f069937c","Type":"ContainerStarted","Data":"c9b4c70c465e8cb905ac3265ff7c114a54eb341dfadb94bec4210808e85bf306"} Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.414487 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-067d-account-create-update-jhm5h" 
event={"ID":"c0ac9306-6af1-4745-ba77-046356cdd7fd","Type":"ContainerStarted","Data":"5e9705188efb00c17d582a3478adc61383031f94810e05dc7b84b200ca89142a"} Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.424785 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-rhvjl" podStartSLOduration=3.424768005 podStartE2EDuration="3.424768005s" podCreationTimestamp="2025-12-05 21:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:00:47.4225427 +0000 UTC m=+1117.889850188" watchObservedRunningTime="2025-12-05 21:00:47.424768005 +0000 UTC m=+1117.892075493" Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.425237 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cd71-account-create-update-f4fch" event={"ID":"5da23b71-6d79-4345-87e9-3528332fe50d","Type":"ContainerStarted","Data":"2c616242a09a02b14af1a770510f774577f4438e787989439ca44def0684c811"} Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.425283 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cd71-account-create-update-f4fch" event={"ID":"5da23b71-6d79-4345-87e9-3528332fe50d","Type":"ContainerStarted","Data":"50db72a116b93407193e6f3a36b55e2b69e0922cac231cf43308da3b62a07820"} Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.430693 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ns2k6-config-jh272" event={"ID":"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92","Type":"ContainerStarted","Data":"c3c9a88c9776fada1e2b3c80d413275c02f485502117abbb3098d3ab6b24dcc0"} Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.435127 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sx5d9" event={"ID":"4cb214ab-734f-4671-9959-6f424a9f6ade","Type":"ContainerStarted","Data":"f8189aba2ca3ad5aa9915463ebc3eab2196d5843205965c18ebdb18087309b6e"} Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.437989 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tj4z5" event={"ID":"996767c6-2a35-4b93-a779-8803b52eff5c","Type":"ContainerStarted","Data":"1d7a0ca2c6c30038fe3b25f796cc040a53ea03884584b8783459e635679333f1"} Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.439118 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k58mr" event={"ID":"281b36e4-63e9-473f-a0bf-0ad62a307577","Type":"ContainerStarted","Data":"e9e14a7124862d7a0d58b0b8d25e03431bfadb8836acba9c6fbfe2873e63fe3f"} Dec 05 21:00:47 crc kubenswrapper[4747]: I1205 21:00:47.448557 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-cd71-account-create-update-f4fch" podStartSLOduration=3.448538385 podStartE2EDuration="3.448538385s" podCreationTimestamp="2025-12-05 21:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:00:47.444247719 +0000 UTC m=+1117.911555207" watchObservedRunningTime="2025-12-05 21:00:47.448538385 +0000 UTC m=+1117.915845873" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.451046 4747 generic.go:334] "Generic (PLEG): container finished" podID="90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92" containerID="6c0a4222132d0fff66a7dafaa465590cd04be9365f738b2b4a9e907cda75ff38" exitCode=0 Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.452812 4747 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-controller-ns2k6-config-jh272" event={"ID":"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92","Type":"ContainerDied","Data":"6c0a4222132d0fff66a7dafaa465590cd04be9365f738b2b4a9e907cda75ff38"} Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.455688 4747 generic.go:334] "Generic (PLEG): container finished" podID="4cb214ab-734f-4671-9959-6f424a9f6ade" containerID="fc1cfe778171e50c163dcbd11b2d658d1e40a3c9a296c976c35359ea06c5a66f" exitCode=0 Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.455757 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sx5d9" event={"ID":"4cb214ab-734f-4671-9959-6f424a9f6ade","Type":"ContainerDied","Data":"fc1cfe778171e50c163dcbd11b2d658d1e40a3c9a296c976c35359ea06c5a66f"} Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.457449 4747 generic.go:334] "Generic (PLEG): container finished" podID="281b36e4-63e9-473f-a0bf-0ad62a307577" containerID="d572e4ae1c2bdcc8633d6f7e2500cd6bc1b99b0d1f2ef1d0e1c9d681f18656f7" exitCode=0 Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.457492 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k58mr" event={"ID":"281b36e4-63e9-473f-a0bf-0ad62a307577","Type":"ContainerDied","Data":"d572e4ae1c2bdcc8633d6f7e2500cd6bc1b99b0d1f2ef1d0e1c9d681f18656f7"} Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.459011 4747 generic.go:334] "Generic (PLEG): container finished" podID="843a1ea2-9e25-4649-9375-9b41f492055c" containerID="8a9220e2f263274b5d6646c5ced8d29f995f12701ac056b9861cdd18a54f6738" exitCode=0 Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.459054 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7680-account-create-update-2vfwq" event={"ID":"843a1ea2-9e25-4649-9375-9b41f492055c","Type":"ContainerDied","Data":"8a9220e2f263274b5d6646c5ced8d29f995f12701ac056b9861cdd18a54f6738"} Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.466674 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"486c4e14776dcac9c407e97103f33e69775d09dadf5c06010aefbcfb88eedabf"} Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.466703 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"b38b92392e194449e12d772885bfbe0631d28151913d40ee252a46ebfcac23a9"} Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.466715 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"0da635b7389cbe492ef8068ecb92ce39082021af205f7581dc37d0416b62d81d"} Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.466726 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerStarted","Data":"2ed7bcbfbf04d5a9964de2fcbd02ac674b695f20ec08f744e393a02dbb2319a1"} Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.471064 4747 generic.go:334] "Generic (PLEG): container finished" podID="bafb6071-80cb-4a67-bb33-bc45f069937c" containerID="9ba405dde9a6a2b26ae6aef73ad9fe3e643c2c781adfc8e4cf982e48219da231" exitCode=0 Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.471207 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rhvjl" 
event={"ID":"bafb6071-80cb-4a67-bb33-bc45f069937c","Type":"ContainerDied","Data":"9ba405dde9a6a2b26ae6aef73ad9fe3e643c2c781adfc8e4cf982e48219da231"} Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.473227 4747 generic.go:334] "Generic (PLEG): container finished" podID="c0ac9306-6af1-4745-ba77-046356cdd7fd" containerID="c6fde9291ef4b4350a70617293ff58ef6c3ec3ffff2a74a8701617ae895aa21e" exitCode=0 Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.473280 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-067d-account-create-update-jhm5h" event={"ID":"c0ac9306-6af1-4745-ba77-046356cdd7fd","Type":"ContainerDied","Data":"c6fde9291ef4b4350a70617293ff58ef6c3ec3ffff2a74a8701617ae895aa21e"} Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.475687 4747 generic.go:334] "Generic (PLEG): container finished" podID="5da23b71-6d79-4345-87e9-3528332fe50d" containerID="2c616242a09a02b14af1a770510f774577f4438e787989439ca44def0684c811" exitCode=0 Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.475717 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cd71-account-create-update-f4fch" event={"ID":"5da23b71-6d79-4345-87e9-3528332fe50d","Type":"ContainerDied","Data":"2c616242a09a02b14af1a770510f774577f4438e787989439ca44def0684c811"} Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.550879 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.377181854 podStartE2EDuration="41.55082678s" podCreationTimestamp="2025-12-05 21:00:07 +0000 UTC" firstStartedPulling="2025-12-05 21:00:25.200271963 +0000 UTC m=+1095.667579461" lastFinishedPulling="2025-12-05 21:00:46.373916899 +0000 UTC m=+1116.841224387" observedRunningTime="2025-12-05 21:00:48.51701511 +0000 UTC m=+1118.984322648" watchObservedRunningTime="2025-12-05 21:00:48.55082678 +0000 UTC m=+1119.018134288" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.776765 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-wgsk2"] Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.778870 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.785626 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.790251 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-wgsk2"] Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.856099 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpttn\" (UniqueName: \"kubernetes.io/projected/af69412e-abae-40a4-b95c-24ea1b3a9cf5-kube-api-access-mpttn\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.856180 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-config\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.856241 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-ovsdbserver-sb\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.856308 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-dns-svc\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.856349 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-dns-swift-storage-0\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.856388 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-ovsdbserver-nb\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.957490 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-ovsdbserver-sb\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.957568 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-dns-svc\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: 
\"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.957636 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-dns-swift-storage-0\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.957927 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-ovsdbserver-nb\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.957997 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpttn\" (UniqueName: \"kubernetes.io/projected/af69412e-abae-40a4-b95c-24ea1b3a9cf5-kube-api-access-mpttn\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.958037 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-config\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.958530 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-ovsdbserver-sb\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.958657 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-dns-svc\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.958858 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-dns-swift-storage-0\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.959385 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-ovsdbserver-nb\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 crc kubenswrapper[4747]: I1205 21:00:48.959880 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-config\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:48 
crc kubenswrapper[4747]: I1205 21:00:48.984232 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpttn\" (UniqueName: \"kubernetes.io/projected/af69412e-abae-40a4-b95c-24ea1b3a9cf5-kube-api-access-mpttn\") pod \"dnsmasq-dns-864b648dc7-wgsk2\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:49 crc kubenswrapper[4747]: I1205 21:00:49.098216 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:49 crc kubenswrapper[4747]: I1205 21:00:49.543459 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-wgsk2"] Dec 05 21:00:51 crc kubenswrapper[4747]: I1205 21:00:51.162285 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ns2k6" Dec 05 21:00:53 crc kubenswrapper[4747]: W1205 21:00:53.965794 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf69412e_abae_40a4_b95c_24ea1b3a9cf5.slice/crio-469d07b5c8268adb378bd34a3042f6f5e451a2d59435fc58c31bf5fbef9e7ae1 WatchSource:0}: Error finding container 469d07b5c8268adb378bd34a3042f6f5e451a2d59435fc58c31bf5fbef9e7ae1: Status 404 returned error can't find the container with id 469d07b5c8268adb378bd34a3042f6f5e451a2d59435fc58c31bf5fbef9e7ae1 Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.211565 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sx5d9" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.248929 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.257806 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rhvjl" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.275257 4747 util.go:48] "No ready sandbox for pod can be found. 
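
Note: the manager.go:1169 warnings here ("Failed to process watch event ... Status 404") appear to be the usual transient race between a container's cgroup being created or removed and cAdvisor's watcher catching up: the c3c9a88c... id that 404s at 21:00:46.978 is the same sandbox reported as ContainerStarted for ovn-controller-ns2k6-config-jh272 at 21:00:47.430. Note also the two distinct sandbox messages: util.go:30 "No sandbox for pod can be found" (first start) versus util.go:48 "No ready sandbox for pod can be found" (the sandbox has exited; for the completed db-create and account-create jobs above this precedes final cleanup rather than a restart). A small cross-check, under the same hypothetical kubelet.log assumption:

    # watch_404_check.py -- correlate cAdvisor 404 warnings with later PLEG events.
    import re

    W404    = re.compile(r"Error finding container ([0-9a-f]{64})")
    STARTED = re.compile(r'"Type":"ContainerStarted","Data":"([0-9a-f]{64})"')

    text = open("kubelet.log", encoding="utf-8").read()
    started = set(STARTED.findall(text))
    for cid in sorted(set(W404.findall(text))):
        verdict = "seen started later (benign race)" if cid in started else "never seen started"
        print(cid[:13], verdict)
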
Need to start a new one" pod="openstack/barbican-cd71-account-create-update-f4fch" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.280781 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z4j7\" (UniqueName: \"kubernetes.io/projected/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-kube-api-access-6z4j7\") pod \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.280856 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-scripts\") pod \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.280916 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4krb\" (UniqueName: \"kubernetes.io/projected/bafb6071-80cb-4a67-bb33-bc45f069937c-kube-api-access-g4krb\") pod \"bafb6071-80cb-4a67-bb33-bc45f069937c\" (UID: \"bafb6071-80cb-4a67-bb33-bc45f069937c\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.280968 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-log-ovn\") pod \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.281043 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafb6071-80cb-4a67-bb33-bc45f069937c-operator-scripts\") pod \"bafb6071-80cb-4a67-bb33-bc45f069937c\" (UID: \"bafb6071-80cb-4a67-bb33-bc45f069937c\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.281085 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx6xg\" (UniqueName: \"kubernetes.io/projected/4cb214ab-734f-4671-9959-6f424a9f6ade-kube-api-access-xx6xg\") pod \"4cb214ab-734f-4671-9959-6f424a9f6ade\" (UID: \"4cb214ab-734f-4671-9959-6f424a9f6ade\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.281126 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcwlf\" (UniqueName: \"kubernetes.io/projected/5da23b71-6d79-4345-87e9-3528332fe50d-kube-api-access-pcwlf\") pod \"5da23b71-6d79-4345-87e9-3528332fe50d\" (UID: \"5da23b71-6d79-4345-87e9-3528332fe50d\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.281173 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-run\") pod \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.281206 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5da23b71-6d79-4345-87e9-3528332fe50d-operator-scripts\") pod \"5da23b71-6d79-4345-87e9-3528332fe50d\" (UID: \"5da23b71-6d79-4345-87e9-3528332fe50d\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.281249 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-run-ovn\") 
pod \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.281285 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-additional-scripts\") pod \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\" (UID: \"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.281321 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb214ab-734f-4671-9959-6f424a9f6ade-operator-scripts\") pod \"4cb214ab-734f-4671-9959-6f424a9f6ade\" (UID: \"4cb214ab-734f-4671-9959-6f424a9f6ade\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.282469 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92" (UID: "90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.282549 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb214ab-734f-4671-9959-6f424a9f6ade-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4cb214ab-734f-4671-9959-6f424a9f6ade" (UID: "4cb214ab-734f-4671-9959-6f424a9f6ade"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.282908 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5da23b71-6d79-4345-87e9-3528332fe50d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5da23b71-6d79-4345-87e9-3528332fe50d" (UID: "5da23b71-6d79-4345-87e9-3528332fe50d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.282950 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92" (UID: "90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.283003 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bafb6071-80cb-4a67-bb33-bc45f069937c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bafb6071-80cb-4a67-bb33-bc45f069937c" (UID: "bafb6071-80cb-4a67-bb33-bc45f069937c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.283443 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-run" (OuterVolumeSpecName: "var-run") pod "90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92" (UID: "90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.283558 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92" (UID: "90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.284002 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-scripts" (OuterVolumeSpecName: "scripts") pod "90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92" (UID: "90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.289859 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bafb6071-80cb-4a67-bb33-bc45f069937c-kube-api-access-g4krb" (OuterVolumeSpecName: "kube-api-access-g4krb") pod "bafb6071-80cb-4a67-bb33-bc45f069937c" (UID: "bafb6071-80cb-4a67-bb33-bc45f069937c"). InnerVolumeSpecName "kube-api-access-g4krb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.290348 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da23b71-6d79-4345-87e9-3528332fe50d-kube-api-access-pcwlf" (OuterVolumeSpecName: "kube-api-access-pcwlf") pod "5da23b71-6d79-4345-87e9-3528332fe50d" (UID: "5da23b71-6d79-4345-87e9-3528332fe50d"). InnerVolumeSpecName "kube-api-access-pcwlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.299617 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cb214ab-734f-4671-9959-6f424a9f6ade-kube-api-access-xx6xg" (OuterVolumeSpecName: "kube-api-access-xx6xg") pod "4cb214ab-734f-4671-9959-6f424a9f6ade" (UID: "4cb214ab-734f-4671-9959-6f424a9f6ade"). InnerVolumeSpecName "kube-api-access-xx6xg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.314562 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-kube-api-access-6z4j7" (OuterVolumeSpecName: "kube-api-access-6z4j7") pod "90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92" (UID: "90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92"). InnerVolumeSpecName "kube-api-access-6z4j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.344557 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k58mr" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.374925 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-067d-account-create-update-jhm5h" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.380926 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7680-account-create-update-2vfwq" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382182 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/843a1ea2-9e25-4649-9375-9b41f492055c-operator-scripts\") pod \"843a1ea2-9e25-4649-9375-9b41f492055c\" (UID: \"843a1ea2-9e25-4649-9375-9b41f492055c\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382210 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8b85\" (UniqueName: \"kubernetes.io/projected/843a1ea2-9e25-4649-9375-9b41f492055c-kube-api-access-t8b85\") pod \"843a1ea2-9e25-4649-9375-9b41f492055c\" (UID: \"843a1ea2-9e25-4649-9375-9b41f492055c\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382248 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ac9306-6af1-4745-ba77-046356cdd7fd-operator-scripts\") pod \"c0ac9306-6af1-4745-ba77-046356cdd7fd\" (UID: \"c0ac9306-6af1-4745-ba77-046356cdd7fd\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382285 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzlxt\" (UniqueName: \"kubernetes.io/projected/281b36e4-63e9-473f-a0bf-0ad62a307577-kube-api-access-rzlxt\") pod \"281b36e4-63e9-473f-a0bf-0ad62a307577\" (UID: \"281b36e4-63e9-473f-a0bf-0ad62a307577\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382307 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/281b36e4-63e9-473f-a0bf-0ad62a307577-operator-scripts\") pod \"281b36e4-63e9-473f-a0bf-0ad62a307577\" (UID: \"281b36e4-63e9-473f-a0bf-0ad62a307577\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382325 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2htk6\" (UniqueName: \"kubernetes.io/projected/c0ac9306-6af1-4745-ba77-046356cdd7fd-kube-api-access-2htk6\") pod \"c0ac9306-6af1-4745-ba77-046356cdd7fd\" (UID: \"c0ac9306-6af1-4745-ba77-046356cdd7fd\") " Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382554 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4krb\" (UniqueName: \"kubernetes.io/projected/bafb6071-80cb-4a67-bb33-bc45f069937c-kube-api-access-g4krb\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382569 4747 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382682 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafb6071-80cb-4a67-bb33-bc45f069937c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382696 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx6xg\" (UniqueName: \"kubernetes.io/projected/4cb214ab-734f-4671-9959-6f424a9f6ade-kube-api-access-xx6xg\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382706 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcwlf\" (UniqueName: 
\"kubernetes.io/projected/5da23b71-6d79-4345-87e9-3528332fe50d-kube-api-access-pcwlf\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382717 4747 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382726 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5da23b71-6d79-4345-87e9-3528332fe50d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382734 4747 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382743 4747 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382752 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb214ab-734f-4671-9959-6f424a9f6ade-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382761 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z4j7\" (UniqueName: \"kubernetes.io/projected/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-kube-api-access-6z4j7\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382769 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382797 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/843a1ea2-9e25-4649-9375-9b41f492055c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "843a1ea2-9e25-4649-9375-9b41f492055c" (UID: "843a1ea2-9e25-4649-9375-9b41f492055c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.382845 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/281b36e4-63e9-473f-a0bf-0ad62a307577-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "281b36e4-63e9-473f-a0bf-0ad62a307577" (UID: "281b36e4-63e9-473f-a0bf-0ad62a307577"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.383309 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0ac9306-6af1-4745-ba77-046356cdd7fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0ac9306-6af1-4745-ba77-046356cdd7fd" (UID: "c0ac9306-6af1-4745-ba77-046356cdd7fd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.387232 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/281b36e4-63e9-473f-a0bf-0ad62a307577-kube-api-access-rzlxt" (OuterVolumeSpecName: "kube-api-access-rzlxt") pod "281b36e4-63e9-473f-a0bf-0ad62a307577" (UID: "281b36e4-63e9-473f-a0bf-0ad62a307577"). InnerVolumeSpecName "kube-api-access-rzlxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.391903 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/843a1ea2-9e25-4649-9375-9b41f492055c-kube-api-access-t8b85" (OuterVolumeSpecName: "kube-api-access-t8b85") pod "843a1ea2-9e25-4649-9375-9b41f492055c" (UID: "843a1ea2-9e25-4649-9375-9b41f492055c"). InnerVolumeSpecName "kube-api-access-t8b85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.398706 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0ac9306-6af1-4745-ba77-046356cdd7fd-kube-api-access-2htk6" (OuterVolumeSpecName: "kube-api-access-2htk6") pod "c0ac9306-6af1-4745-ba77-046356cdd7fd" (UID: "c0ac9306-6af1-4745-ba77-046356cdd7fd"). InnerVolumeSpecName "kube-api-access-2htk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.484795 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8b85\" (UniqueName: \"kubernetes.io/projected/843a1ea2-9e25-4649-9375-9b41f492055c-kube-api-access-t8b85\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.484827 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ac9306-6af1-4745-ba77-046356cdd7fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.484837 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzlxt\" (UniqueName: \"kubernetes.io/projected/281b36e4-63e9-473f-a0bf-0ad62a307577-kube-api-access-rzlxt\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.484847 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/281b36e4-63e9-473f-a0bf-0ad62a307577-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.484857 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2htk6\" (UniqueName: \"kubernetes.io/projected/c0ac9306-6af1-4745-ba77-046356cdd7fd-kube-api-access-2htk6\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.484865 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/843a1ea2-9e25-4649-9375-9b41f492055c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.531047 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cd71-account-create-update-f4fch" event={"ID":"5da23b71-6d79-4345-87e9-3528332fe50d","Type":"ContainerDied","Data":"50db72a116b93407193e6f3a36b55e2b69e0922cac231cf43308da3b62a07820"} Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.531095 4747 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="50db72a116b93407193e6f3a36b55e2b69e0922cac231cf43308da3b62a07820" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.531156 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cd71-account-create-update-f4fch" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.541685 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k58mr" event={"ID":"281b36e4-63e9-473f-a0bf-0ad62a307577","Type":"ContainerDied","Data":"e9e14a7124862d7a0d58b0b8d25e03431bfadb8836acba9c6fbfe2873e63fe3f"} Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.541716 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9e14a7124862d7a0d58b0b8d25e03431bfadb8836acba9c6fbfe2873e63fe3f" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.541764 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k58mr" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.544456 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7680-account-create-update-2vfwq" event={"ID":"843a1ea2-9e25-4649-9375-9b41f492055c","Type":"ContainerDied","Data":"e5fa8f14c317a4b0d9ffbe06b47e367dfd93abf3e4ba788023ecef39a10f6498"} Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.544500 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5fa8f14c317a4b0d9ffbe06b47e367dfd93abf3e4ba788023ecef39a10f6498" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.544575 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7680-account-create-update-2vfwq" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.549472 4747 generic.go:334] "Generic (PLEG): container finished" podID="af69412e-abae-40a4-b95c-24ea1b3a9cf5" containerID="c6d47481e8ef33252b8085dc229cc7562124000159fd1e4d87c701691429a481" exitCode=0 Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.549599 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" event={"ID":"af69412e-abae-40a4-b95c-24ea1b3a9cf5","Type":"ContainerDied","Data":"c6d47481e8ef33252b8085dc229cc7562124000159fd1e4d87c701691429a481"} Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.549640 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" event={"ID":"af69412e-abae-40a4-b95c-24ea1b3a9cf5","Type":"ContainerStarted","Data":"469d07b5c8268adb378bd34a3042f6f5e451a2d59435fc58c31bf5fbef9e7ae1"} Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.551431 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tj4z5" event={"ID":"996767c6-2a35-4b93-a779-8803b52eff5c","Type":"ContainerStarted","Data":"a362d9fb473592bb60d9373f2630d4ac929dccc8b2ff2dc397dd513565153b24"} Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.555562 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sx5d9" event={"ID":"4cb214ab-734f-4671-9959-6f424a9f6ade","Type":"ContainerDied","Data":"f8189aba2ca3ad5aa9915463ebc3eab2196d5843205965c18ebdb18087309b6e"} Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.555635 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8189aba2ca3ad5aa9915463ebc3eab2196d5843205965c18ebdb18087309b6e" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.555691 4747 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sx5d9" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.557635 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rhvjl" event={"ID":"bafb6071-80cb-4a67-bb33-bc45f069937c","Type":"ContainerDied","Data":"c9b4c70c465e8cb905ac3265ff7c114a54eb341dfadb94bec4210808e85bf306"} Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.557665 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9b4c70c465e8cb905ac3265ff7c114a54eb341dfadb94bec4210808e85bf306" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.557717 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rhvjl" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.559840 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-067d-account-create-update-jhm5h" event={"ID":"c0ac9306-6af1-4745-ba77-046356cdd7fd","Type":"ContainerDied","Data":"5e9705188efb00c17d582a3478adc61383031f94810e05dc7b84b200ca89142a"} Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.559871 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e9705188efb00c17d582a3478adc61383031f94810e05dc7b84b200ca89142a" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.559914 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-067d-account-create-update-jhm5h" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.565390 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ns2k6-config-jh272" event={"ID":"90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92","Type":"ContainerDied","Data":"c3c9a88c9776fada1e2b3c80d413275c02f485502117abbb3098d3ab6b24dcc0"} Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.565442 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3c9a88c9776fada1e2b3c80d413275c02f485502117abbb3098d3ab6b24dcc0" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.565517 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ns2k6-config-jh272" Dec 05 21:00:54 crc kubenswrapper[4747]: I1205 21:00:54.604767 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-tj4z5" podStartSLOduration=3.719347973 podStartE2EDuration="10.60472415s" podCreationTimestamp="2025-12-05 21:00:44 +0000 UTC" firstStartedPulling="2025-12-05 21:00:47.134894244 +0000 UTC m=+1117.602201732" lastFinishedPulling="2025-12-05 21:00:54.020270401 +0000 UTC m=+1124.487577909" observedRunningTime="2025-12-05 21:00:54.598850074 +0000 UTC m=+1125.066157562" watchObservedRunningTime="2025-12-05 21:00:54.60472415 +0000 UTC m=+1125.072031648" Dec 05 21:00:55 crc kubenswrapper[4747]: I1205 21:00:55.419471 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ns2k6-config-jh272"] Dec 05 21:00:55 crc kubenswrapper[4747]: I1205 21:00:55.432261 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ns2k6-config-jh272"] Dec 05 21:00:55 crc kubenswrapper[4747]: I1205 21:00:55.575975 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" event={"ID":"af69412e-abae-40a4-b95c-24ea1b3a9cf5","Type":"ContainerStarted","Data":"6146e8593917a68bb0ab86fcf38800fdf5cda3e75eb80fe9775e293dc1aa808c"} Dec 05 21:00:55 crc kubenswrapper[4747]: I1205 21:00:55.576047 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:55 crc kubenswrapper[4747]: I1205 21:00:55.593309 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" podStartSLOduration=7.59329191 podStartE2EDuration="7.59329191s" podCreationTimestamp="2025-12-05 21:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:00:55.591222018 +0000 UTC m=+1126.058529506" watchObservedRunningTime="2025-12-05 21:00:55.59329191 +0000 UTC m=+1126.060599398" Dec 05 21:00:55 crc kubenswrapper[4747]: I1205 21:00:55.861787 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92" path="/var/lib/kubelet/pods/90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92/volumes" Dec 05 21:00:57 crc kubenswrapper[4747]: I1205 21:00:57.599973 4747 generic.go:334] "Generic (PLEG): container finished" podID="996767c6-2a35-4b93-a779-8803b52eff5c" containerID="a362d9fb473592bb60d9373f2630d4ac929dccc8b2ff2dc397dd513565153b24" exitCode=0 Dec 05 21:00:57 crc kubenswrapper[4747]: I1205 21:00:57.600058 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tj4z5" event={"ID":"996767c6-2a35-4b93-a779-8803b52eff5c","Type":"ContainerDied","Data":"a362d9fb473592bb60d9373f2630d4ac929dccc8b2ff2dc397dd513565153b24"} Dec 05 21:00:58 crc kubenswrapper[4747]: I1205 21:00:58.966751 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-tj4z5" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.099755 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.157996 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-4bczm"] Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.158268 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" podUID="500ba135-ea9d-42ba-adb5-487f1646368f" containerName="dnsmasq-dns" containerID="cri-o://b62eac0a04a7a445d0be766123c85952bbfb528b8bd03cb53d992450a3a8ac53" gracePeriod=10 Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.166936 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/996767c6-2a35-4b93-a779-8803b52eff5c-config-data\") pod \"996767c6-2a35-4b93-a779-8803b52eff5c\" (UID: \"996767c6-2a35-4b93-a779-8803b52eff5c\") " Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.167023 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgxp9\" (UniqueName: \"kubernetes.io/projected/996767c6-2a35-4b93-a779-8803b52eff5c-kube-api-access-pgxp9\") pod \"996767c6-2a35-4b93-a779-8803b52eff5c\" (UID: \"996767c6-2a35-4b93-a779-8803b52eff5c\") " Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.167048 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/996767c6-2a35-4b93-a779-8803b52eff5c-combined-ca-bundle\") pod \"996767c6-2a35-4b93-a779-8803b52eff5c\" (UID: \"996767c6-2a35-4b93-a779-8803b52eff5c\") " Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.200995 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/996767c6-2a35-4b93-a779-8803b52eff5c-kube-api-access-pgxp9" (OuterVolumeSpecName: "kube-api-access-pgxp9") pod "996767c6-2a35-4b93-a779-8803b52eff5c" (UID: "996767c6-2a35-4b93-a779-8803b52eff5c"). InnerVolumeSpecName "kube-api-access-pgxp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.222919 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/996767c6-2a35-4b93-a779-8803b52eff5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "996767c6-2a35-4b93-a779-8803b52eff5c" (UID: "996767c6-2a35-4b93-a779-8803b52eff5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.233139 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/996767c6-2a35-4b93-a779-8803b52eff5c-config-data" (OuterVolumeSpecName: "config-data") pod "996767c6-2a35-4b93-a779-8803b52eff5c" (UID: "996767c6-2a35-4b93-a779-8803b52eff5c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.268984 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/996767c6-2a35-4b93-a779-8803b52eff5c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.269017 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgxp9\" (UniqueName: \"kubernetes.io/projected/996767c6-2a35-4b93-a779-8803b52eff5c-kube-api-access-pgxp9\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.269028 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/996767c6-2a35-4b93-a779-8803b52eff5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.643718 4747 generic.go:334] "Generic (PLEG): container finished" podID="500ba135-ea9d-42ba-adb5-487f1646368f" containerID="b62eac0a04a7a445d0be766123c85952bbfb528b8bd03cb53d992450a3a8ac53" exitCode=0 Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.644208 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" event={"ID":"500ba135-ea9d-42ba-adb5-487f1646368f","Type":"ContainerDied","Data":"b62eac0a04a7a445d0be766123c85952bbfb528b8bd03cb53d992450a3a8ac53"} Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.656092 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tj4z5" event={"ID":"996767c6-2a35-4b93-a779-8803b52eff5c","Type":"ContainerDied","Data":"1d7a0ca2c6c30038fe3b25f796cc040a53ea03884584b8783459e635679333f1"} Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.656140 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d7a0ca2c6c30038fe3b25f796cc040a53ea03884584b8783459e635679333f1" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.656211 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-tj4z5" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.876542 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5678f567b5-qkm5w"] Dec 05 21:00:59 crc kubenswrapper[4747]: E1205 21:00:59.876826 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0ac9306-6af1-4745-ba77-046356cdd7fd" containerName="mariadb-account-create-update" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.876838 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ac9306-6af1-4745-ba77-046356cdd7fd" containerName="mariadb-account-create-update" Dec 05 21:00:59 crc kubenswrapper[4747]: E1205 21:00:59.876853 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafb6071-80cb-4a67-bb33-bc45f069937c" containerName="mariadb-database-create" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.876859 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafb6071-80cb-4a67-bb33-bc45f069937c" containerName="mariadb-database-create" Dec 05 21:00:59 crc kubenswrapper[4747]: E1205 21:00:59.876868 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb214ab-734f-4671-9959-6f424a9f6ade" containerName="mariadb-database-create" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.876874 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb214ab-734f-4671-9959-6f424a9f6ade" containerName="mariadb-database-create" Dec 05 21:00:59 crc kubenswrapper[4747]: E1205 21:00:59.876890 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da23b71-6d79-4345-87e9-3528332fe50d" containerName="mariadb-account-create-update" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.876896 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da23b71-6d79-4345-87e9-3528332fe50d" containerName="mariadb-account-create-update" Dec 05 21:00:59 crc kubenswrapper[4747]: E1205 21:00:59.876910 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="996767c6-2a35-4b93-a779-8803b52eff5c" containerName="keystone-db-sync" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.876916 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="996767c6-2a35-4b93-a779-8803b52eff5c" containerName="keystone-db-sync" Dec 05 21:00:59 crc kubenswrapper[4747]: E1205 21:00:59.876933 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281b36e4-63e9-473f-a0bf-0ad62a307577" containerName="mariadb-database-create" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.876938 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="281b36e4-63e9-473f-a0bf-0ad62a307577" containerName="mariadb-database-create" Dec 05 21:00:59 crc kubenswrapper[4747]: E1205 21:00:59.876948 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843a1ea2-9e25-4649-9375-9b41f492055c" containerName="mariadb-account-create-update" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.876953 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="843a1ea2-9e25-4649-9375-9b41f492055c" containerName="mariadb-account-create-update" Dec 05 21:00:59 crc kubenswrapper[4747]: E1205 21:00:59.876965 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92" containerName="ovn-config" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.876972 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92" containerName="ovn-config" Dec 05 21:00:59 crc 
kubenswrapper[4747]: I1205 21:00:59.877122 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0ac9306-6af1-4745-ba77-046356cdd7fd" containerName="mariadb-account-create-update" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.877133 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="996767c6-2a35-4b93-a779-8803b52eff5c" containerName="keystone-db-sync" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.877150 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="281b36e4-63e9-473f-a0bf-0ad62a307577" containerName="mariadb-database-create" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.877160 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="843a1ea2-9e25-4649-9375-9b41f492055c" containerName="mariadb-account-create-update" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.877173 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da23b71-6d79-4345-87e9-3528332fe50d" containerName="mariadb-account-create-update" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.877183 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="90d5b5dc-0e02-4e8c-833b-9ee0c2fc1a92" containerName="ovn-config" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.877192 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="bafb6071-80cb-4a67-bb33-bc45f069937c" containerName="mariadb-database-create" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.877201 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cb214ab-734f-4671-9959-6f424a9f6ade" containerName="mariadb-database-create" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.877989 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5678f567b5-qkm5w"] Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.878094 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.904503 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7qfnw"] Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.905677 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.909463 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.909875 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.910074 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-75t4l" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.910192 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.915922 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.933613 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7qfnw"] Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.983514 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-credential-keys\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.983874 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v82q4\" (UniqueName: \"kubernetes.io/projected/699235a8-e8aa-49a9-a433-124efa944225-kube-api-access-v82q4\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.983896 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-combined-ca-bundle\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.983940 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9zfx\" (UniqueName: \"kubernetes.io/projected/51211f5d-128b-44db-ab91-4def8f5cf426-kube-api-access-n9zfx\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.983954 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-dns-swift-storage-0\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.983972 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-config\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.984001 
4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-fernet-keys\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.984086 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-config-data\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.984111 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-dns-svc\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.984152 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-ovsdbserver-sb\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.984170 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-scripts\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:00:59 crc kubenswrapper[4747]: I1205 21:00:59.984193 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-ovsdbserver-nb\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.085130 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-credential-keys\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.085186 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v82q4\" (UniqueName: \"kubernetes.io/projected/699235a8-e8aa-49a9-a433-124efa944225-kube-api-access-v82q4\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.085204 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-combined-ca-bundle\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.085241 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-dns-swift-storage-0\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.085260 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9zfx\" (UniqueName: \"kubernetes.io/projected/51211f5d-128b-44db-ab91-4def8f5cf426-kube-api-access-n9zfx\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.085278 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-config\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.085302 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-fernet-keys\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.085363 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-config-data\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.085384 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-dns-svc\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.085411 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-ovsdbserver-sb\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.085429 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-scripts\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.085458 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-ovsdbserver-nb\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.086402 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-ovsdbserver-nb\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.086433 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-dns-swift-storage-0\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.086466 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-dns-svc\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.086947 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-ovsdbserver-sb\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.087603 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-config\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.091797 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-credential-keys\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.095709 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-combined-ca-bundle\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.095938 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-fernet-keys\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.097114 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-scripts\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.100326 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-config-data\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 
05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.148916 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5678f567b5-qkm5w"] Dec 05 21:01:00 crc kubenswrapper[4747]: E1205 21:01:00.149626 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-v82q4], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" podUID="699235a8-e8aa-49a9-a433-124efa944225" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.157383 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v82q4\" (UniqueName: \"kubernetes.io/projected/699235a8-e8aa-49a9-a433-124efa944225-kube-api-access-v82q4\") pod \"dnsmasq-dns-5678f567b5-qkm5w\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.168326 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9zfx\" (UniqueName: \"kubernetes.io/projected/51211f5d-128b-44db-ab91-4def8f5cf426-kube-api-access-n9zfx\") pod \"keystone-bootstrap-7qfnw\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.173652 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-rmcvl"] Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.175025 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.185089 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.185378 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fck4g" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.189601 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-ncncg"] Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.190379 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-config-data\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.190444 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24ec3037-277a-454c-b807-ceb5e626e724-etc-machine-id\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.190473 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-db-sync-config-data\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.190524 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-combined-ca-bundle\") pod 
\"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.190622 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdtnr\" (UniqueName: \"kubernetes.io/projected/24ec3037-277a-454c-b807-ceb5e626e724-kube-api-access-tdtnr\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.190658 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-scripts\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.190950 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.192106 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ncncg" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.196083 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nklx8" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.198765 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.216429 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ncncg"] Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.249800 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rmcvl"] Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.268762 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.277331 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74cd4f877c-wznpr"] Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.278684 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.287323 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-x4x47"] Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.288697 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.291963 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07b59f33-a929-4857-9d4d-d15a58667ba2-db-sync-config-data\") pod \"barbican-db-sync-ncncg\" (UID: \"07b59f33-a929-4857-9d4d-d15a58667ba2\") " pod="openstack/barbican-db-sync-ncncg" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.292002 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-dns-swift-storage-0\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.292039 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdtnr\" (UniqueName: \"kubernetes.io/projected/24ec3037-277a-454c-b807-ceb5e626e724-kube-api-access-tdtnr\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.292068 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-scripts\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.292095 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp6d2\" (UniqueName: \"kubernetes.io/projected/07b59f33-a929-4857-9d4d-d15a58667ba2-kube-api-access-cp6d2\") pod \"barbican-db-sync-ncncg\" (UID: \"07b59f33-a929-4857-9d4d-d15a58667ba2\") " pod="openstack/barbican-db-sync-ncncg" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.292147 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-config\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.292223 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-dns-svc\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.292280 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-ovsdbserver-nb\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.292320 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-config-data\") pod \"cinder-db-sync-rmcvl\" (UID: 
\"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.292350 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24ec3037-277a-454c-b807-ceb5e626e724-etc-machine-id\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.292368 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-db-sync-config-data\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.292392 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07b59f33-a929-4857-9d4d-d15a58667ba2-combined-ca-bundle\") pod \"barbican-db-sync-ncncg\" (UID: \"07b59f33-a929-4857-9d4d-d15a58667ba2\") " pod="openstack/barbican-db-sync-ncncg" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.292417 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-ovsdbserver-sb\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.294205 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24ec3037-277a-454c-b807-ceb5e626e724-etc-machine-id\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.296453 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-combined-ca-bundle\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.296787 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-x4x47"] Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.296874 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz72d\" (UniqueName: \"kubernetes.io/projected/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-kube-api-access-wz72d\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.300461 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-t544n" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.300540 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.300572 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.313462 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-combined-ca-bundle\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.325135 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-scripts\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.326960 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74cd4f877c-wznpr"] Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.345610 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-config-data\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.345990 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-db-sync-config-data\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.359133 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdtnr\" (UniqueName: \"kubernetes.io/projected/24ec3037-277a-454c-b807-ceb5e626e724-kube-api-access-tdtnr\") pod \"cinder-db-sync-rmcvl\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.409211 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-combined-ca-bundle\") pod \"placement-db-sync-x4x47\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.409270 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-dns-svc\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.409299 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f6cx\" (UniqueName: \"kubernetes.io/projected/e08375df-ebe2-42ed-a2ac-19c6365b9f87-kube-api-access-9f6cx\") pod \"placement-db-sync-x4x47\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.409328 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-ovsdbserver-nb\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 
21:01:00.409417 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07b59f33-a929-4857-9d4d-d15a58667ba2-combined-ca-bundle\") pod \"barbican-db-sync-ncncg\" (UID: \"07b59f33-a929-4857-9d4d-d15a58667ba2\") " pod="openstack/barbican-db-sync-ncncg" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.409442 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-ovsdbserver-sb\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.409462 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-scripts\") pod \"placement-db-sync-x4x47\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.409487 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz72d\" (UniqueName: \"kubernetes.io/projected/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-kube-api-access-wz72d\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.409511 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e08375df-ebe2-42ed-a2ac-19c6365b9f87-logs\") pod \"placement-db-sync-x4x47\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.409536 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07b59f33-a929-4857-9d4d-d15a58667ba2-db-sync-config-data\") pod \"barbican-db-sync-ncncg\" (UID: \"07b59f33-a929-4857-9d4d-d15a58667ba2\") " pod="openstack/barbican-db-sync-ncncg" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.409557 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-dns-swift-storage-0\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.409597 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp6d2\" (UniqueName: \"kubernetes.io/projected/07b59f33-a929-4857-9d4d-d15a58667ba2-kube-api-access-cp6d2\") pod \"barbican-db-sync-ncncg\" (UID: \"07b59f33-a929-4857-9d4d-d15a58667ba2\") " pod="openstack/barbican-db-sync-ncncg" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.409621 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-config-data\") pod \"placement-db-sync-x4x47\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.409656 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-config\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.411536 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-ovsdbserver-nb\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.412848 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-dns-svc\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.412916 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-config\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.413768 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-ovsdbserver-sb\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.414818 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-dns-swift-storage-0\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.422778 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07b59f33-a929-4857-9d4d-d15a58667ba2-db-sync-config-data\") pod \"barbican-db-sync-ncncg\" (UID: \"07b59f33-a929-4857-9d4d-d15a58667ba2\") " pod="openstack/barbican-db-sync-ncncg" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.436240 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.436809 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07b59f33-a929-4857-9d4d-d15a58667ba2-combined-ca-bundle\") pod \"barbican-db-sync-ncncg\" (UID: \"07b59f33-a929-4857-9d4d-d15a58667ba2\") " pod="openstack/barbican-db-sync-ncncg" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.443209 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.448845 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.448874 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.457076 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.458034 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz72d\" (UniqueName: \"kubernetes.io/projected/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-kube-api-access-wz72d\") pod \"dnsmasq-dns-74cd4f877c-wznpr\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.459117 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp6d2\" (UniqueName: \"kubernetes.io/projected/07b59f33-a929-4857-9d4d-d15a58667ba2-kube-api-access-cp6d2\") pod \"barbican-db-sync-ncncg\" (UID: \"07b59f33-a929-4857-9d4d-d15a58667ba2\") " pod="openstack/barbican-db-sync-ncncg" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.481001 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.489873 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.502377 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fvckl"] Dec 05 21:01:00 crc kubenswrapper[4747]: E1205 21:01:00.502923 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="500ba135-ea9d-42ba-adb5-487f1646368f" containerName="dnsmasq-dns" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.503014 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="500ba135-ea9d-42ba-adb5-487f1646368f" containerName="dnsmasq-dns" Dec 05 21:01:00 crc kubenswrapper[4747]: E1205 21:01:00.503078 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="500ba135-ea9d-42ba-adb5-487f1646368f" containerName="init" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.503139 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="500ba135-ea9d-42ba-adb5-487f1646368f" containerName="init" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.503392 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="500ba135-ea9d-42ba-adb5-487f1646368f" containerName="dnsmasq-dns" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.504151 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fvckl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.514922 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.515095 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fvckl"] Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.517478 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.517504 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vr8d8" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.521809 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-ovsdbserver-sb\") pod \"500ba135-ea9d-42ba-adb5-487f1646368f\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.524373 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-config\") pod \"500ba135-ea9d-42ba-adb5-487f1646368f\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.524831 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-combined-ca-bundle\") pod \"placement-db-sync-x4x47\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.524927 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcw6l\" (UniqueName: \"kubernetes.io/projected/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-kube-api-access-zcw6l\") pod \"neutron-db-sync-fvckl\" (UID: \"f3e26fb4-e5eb-4a54-abaa-b33101f25f61\") " pod="openstack/neutron-db-sync-fvckl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.525032 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-config-data\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.525172 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-config\") pod \"neutron-db-sync-fvckl\" (UID: \"f3e26fb4-e5eb-4a54-abaa-b33101f25f61\") " pod="openstack/neutron-db-sync-fvckl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.525276 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a8378d3-98bb-4713-8bc6-9527680b5b5e-run-httpd\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.525362 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f6cx\" (UniqueName: 
\"kubernetes.io/projected/e08375df-ebe2-42ed-a2ac-19c6365b9f87-kube-api-access-9f6cx\") pod \"placement-db-sync-x4x47\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.525468 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.525570 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-scripts\") pod \"placement-db-sync-x4x47\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.525673 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-scripts\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.525778 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e08375df-ebe2-42ed-a2ac-19c6365b9f87-logs\") pod \"placement-db-sync-x4x47\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.525857 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-combined-ca-bundle\") pod \"neutron-db-sync-fvckl\" (UID: \"f3e26fb4-e5eb-4a54-abaa-b33101f25f61\") " pod="openstack/neutron-db-sync-fvckl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.525956 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a8378d3-98bb-4713-8bc6-9527680b5b5e-log-httpd\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.526041 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8bvp\" (UniqueName: \"kubernetes.io/projected/6a8378d3-98bb-4713-8bc6-9527680b5b5e-kube-api-access-r8bvp\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.526135 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.526211 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-config-data\") pod \"placement-db-sync-x4x47\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " 
pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.531153 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e08375df-ebe2-42ed-a2ac-19c6365b9f87-logs\") pod \"placement-db-sync-x4x47\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.532095 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-config-data\") pod \"placement-db-sync-x4x47\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.537127 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-scripts\") pod \"placement-db-sync-x4x47\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.537984 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-combined-ca-bundle\") pod \"placement-db-sync-x4x47\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.551733 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f6cx\" (UniqueName: \"kubernetes.io/projected/e08375df-ebe2-42ed-a2ac-19c6365b9f87-kube-api-access-9f6cx\") pod \"placement-db-sync-x4x47\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.585711 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "500ba135-ea9d-42ba-adb5-487f1646368f" (UID: "500ba135-ea9d-42ba-adb5-487f1646368f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.586102 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.589226 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-config" (OuterVolumeSpecName: "config") pod "500ba135-ea9d-42ba-adb5-487f1646368f" (UID: "500ba135-ea9d-42ba-adb5-487f1646368f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.616143 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ncncg" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.633276 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrmlw\" (UniqueName: \"kubernetes.io/projected/500ba135-ea9d-42ba-adb5-487f1646368f-kube-api-access-nrmlw\") pod \"500ba135-ea9d-42ba-adb5-487f1646368f\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.633351 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-ovsdbserver-nb\") pod \"500ba135-ea9d-42ba-adb5-487f1646368f\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.633461 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-dns-svc\") pod \"500ba135-ea9d-42ba-adb5-487f1646368f\" (UID: \"500ba135-ea9d-42ba-adb5-487f1646368f\") " Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.633866 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-scripts\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.633931 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-combined-ca-bundle\") pod \"neutron-db-sync-fvckl\" (UID: \"f3e26fb4-e5eb-4a54-abaa-b33101f25f61\") " pod="openstack/neutron-db-sync-fvckl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.633971 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a8378d3-98bb-4713-8bc6-9527680b5b5e-log-httpd\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.633995 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8bvp\" (UniqueName: \"kubernetes.io/projected/6a8378d3-98bb-4713-8bc6-9527680b5b5e-kube-api-access-r8bvp\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.634036 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.634146 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcw6l\" (UniqueName: \"kubernetes.io/projected/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-kube-api-access-zcw6l\") pod \"neutron-db-sync-fvckl\" (UID: \"f3e26fb4-e5eb-4a54-abaa-b33101f25f61\") " pod="openstack/neutron-db-sync-fvckl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.634177 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-config-data\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.634202 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-config\") pod \"neutron-db-sync-fvckl\" (UID: \"f3e26fb4-e5eb-4a54-abaa-b33101f25f61\") " pod="openstack/neutron-db-sync-fvckl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.634226 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a8378d3-98bb-4713-8bc6-9527680b5b5e-run-httpd\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.634273 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.634371 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.634392 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.637388 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500ba135-ea9d-42ba-adb5-487f1646368f-kube-api-access-nrmlw" (OuterVolumeSpecName: "kube-api-access-nrmlw") pod "500ba135-ea9d-42ba-adb5-487f1646368f" (UID: "500ba135-ea9d-42ba-adb5-487f1646368f"). InnerVolumeSpecName "kube-api-access-nrmlw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.640734 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-combined-ca-bundle\") pod \"neutron-db-sync-fvckl\" (UID: \"f3e26fb4-e5eb-4a54-abaa-b33101f25f61\") " pod="openstack/neutron-db-sync-fvckl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.640744 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a8378d3-98bb-4713-8bc6-9527680b5b5e-log-httpd\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.640760 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a8378d3-98bb-4713-8bc6-9527680b5b5e-run-httpd\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.646251 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.646468 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-scripts\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.646785 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.646929 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-config\") pod \"neutron-db-sync-fvckl\" (UID: \"f3e26fb4-e5eb-4a54-abaa-b33101f25f61\") " pod="openstack/neutron-db-sync-fvckl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.647115 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-config-data\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.658853 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcw6l\" (UniqueName: \"kubernetes.io/projected/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-kube-api-access-zcw6l\") pod \"neutron-db-sync-fvckl\" (UID: \"f3e26fb4-e5eb-4a54-abaa-b33101f25f61\") " pod="openstack/neutron-db-sync-fvckl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.662803 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8bvp\" (UniqueName: \"kubernetes.io/projected/6a8378d3-98bb-4713-8bc6-9527680b5b5e-kube-api-access-r8bvp\") pod \"ceilometer-0\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " 
pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.688948 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "500ba135-ea9d-42ba-adb5-487f1646368f" (UID: "500ba135-ea9d-42ba-adb5-487f1646368f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.692613 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" event={"ID":"500ba135-ea9d-42ba-adb5-487f1646368f","Type":"ContainerDied","Data":"710db5b63b4d3ef43e6c81d44b51ea939a0b48c9f2519e871f3119dc5f8a501a"} Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.692643 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-4bczm" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.692665 4747 scope.go:117] "RemoveContainer" containerID="b62eac0a04a7a445d0be766123c85952bbfb528b8bd03cb53d992450a3a8ac53" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.703919 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "500ba135-ea9d-42ba-adb5-487f1646368f" (UID: "500ba135-ea9d-42ba-adb5-487f1646368f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.707806 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.708229 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5jxck" event={"ID":"7264db97-0a43-4984-94a2-1805f3aec313","Type":"ContainerStarted","Data":"970e2b8c6159ee24f6e1d7c7a385fd9c230a5faf9abb45c253b8c897d74ea335"} Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.747723 4747 scope.go:117] "RemoveContainer" containerID="b7b95c8125fcfad858bf386829e6afe787b7e305d060a024769623fea4c00b5e" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.749382 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrmlw\" (UniqueName: \"kubernetes.io/projected/500ba135-ea9d-42ba-adb5-487f1646368f-kube-api-access-nrmlw\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.749412 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.749426 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/500ba135-ea9d-42ba-adb5-487f1646368f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.761242 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.766415 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-5jxck" podStartSLOduration=2.7265546670000003 podStartE2EDuration="34.766392437s" podCreationTimestamp="2025-12-05 21:00:26 +0000 UTC" firstStartedPulling="2025-12-05 21:00:27.42976492 +0000 UTC m=+1097.897072408" lastFinishedPulling="2025-12-05 21:00:59.46960269 +0000 UTC m=+1129.936910178" observedRunningTime="2025-12-05 21:01:00.753687152 +0000 UTC m=+1131.220994640" watchObservedRunningTime="2025-12-05 21:01:00.766392437 +0000 UTC m=+1131.233699925" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.787279 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.824975 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.849144 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fvckl" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.849827 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v82q4\" (UniqueName: \"kubernetes.io/projected/699235a8-e8aa-49a9-a433-124efa944225-kube-api-access-v82q4\") pod \"699235a8-e8aa-49a9-a433-124efa944225\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.849899 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-ovsdbserver-sb\") pod \"699235a8-e8aa-49a9-a433-124efa944225\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.850039 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-dns-svc\") pod \"699235a8-e8aa-49a9-a433-124efa944225\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.850072 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-ovsdbserver-nb\") pod \"699235a8-e8aa-49a9-a433-124efa944225\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.850090 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-dns-swift-storage-0\") pod \"699235a8-e8aa-49a9-a433-124efa944225\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.850144 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-config\") pod \"699235a8-e8aa-49a9-a433-124efa944225\" (UID: \"699235a8-e8aa-49a9-a433-124efa944225\") " Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.851374 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-ovsdbserver-sb" 
(OuterVolumeSpecName: "ovsdbserver-sb") pod "699235a8-e8aa-49a9-a433-124efa944225" (UID: "699235a8-e8aa-49a9-a433-124efa944225"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.852073 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-config" (OuterVolumeSpecName: "config") pod "699235a8-e8aa-49a9-a433-124efa944225" (UID: "699235a8-e8aa-49a9-a433-124efa944225"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.852225 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "699235a8-e8aa-49a9-a433-124efa944225" (UID: "699235a8-e8aa-49a9-a433-124efa944225"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.852786 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "699235a8-e8aa-49a9-a433-124efa944225" (UID: "699235a8-e8aa-49a9-a433-124efa944225"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.853031 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "699235a8-e8aa-49a9-a433-124efa944225" (UID: "699235a8-e8aa-49a9-a433-124efa944225"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.858987 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/699235a8-e8aa-49a9-a433-124efa944225-kube-api-access-v82q4" (OuterVolumeSpecName: "kube-api-access-v82q4") pod "699235a8-e8aa-49a9-a433-124efa944225" (UID: "699235a8-e8aa-49a9-a433-124efa944225"). InnerVolumeSpecName "kube-api-access-v82q4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.918909 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7qfnw"] Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.952973 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.953007 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.953019 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.953031 4747 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.953045 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699235a8-e8aa-49a9-a433-124efa944225-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:00 crc kubenswrapper[4747]: I1205 21:01:00.953055 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v82q4\" (UniqueName: \"kubernetes.io/projected/699235a8-e8aa-49a9-a433-124efa944225-kube-api-access-v82q4\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.072124 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-4bczm"] Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.082043 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-4bczm"] Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.091197 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74cd4f877c-wznpr"] Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.166094 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ncncg"] Dec 05 21:01:01 crc kubenswrapper[4747]: W1205 21:01:01.180694 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07b59f33_a929_4857_9d4d_d15a58667ba2.slice/crio-800e1c99235c922a189cf12f43979100978d11516460d424c384fbb4bb3e52a8 WatchSource:0}: Error finding container 800e1c99235c922a189cf12f43979100978d11516460d424c384fbb4bb3e52a8: Status 404 returned error can't find the container with id 800e1c99235c922a189cf12f43979100978d11516460d424c384fbb4bb3e52a8 Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.194573 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rmcvl"] Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.338319 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:01:01 crc kubenswrapper[4747]: W1205 21:01:01.354701 4747 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a8378d3_98bb_4713_8bc6_9527680b5b5e.slice/crio-8f37c8ab1af233a362ed1afc3b7d649d605d18c90b596edd18566586818a9e23 WatchSource:0}: Error finding container 8f37c8ab1af233a362ed1afc3b7d649d605d18c90b596edd18566586818a9e23: Status 404 returned error can't find the container with id 8f37c8ab1af233a362ed1afc3b7d649d605d18c90b596edd18566586818a9e23 Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.424568 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-x4x47"] Dec 05 21:01:01 crc kubenswrapper[4747]: W1205 21:01:01.428214 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode08375df_ebe2_42ed_a2ac_19c6365b9f87.slice/crio-c7449ada4470e35f75bf728c5d31038c546f4329eb4fcf002bfc59e574a8425c WatchSource:0}: Error finding container c7449ada4470e35f75bf728c5d31038c546f4329eb4fcf002bfc59e574a8425c: Status 404 returned error can't find the container with id c7449ada4470e35f75bf728c5d31038c546f4329eb4fcf002bfc59e574a8425c Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.509464 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fvckl"] Dec 05 21:01:01 crc kubenswrapper[4747]: W1205 21:01:01.529981 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3e26fb4_e5eb_4a54_abaa_b33101f25f61.slice/crio-628ee021d59538c9c8af51a2f7fe4359eb54cfb9c2582d8d24ab8d336f438197 WatchSource:0}: Error finding container 628ee021d59538c9c8af51a2f7fe4359eb54cfb9c2582d8d24ab8d336f438197: Status 404 returned error can't find the container with id 628ee021d59538c9c8af51a2f7fe4359eb54cfb9c2582d8d24ab8d336f438197 Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.718268 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fvckl" event={"ID":"f3e26fb4-e5eb-4a54-abaa-b33101f25f61","Type":"ContainerStarted","Data":"5edad227b6c8fed3039bc0bf322c39145fe67edd7b89c67e006cbbd3ef396f71"} Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.718539 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fvckl" event={"ID":"f3e26fb4-e5eb-4a54-abaa-b33101f25f61","Type":"ContainerStarted","Data":"628ee021d59538c9c8af51a2f7fe4359eb54cfb9c2582d8d24ab8d336f438197"} Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.720497 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ncncg" event={"ID":"07b59f33-a929-4857-9d4d-d15a58667ba2","Type":"ContainerStarted","Data":"800e1c99235c922a189cf12f43979100978d11516460d424c384fbb4bb3e52a8"} Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.725692 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x4x47" event={"ID":"e08375df-ebe2-42ed-a2ac-19c6365b9f87","Type":"ContainerStarted","Data":"c7449ada4470e35f75bf728c5d31038c546f4329eb4fcf002bfc59e574a8425c"} Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.730564 4747 generic.go:334] "Generic (PLEG): container finished" podID="7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd" containerID="b8f28097f0504830cc1291babd60862d87ea1eabd52b29dc6474b4080c8c4480" exitCode=0 Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.730639 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" 
event={"ID":"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd","Type":"ContainerDied","Data":"b8f28097f0504830cc1291babd60862d87ea1eabd52b29dc6474b4080c8c4480"} Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.730663 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" event={"ID":"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd","Type":"ContainerStarted","Data":"e6d1b4dbe39b407287a37548543439ba028b5c4f965386970332d2e050fa8d44"} Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.732195 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a8378d3-98bb-4713-8bc6-9527680b5b5e","Type":"ContainerStarted","Data":"8f37c8ab1af233a362ed1afc3b7d649d605d18c90b596edd18566586818a9e23"} Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.733536 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rmcvl" event={"ID":"24ec3037-277a-454c-b807-ceb5e626e724","Type":"ContainerStarted","Data":"3c1d282941cae44b2658300e4c1257f7c3f6d5cdcab1e62ef316f4879260c8c3"} Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.744623 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fvckl" podStartSLOduration=1.744602259 podStartE2EDuration="1.744602259s" podCreationTimestamp="2025-12-05 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:01.738133568 +0000 UTC m=+1132.205441066" watchObservedRunningTime="2025-12-05 21:01:01.744602259 +0000 UTC m=+1132.211909757" Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.745772 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7qfnw" event={"ID":"51211f5d-128b-44db-ab91-4def8f5cf426","Type":"ContainerStarted","Data":"86aab3d671c993418b87c4d45af8f7bf90b32d71794e98d95c6f839b0889ccf3"} Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.745810 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7qfnw" event={"ID":"51211f5d-128b-44db-ab91-4def8f5cf426","Type":"ContainerStarted","Data":"9476bbcf68cb41eb7690c287858f0e04c66f9a88adcce706e22c8e2ff1326ef4"} Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.745261 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5678f567b5-qkm5w" Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.932024 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="500ba135-ea9d-42ba-adb5-487f1646368f" path="/var/lib/kubelet/pods/500ba135-ea9d-42ba-adb5-487f1646368f/volumes" Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.933032 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5678f567b5-qkm5w"] Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.933062 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5678f567b5-qkm5w"] Dec 05 21:01:01 crc kubenswrapper[4747]: I1205 21:01:01.937122 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7qfnw" podStartSLOduration=2.937101371 podStartE2EDuration="2.937101371s" podCreationTimestamp="2025-12-05 21:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:01.825687673 +0000 UTC m=+1132.292995171" watchObservedRunningTime="2025-12-05 21:01:01.937101371 +0000 UTC m=+1132.404408849" Dec 05 21:01:02 crc kubenswrapper[4747]: I1205 21:01:02.543759 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:01:02 crc kubenswrapper[4747]: I1205 21:01:02.763108 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" event={"ID":"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd","Type":"ContainerStarted","Data":"87416f48ac873d0bcb0bd97477550cbdcc3fa51bee5a65cf47096cc9f529f7e7"} Dec 05 21:01:02 crc kubenswrapper[4747]: I1205 21:01:02.763517 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:02 crc kubenswrapper[4747]: I1205 21:01:02.791680 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" podStartSLOduration=2.791663292 podStartE2EDuration="2.791663292s" podCreationTimestamp="2025-12-05 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:02.78634617 +0000 UTC m=+1133.253653658" watchObservedRunningTime="2025-12-05 21:01:02.791663292 +0000 UTC m=+1133.258970780" Dec 05 21:01:03 crc kubenswrapper[4747]: I1205 21:01:03.853563 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="699235a8-e8aa-49a9-a433-124efa944225" path="/var/lib/kubelet/pods/699235a8-e8aa-49a9-a433-124efa944225/volumes" Dec 05 21:01:06 crc kubenswrapper[4747]: I1205 21:01:06.221642 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:01:06 crc kubenswrapper[4747]: I1205 21:01:06.222051 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:01:06 crc kubenswrapper[4747]: I1205 21:01:06.817929 4747 generic.go:334] "Generic (PLEG): container finished" 
podID="51211f5d-128b-44db-ab91-4def8f5cf426" containerID="86aab3d671c993418b87c4d45af8f7bf90b32d71794e98d95c6f839b0889ccf3" exitCode=0 Dec 05 21:01:06 crc kubenswrapper[4747]: I1205 21:01:06.818065 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7qfnw" event={"ID":"51211f5d-128b-44db-ab91-4def8f5cf426","Type":"ContainerDied","Data":"86aab3d671c993418b87c4d45af8f7bf90b32d71794e98d95c6f839b0889ccf3"} Dec 05 21:01:10 crc kubenswrapper[4747]: I1205 21:01:10.483556 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:01:10 crc kubenswrapper[4747]: I1205 21:01:10.555469 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-wgsk2"] Dec 05 21:01:10 crc kubenswrapper[4747]: I1205 21:01:10.555769 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" podUID="af69412e-abae-40a4-b95c-24ea1b3a9cf5" containerName="dnsmasq-dns" containerID="cri-o://6146e8593917a68bb0ab86fcf38800fdf5cda3e75eb80fe9775e293dc1aa808c" gracePeriod=10 Dec 05 21:01:10 crc kubenswrapper[4747]: I1205 21:01:10.878231 4747 generic.go:334] "Generic (PLEG): container finished" podID="af69412e-abae-40a4-b95c-24ea1b3a9cf5" containerID="6146e8593917a68bb0ab86fcf38800fdf5cda3e75eb80fe9775e293dc1aa808c" exitCode=0 Dec 05 21:01:10 crc kubenswrapper[4747]: I1205 21:01:10.878276 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" event={"ID":"af69412e-abae-40a4-b95c-24ea1b3a9cf5","Type":"ContainerDied","Data":"6146e8593917a68bb0ab86fcf38800fdf5cda3e75eb80fe9775e293dc1aa808c"} Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.293730 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.457776 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-fernet-keys\") pod \"51211f5d-128b-44db-ab91-4def8f5cf426\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.457886 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-combined-ca-bundle\") pod \"51211f5d-128b-44db-ab91-4def8f5cf426\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.457920 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9zfx\" (UniqueName: \"kubernetes.io/projected/51211f5d-128b-44db-ab91-4def8f5cf426-kube-api-access-n9zfx\") pod \"51211f5d-128b-44db-ab91-4def8f5cf426\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.457973 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-credential-keys\") pod \"51211f5d-128b-44db-ab91-4def8f5cf426\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.458043 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-scripts\") pod \"51211f5d-128b-44db-ab91-4def8f5cf426\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.458195 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-config-data\") pod \"51211f5d-128b-44db-ab91-4def8f5cf426\" (UID: \"51211f5d-128b-44db-ab91-4def8f5cf426\") " Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.463876 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "51211f5d-128b-44db-ab91-4def8f5cf426" (UID: "51211f5d-128b-44db-ab91-4def8f5cf426"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.469230 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51211f5d-128b-44db-ab91-4def8f5cf426-kube-api-access-n9zfx" (OuterVolumeSpecName: "kube-api-access-n9zfx") pod "51211f5d-128b-44db-ab91-4def8f5cf426" (UID: "51211f5d-128b-44db-ab91-4def8f5cf426"). InnerVolumeSpecName "kube-api-access-n9zfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.476554 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "51211f5d-128b-44db-ab91-4def8f5cf426" (UID: "51211f5d-128b-44db-ab91-4def8f5cf426"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.478194 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-scripts" (OuterVolumeSpecName: "scripts") pod "51211f5d-128b-44db-ab91-4def8f5cf426" (UID: "51211f5d-128b-44db-ab91-4def8f5cf426"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.494970 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51211f5d-128b-44db-ab91-4def8f5cf426" (UID: "51211f5d-128b-44db-ab91-4def8f5cf426"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.501776 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-config-data" (OuterVolumeSpecName: "config-data") pod "51211f5d-128b-44db-ab91-4def8f5cf426" (UID: "51211f5d-128b-44db-ab91-4def8f5cf426"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.560322 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.560365 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.560375 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.560388 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9zfx\" (UniqueName: \"kubernetes.io/projected/51211f5d-128b-44db-ab91-4def8f5cf426-kube-api-access-n9zfx\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.560399 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.560408 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51211f5d-128b-44db-ab91-4def8f5cf426-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.897303 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7qfnw" event={"ID":"51211f5d-128b-44db-ab91-4def8f5cf426","Type":"ContainerDied","Data":"9476bbcf68cb41eb7690c287858f0e04c66f9a88adcce706e22c8e2ff1326ef4"} Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.897803 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9476bbcf68cb41eb7690c287858f0e04c66f9a88adcce706e22c8e2ff1326ef4" Dec 05 21:01:12 crc kubenswrapper[4747]: I1205 21:01:12.897373 4747 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7qfnw" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.406927 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7qfnw"] Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.413008 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7qfnw"] Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.478960 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-b64kk"] Dec 05 21:01:13 crc kubenswrapper[4747]: E1205 21:01:13.479431 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51211f5d-128b-44db-ab91-4def8f5cf426" containerName="keystone-bootstrap" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.479449 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="51211f5d-128b-44db-ab91-4def8f5cf426" containerName="keystone-bootstrap" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.479681 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="51211f5d-128b-44db-ab91-4def8f5cf426" containerName="keystone-bootstrap" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.481116 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.486971 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.487053 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.487094 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.487763 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-75t4l" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.488240 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.489686 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b64kk"] Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.580353 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbqmv\" (UniqueName: \"kubernetes.io/projected/4dbdb950-0501-41c3-a040-f8275f8c8d29-kube-api-access-hbqmv\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.580459 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-config-data\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.580614 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-combined-ca-bundle\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 
21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.580761 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-fernet-keys\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.580792 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-credential-keys\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.580870 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-scripts\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.682908 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbqmv\" (UniqueName: \"kubernetes.io/projected/4dbdb950-0501-41c3-a040-f8275f8c8d29-kube-api-access-hbqmv\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.682968 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-config-data\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.683009 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-combined-ca-bundle\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.683051 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-fernet-keys\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.683102 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-credential-keys\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.683141 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-scripts\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.689496 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-credential-keys\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.689637 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-combined-ca-bundle\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.689659 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-fernet-keys\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.690310 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-scripts\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.697209 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-config-data\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.700942 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbqmv\" (UniqueName: \"kubernetes.io/projected/4dbdb950-0501-41c3-a040-f8275f8c8d29-kube-api-access-hbqmv\") pod \"keystone-bootstrap-b64kk\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.826637 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:13 crc kubenswrapper[4747]: I1205 21:01:13.855059 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51211f5d-128b-44db-ab91-4def8f5cf426" path="/var/lib/kubelet/pods/51211f5d-128b-44db-ab91-4def8f5cf426/volumes" Dec 05 21:01:14 crc kubenswrapper[4747]: I1205 21:01:14.099706 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" podUID="af69412e-abae-40a4-b95c-24ea1b3a9cf5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Dec 05 21:01:19 crc kubenswrapper[4747]: I1205 21:01:19.098878 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" podUID="af69412e-abae-40a4-b95c-24ea1b3a9cf5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Dec 05 21:01:19 crc kubenswrapper[4747]: E1205 21:01:19.867213 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f" Dec 05 21:01:19 crc kubenswrapper[4747]: E1205 21:01:19.867718 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cp6d2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-ncncg_openstack(07b59f33-a929-4857-9d4d-d15a58667ba2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 21:01:19 crc kubenswrapper[4747]: E1205 21:01:19.869022 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying 
config: context canceled\"" pod="openstack/barbican-db-sync-ncncg" podUID="07b59f33-a929-4857-9d4d-d15a58667ba2" Dec 05 21:01:19 crc kubenswrapper[4747]: E1205 21:01:19.962089 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f\\\"\"" pod="openstack/barbican-db-sync-ncncg" podUID="07b59f33-a929-4857-9d4d-d15a58667ba2" Dec 05 21:01:20 crc kubenswrapper[4747]: E1205 21:01:20.912907 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2" Dec 05 21:01:20 crc kubenswrapper[4747]: E1205 21:01:20.913048 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tdtnr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-rmcvl_openstack(24ec3037-277a-454c-b807-ceb5e626e724): ErrImagePull: rpc 
error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 21:01:20 crc kubenswrapper[4747]: E1205 21:01:20.914279 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-rmcvl" podUID="24ec3037-277a-454c-b807-ceb5e626e724" Dec 05 21:01:20 crc kubenswrapper[4747]: I1205 21:01:20.966793 4747 generic.go:334] "Generic (PLEG): container finished" podID="7264db97-0a43-4984-94a2-1805f3aec313" containerID="970e2b8c6159ee24f6e1d7c7a385fd9c230a5faf9abb45c253b8c897d74ea335" exitCode=0 Dec 05 21:01:20 crc kubenswrapper[4747]: I1205 21:01:20.966894 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5jxck" event={"ID":"7264db97-0a43-4984-94a2-1805f3aec313","Type":"ContainerDied","Data":"970e2b8c6159ee24f6e1d7c7a385fd9c230a5faf9abb45c253b8c897d74ea335"} Dec 05 21:01:20 crc kubenswrapper[4747]: I1205 21:01:20.974721 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" event={"ID":"af69412e-abae-40a4-b95c-24ea1b3a9cf5","Type":"ContainerDied","Data":"469d07b5c8268adb378bd34a3042f6f5e451a2d59435fc58c31bf5fbef9e7ae1"} Dec 05 21:01:20 crc kubenswrapper[4747]: I1205 21:01:20.974786 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="469d07b5c8268adb378bd34a3042f6f5e451a2d59435fc58c31bf5fbef9e7ae1" Dec 05 21:01:20 crc kubenswrapper[4747]: E1205 21:01:20.976434 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2\\\"\"" pod="openstack/cinder-db-sync-rmcvl" podUID="24ec3037-277a-454c-b807-ceb5e626e724" Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.176064 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.325082 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpttn\" (UniqueName: \"kubernetes.io/projected/af69412e-abae-40a4-b95c-24ea1b3a9cf5-kube-api-access-mpttn\") pod \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.325469 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-ovsdbserver-nb\") pod \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.325543 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-dns-swift-storage-0\") pod \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.325672 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-ovsdbserver-sb\") pod \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.325700 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-config\") pod \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.325745 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-dns-svc\") pod \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\" (UID: \"af69412e-abae-40a4-b95c-24ea1b3a9cf5\") " Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.331096 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af69412e-abae-40a4-b95c-24ea1b3a9cf5-kube-api-access-mpttn" (OuterVolumeSpecName: "kube-api-access-mpttn") pod "af69412e-abae-40a4-b95c-24ea1b3a9cf5" (UID: "af69412e-abae-40a4-b95c-24ea1b3a9cf5"). InnerVolumeSpecName "kube-api-access-mpttn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.356727 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b64kk"] Dec 05 21:01:21 crc kubenswrapper[4747]: W1205 21:01:21.367327 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dbdb950_0501_41c3_a040_f8275f8c8d29.slice/crio-ae781210071d940663914535e1074769ead8b1c0451d3b997833df85aeab1965 WatchSource:0}: Error finding container ae781210071d940663914535e1074769ead8b1c0451d3b997833df85aeab1965: Status 404 returned error can't find the container with id ae781210071d940663914535e1074769ead8b1c0451d3b997833df85aeab1965 Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.381838 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-config" (OuterVolumeSpecName: "config") pod "af69412e-abae-40a4-b95c-24ea1b3a9cf5" (UID: "af69412e-abae-40a4-b95c-24ea1b3a9cf5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.382292 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "af69412e-abae-40a4-b95c-24ea1b3a9cf5" (UID: "af69412e-abae-40a4-b95c-24ea1b3a9cf5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.396627 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "af69412e-abae-40a4-b95c-24ea1b3a9cf5" (UID: "af69412e-abae-40a4-b95c-24ea1b3a9cf5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.403306 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af69412e-abae-40a4-b95c-24ea1b3a9cf5" (UID: "af69412e-abae-40a4-b95c-24ea1b3a9cf5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.417743 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "af69412e-abae-40a4-b95c-24ea1b3a9cf5" (UID: "af69412e-abae-40a4-b95c-24ea1b3a9cf5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.427849 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.427889 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.427907 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.427925 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpttn\" (UniqueName: \"kubernetes.io/projected/af69412e-abae-40a4-b95c-24ea1b3a9cf5-kube-api-access-mpttn\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.427942 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.427954 4747 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af69412e-abae-40a4-b95c-24ea1b3a9cf5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.985892 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b64kk" event={"ID":"4dbdb950-0501-41c3-a040-f8275f8c8d29","Type":"ContainerStarted","Data":"25bb358b728d1ef1998976e60e897459937e0384dfec3ecb00006646b0a949a1"} Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.986223 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b64kk" event={"ID":"4dbdb950-0501-41c3-a040-f8275f8c8d29","Type":"ContainerStarted","Data":"ae781210071d940663914535e1074769ead8b1c0451d3b997833df85aeab1965"} Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.989714 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x4x47" event={"ID":"e08375df-ebe2-42ed-a2ac-19c6365b9f87","Type":"ContainerStarted","Data":"267eaf4271e17858acded1d97dd340e7340fdbc2118868acb48e221d78e8389e"} Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.991442 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a8378d3-98bb-4713-8bc6-9527680b5b5e","Type":"ContainerStarted","Data":"5a441ed0ee436b0d27b11463e276218199b16dbbd90388a25f0d85997a54495e"} Dec 05 21:01:21 crc kubenswrapper[4747]: I1205 21:01:21.991531 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864b648dc7-wgsk2" Dec 05 21:01:22 crc kubenswrapper[4747]: I1205 21:01:22.011076 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-b64kk" podStartSLOduration=9.011054566 podStartE2EDuration="9.011054566s" podCreationTimestamp="2025-12-05 21:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:22.006207885 +0000 UTC m=+1152.473515383" watchObservedRunningTime="2025-12-05 21:01:22.011054566 +0000 UTC m=+1152.478362064" Dec 05 21:01:22 crc kubenswrapper[4747]: I1205 21:01:22.030484 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-x4x47" podStartSLOduration=2.545317132 podStartE2EDuration="22.030458388s" podCreationTimestamp="2025-12-05 21:01:00 +0000 UTC" firstStartedPulling="2025-12-05 21:01:01.430063685 +0000 UTC m=+1131.897371173" lastFinishedPulling="2025-12-05 21:01:20.915204941 +0000 UTC m=+1151.382512429" observedRunningTime="2025-12-05 21:01:22.026425848 +0000 UTC m=+1152.493733346" watchObservedRunningTime="2025-12-05 21:01:22.030458388 +0000 UTC m=+1152.497765916" Dec 05 21:01:22 crc kubenswrapper[4747]: I1205 21:01:22.060856 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-wgsk2"] Dec 05 21:01:22 crc kubenswrapper[4747]: I1205 21:01:22.070565 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-wgsk2"] Dec 05 21:01:22 crc kubenswrapper[4747]: I1205 21:01:22.844568 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-5jxck" Dec 05 21:01:22 crc kubenswrapper[4747]: I1205 21:01:22.954339 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-combined-ca-bundle\") pod \"7264db97-0a43-4984-94a2-1805f3aec313\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " Dec 05 21:01:22 crc kubenswrapper[4747]: I1205 21:01:22.954434 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xhhl\" (UniqueName: \"kubernetes.io/projected/7264db97-0a43-4984-94a2-1805f3aec313-kube-api-access-5xhhl\") pod \"7264db97-0a43-4984-94a2-1805f3aec313\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " Dec 05 21:01:22 crc kubenswrapper[4747]: I1205 21:01:22.954515 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-config-data\") pod \"7264db97-0a43-4984-94a2-1805f3aec313\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " Dec 05 21:01:22 crc kubenswrapper[4747]: I1205 21:01:22.954683 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-db-sync-config-data\") pod \"7264db97-0a43-4984-94a2-1805f3aec313\" (UID: \"7264db97-0a43-4984-94a2-1805f3aec313\") " Dec 05 21:01:22 crc kubenswrapper[4747]: I1205 21:01:22.971827 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7264db97-0a43-4984-94a2-1805f3aec313-kube-api-access-5xhhl" (OuterVolumeSpecName: "kube-api-access-5xhhl") pod "7264db97-0a43-4984-94a2-1805f3aec313" (UID: 
"7264db97-0a43-4984-94a2-1805f3aec313"). InnerVolumeSpecName "kube-api-access-5xhhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:22 crc kubenswrapper[4747]: I1205 21:01:22.972017 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7264db97-0a43-4984-94a2-1805f3aec313" (UID: "7264db97-0a43-4984-94a2-1805f3aec313"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:22 crc kubenswrapper[4747]: I1205 21:01:22.991665 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7264db97-0a43-4984-94a2-1805f3aec313" (UID: "7264db97-0a43-4984-94a2-1805f3aec313"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.009173 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a8378d3-98bb-4713-8bc6-9527680b5b5e","Type":"ContainerStarted","Data":"08feed3cbd23e0d682c531ac5b5ba22dfd47382048f20ec458d32143f1f8557c"} Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.012440 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-5jxck" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.012697 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5jxck" event={"ID":"7264db97-0a43-4984-94a2-1805f3aec313","Type":"ContainerDied","Data":"282b8e91c40bc1391af70c35b22b340d649344744cc49a3a75ba69e55a8a03b5"} Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.012767 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="282b8e91c40bc1391af70c35b22b340d649344744cc49a3a75ba69e55a8a03b5" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.028988 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-config-data" (OuterVolumeSpecName: "config-data") pod "7264db97-0a43-4984-94a2-1805f3aec313" (UID: "7264db97-0a43-4984-94a2-1805f3aec313"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.056387 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.056426 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xhhl\" (UniqueName: \"kubernetes.io/projected/7264db97-0a43-4984-94a2-1805f3aec313-kube-api-access-5xhhl\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.056440 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.056450 4747 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7264db97-0a43-4984-94a2-1805f3aec313-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.402778 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-6c54h"] Dec 05 21:01:23 crc kubenswrapper[4747]: E1205 21:01:23.403166 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af69412e-abae-40a4-b95c-24ea1b3a9cf5" containerName="dnsmasq-dns" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.403181 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="af69412e-abae-40a4-b95c-24ea1b3a9cf5" containerName="dnsmasq-dns" Dec 05 21:01:23 crc kubenswrapper[4747]: E1205 21:01:23.403206 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7264db97-0a43-4984-94a2-1805f3aec313" containerName="glance-db-sync" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.403213 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7264db97-0a43-4984-94a2-1805f3aec313" containerName="glance-db-sync" Dec 05 21:01:23 crc kubenswrapper[4747]: E1205 21:01:23.403226 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af69412e-abae-40a4-b95c-24ea1b3a9cf5" containerName="init" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.403233 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="af69412e-abae-40a4-b95c-24ea1b3a9cf5" containerName="init" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.403434 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7264db97-0a43-4984-94a2-1805f3aec313" containerName="glance-db-sync" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.403456 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="af69412e-abae-40a4-b95c-24ea1b3a9cf5" containerName="dnsmasq-dns" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.404468 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.417306 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-6c54h"] Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.565475 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-config\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.565671 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-ovsdbserver-sb\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.565900 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-ovsdbserver-nb\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.566048 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg9j4\" (UniqueName: \"kubernetes.io/projected/5a9a5293-0f0b-43ea-823b-08e7e817434c-kube-api-access-gg9j4\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.566398 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-dns-swift-storage-0\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.566472 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-dns-svc\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.668238 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-ovsdbserver-nb\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.668293 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg9j4\" (UniqueName: \"kubernetes.io/projected/5a9a5293-0f0b-43ea-823b-08e7e817434c-kube-api-access-gg9j4\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.668377 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-dns-swift-storage-0\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.668402 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-dns-svc\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.668439 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-config\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.668478 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-ovsdbserver-sb\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.669396 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-ovsdbserver-sb\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.669669 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-dns-swift-storage-0\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.669802 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-dns-svc\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.669945 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-config\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.671311 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-ovsdbserver-nb\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.693233 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg9j4\" (UniqueName: 
\"kubernetes.io/projected/5a9a5293-0f0b-43ea-823b-08e7e817434c-kube-api-access-gg9j4\") pod \"dnsmasq-dns-74fd8b655f-6c54h\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.730189 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:23 crc kubenswrapper[4747]: I1205 21:01:23.855245 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af69412e-abae-40a4-b95c-24ea1b3a9cf5" path="/var/lib/kubelet/pods/af69412e-abae-40a4-b95c-24ea1b3a9cf5/volumes" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.201609 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-6c54h"] Dec 05 21:01:24 crc kubenswrapper[4747]: W1205 21:01:24.209334 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a9a5293_0f0b_43ea_823b_08e7e817434c.slice/crio-6af5e69a49204dedca4b38ece23a4b2bedf4a00fe05cad2a83bcedaa2bda2f17 WatchSource:0}: Error finding container 6af5e69a49204dedca4b38ece23a4b2bedf4a00fe05cad2a83bcedaa2bda2f17: Status 404 returned error can't find the container with id 6af5e69a49204dedca4b38ece23a4b2bedf4a00fe05cad2a83bcedaa2bda2f17 Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.312373 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.315937 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.317967 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.318315 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tcfb4" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.318458 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.322002 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.485810 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.485903 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.485961 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p67db\" (UniqueName: \"kubernetes.io/projected/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-kube-api-access-p67db\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " 
pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.486051 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.486122 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.486180 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.486282 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-logs\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.553901 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.555305 4747 util.go:30] "No sandbox for pod can be found. 
Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.555305 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.558410 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.562017 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.591139 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.591527 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p67db\" (UniqueName: \"kubernetes.io/projected/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-kube-api-access-p67db\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.591697 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.592713 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.592803 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.592861 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-logs\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.592911 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.591530 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205
21:01:24.595086 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.595385 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-logs\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.601895 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.602778 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.603451 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.611188 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p67db\" (UniqueName: \"kubernetes.io/projected/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-kube-api-access-p67db\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.626404 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.706722 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.706770 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjrv8\" (UniqueName: \"kubernetes.io/projected/88372402-4727-4c78-a3a9-f7d518868cc9-kube-api-access-wjrv8\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.706826 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88372402-4727-4c78-a3a9-f7d518868cc9-logs\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.706848 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.706869 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.706906 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88372402-4727-4c78-a3a9-f7d518868cc9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.706925 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.808657 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.808714 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjrv8\" (UniqueName: \"kubernetes.io/projected/88372402-4727-4c78-a3a9-f7d518868cc9-kube-api-access-wjrv8\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.808765 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88372402-4727-4c78-a3a9-f7d518868cc9-logs\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.808793 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.808826 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.808868 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88372402-4727-4c78-a3a9-f7d518868cc9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.808896 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.809018 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.809498 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88372402-4727-4c78-a3a9-f7d518868cc9-logs\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.809853 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88372402-4727-4c78-a3a9-f7d518868cc9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.814663 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.814668 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.824264 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.836377 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjrv8\" (UniqueName: \"kubernetes.io/projected/88372402-4727-4c78-a3a9-f7d518868cc9-kube-api-access-wjrv8\") pod 
\"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.841997 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.860853 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:24 crc kubenswrapper[4747]: I1205 21:01:24.894056 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:25 crc kubenswrapper[4747]: I1205 21:01:25.037556 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" event={"ID":"5a9a5293-0f0b-43ea-823b-08e7e817434c","Type":"ContainerStarted","Data":"6af5e69a49204dedca4b38ece23a4b2bedf4a00fe05cad2a83bcedaa2bda2f17"} Dec 05 21:01:25 crc kubenswrapper[4747]: I1205 21:01:25.748986 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:01:25 crc kubenswrapper[4747]: I1205 21:01:25.851897 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:01:25 crc kubenswrapper[4747]: I1205 21:01:25.964733 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:01:26 crc kubenswrapper[4747]: I1205 21:01:26.050631 4747 generic.go:334] "Generic (PLEG): container finished" podID="5a9a5293-0f0b-43ea-823b-08e7e817434c" containerID="33ae11b91c010cdf022acbea3229aa07a69fd3291675a10ddd780efb5a99b78d" exitCode=0 Dec 05 21:01:26 crc kubenswrapper[4747]: I1205 21:01:26.050695 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" event={"ID":"5a9a5293-0f0b-43ea-823b-08e7e817434c","Type":"ContainerDied","Data":"33ae11b91c010cdf022acbea3229aa07a69fd3291675a10ddd780efb5a99b78d"} Dec 05 21:01:26 crc kubenswrapper[4747]: I1205 21:01:26.062392 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:01:26 crc kubenswrapper[4747]: E1205 21:01:26.481904 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dbdb950_0501_41c3_a040_f8275f8c8d29.slice/crio-25bb358b728d1ef1998976e60e897459937e0384dfec3ecb00006646b0a949a1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dbdb950_0501_41c3_a040_f8275f8c8d29.slice/crio-conmon-25bb358b728d1ef1998976e60e897459937e0384dfec3ecb00006646b0a949a1.scope\": RecentStats: unable to find data in memory cache]" Dec 05 21:01:27 crc kubenswrapper[4747]: I1205 21:01:27.058972 4747 generic.go:334] "Generic (PLEG): container finished" podID="4dbdb950-0501-41c3-a040-f8275f8c8d29" containerID="25bb358b728d1ef1998976e60e897459937e0384dfec3ecb00006646b0a949a1" exitCode=0
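The interleaved "Generic (PLEG): container finished" and "SyncLoop (PLEG): event for pod" lines come from the Pod Lifecycle Event Generator: the kubelet relists container state from CRI-O, diffs the result against its cached view, and feeds ContainerStarted/ContainerDied events into the sync loop; exitCode=0 marks the one-shot containers here finishing normally. The cadvisor "Partial failure ... RecentStats: unable to find data in memory cache" error is usually transient noise from the same churn, logged when a cgroup exits before cadvisor has cached any recent samples for it. A rough Go sketch of the relist-and-diff idea, with made-up types:

    // Hypothetical PLEG-style diff: compare two relists and emit events.
    package main

    import "fmt"

    type snapshot map[string]bool // containerID -> running?

    func diff(pod string, old, cur snapshot) {
        for id := range old {
            if old[id] && !cur[id] {
                fmt.Printf("pod %s: ContainerDied %s\n", pod, id)
            }
        }
        for id := range cur {
            if cur[id] && !old[id] {
                fmt.Printf("pod %s: ContainerStarted %s\n", pod, id)
            }
        }
    }

    func main() {
        old := snapshot{"25bb358b728d": true}
        cur := snapshot{"25bb358b728d": false}
        diff("openstack/keystone-bootstrap-b64kk", old, cur)
    }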
Dec 05 21:01:27 crc kubenswrapper[4747]: I1205 21:01:27.059149 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b64kk" event={"ID":"4dbdb950-0501-41c3-a040-f8275f8c8d29","Type":"ContainerDied","Data":"25bb358b728d1ef1998976e60e897459937e0384dfec3ecb00006646b0a949a1"} Dec 05 21:01:27 crc kubenswrapper[4747]: I1205 21:01:27.061267 4747 generic.go:334] "Generic (PLEG): container finished" podID="e08375df-ebe2-42ed-a2ac-19c6365b9f87" containerID="267eaf4271e17858acded1d97dd340e7340fdbc2118868acb48e221d78e8389e" exitCode=0 Dec 05 21:01:27 crc kubenswrapper[4747]: I1205 21:01:27.061294 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x4x47" event={"ID":"e08375df-ebe2-42ed-a2ac-19c6365b9f87","Type":"ContainerDied","Data":"267eaf4271e17858acded1d97dd340e7340fdbc2118868acb48e221d78e8389e"} Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.092809 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x4x47" event={"ID":"e08375df-ebe2-42ed-a2ac-19c6365b9f87","Type":"ContainerDied","Data":"c7449ada4470e35f75bf728c5d31038c546f4329eb4fcf002bfc59e574a8425c"} Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.093424 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7449ada4470e35f75bf728c5d31038c546f4329eb4fcf002bfc59e574a8425c" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.094124 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88372402-4727-4c78-a3a9-f7d518868cc9","Type":"ContainerStarted","Data":"54ed20f2f40072c51e32d02d5a7d1f93e031ac8f5c6cfa8ce7185583fffd3899"} Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.095626 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ec8ccaa3-51d7-49d1-871a-ca430d2b545f","Type":"ContainerStarted","Data":"f21ad9f555bd0a5d69acd022234aa47740b4dec34eba9fd392810e44fca1ac04"} Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.097193 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b64kk" event={"ID":"4dbdb950-0501-41c3-a040-f8275f8c8d29","Type":"ContainerDied","Data":"ae781210071d940663914535e1074769ead8b1c0451d3b997833df85aeab1965"} Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.097221 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae781210071d940663914535e1074769ead8b1c0451d3b997833df85aeab1965" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.098909 4747 generic.go:334] "Generic (PLEG): container finished" podID="f3e26fb4-e5eb-4a54-abaa-b33101f25f61" containerID="5edad227b6c8fed3039bc0bf322c39145fe67edd7b89c67e006cbbd3ef396f71" exitCode=0 Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.098971 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fvckl" event={"ID":"f3e26fb4-e5eb-4a54-abaa-b33101f25f61","Type":"ContainerDied","Data":"5edad227b6c8fed3039bc0bf322c39145fe67edd7b89c67e006cbbd3ef396f71"} Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.214743 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.249473 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.331577 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-scripts\") pod \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.331768 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-config-data\") pod \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.332043 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbqmv\" (UniqueName: \"kubernetes.io/projected/4dbdb950-0501-41c3-a040-f8275f8c8d29-kube-api-access-hbqmv\") pod \"4dbdb950-0501-41c3-a040-f8275f8c8d29\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.332103 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f6cx\" (UniqueName: \"kubernetes.io/projected/e08375df-ebe2-42ed-a2ac-19c6365b9f87-kube-api-access-9f6cx\") pod \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.332140 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e08375df-ebe2-42ed-a2ac-19c6365b9f87-logs\") pod \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.332186 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-combined-ca-bundle\") pod \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\" (UID: \"e08375df-ebe2-42ed-a2ac-19c6365b9f87\") " Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.332251 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-scripts\") pod \"4dbdb950-0501-41c3-a040-f8275f8c8d29\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.336830 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e08375df-ebe2-42ed-a2ac-19c6365b9f87-logs" (OuterVolumeSpecName: "logs") pod "e08375df-ebe2-42ed-a2ac-19c6365b9f87" (UID: "e08375df-ebe2-42ed-a2ac-19c6365b9f87"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.338530 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-scripts" (OuterVolumeSpecName: "scripts") pod "e08375df-ebe2-42ed-a2ac-19c6365b9f87" (UID: "e08375df-ebe2-42ed-a2ac-19c6365b9f87"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.338692 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-scripts" (OuterVolumeSpecName: "scripts") pod "4dbdb950-0501-41c3-a040-f8275f8c8d29" (UID: "4dbdb950-0501-41c3-a040-f8275f8c8d29"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.340551 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dbdb950-0501-41c3-a040-f8275f8c8d29-kube-api-access-hbqmv" (OuterVolumeSpecName: "kube-api-access-hbqmv") pod "4dbdb950-0501-41c3-a040-f8275f8c8d29" (UID: "4dbdb950-0501-41c3-a040-f8275f8c8d29"). InnerVolumeSpecName "kube-api-access-hbqmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.343857 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e08375df-ebe2-42ed-a2ac-19c6365b9f87-kube-api-access-9f6cx" (OuterVolumeSpecName: "kube-api-access-9f6cx") pod "e08375df-ebe2-42ed-a2ac-19c6365b9f87" (UID: "e08375df-ebe2-42ed-a2ac-19c6365b9f87"). InnerVolumeSpecName "kube-api-access-9f6cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.365625 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-config-data" (OuterVolumeSpecName: "config-data") pod "e08375df-ebe2-42ed-a2ac-19c6365b9f87" (UID: "e08375df-ebe2-42ed-a2ac-19c6365b9f87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
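Teardown mirrors setup in reverse. Once placement-db-sync-x4x47 and keystone-bootstrap-b64kk exit, reconciler_common.go:159 starts UnmountVolume for each of their volumes, operation_generator.go:803 confirms UnmountVolume.TearDown, and reconciler_common.go:293 finally records "Volume detached ... DevicePath \"\"". (Roughly, OuterVolumeSpecName is the volume name as the pod spec references it and InnerVolumeSpecName is the name the plugin operates on; for these secret and projected volumes the two coincide.) Since several pods are torn down in the same second, a small standalone helper such as the following can group the teardown traffic by pod UID; it is an analysis aid written for this capture, not kubelet code, and reads the journal text on stdin (e.g. piped from journalctl).

    // Group UnmountVolume / TearDown / "Volume detached" lines by pod UID.
    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
        "strings"
    )

    var uid = regexp.MustCompile(`[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}`)

    func main() {
        counts := map[string]int{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be very long
        for sc.Scan() {
            line := sc.Text()
            if !strings.Contains(line, "UnmountVolume") && !strings.Contains(line, "Volume detached") {
                continue
            }
            if m := uid.FindString(line); m != "" {
                counts[m]++
            }
        }
        for podUID, n := range counts {
            fmt.Printf("%s: %d teardown events\n", podUID, n)
        }
    }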
Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.368808 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e08375df-ebe2-42ed-a2ac-19c6365b9f87" (UID: "e08375df-ebe2-42ed-a2ac-19c6365b9f87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.434093 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-fernet-keys\") pod \"4dbdb950-0501-41c3-a040-f8275f8c8d29\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.434149 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-combined-ca-bundle\") pod \"4dbdb950-0501-41c3-a040-f8275f8c8d29\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.434188 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-config-data\") pod \"4dbdb950-0501-41c3-a040-f8275f8c8d29\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.434218 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-credential-keys\") pod \"4dbdb950-0501-41c3-a040-f8275f8c8d29\" (UID: \"4dbdb950-0501-41c3-a040-f8275f8c8d29\") " Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.434545 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.434559 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbqmv\" (UniqueName: \"kubernetes.io/projected/4dbdb950-0501-41c3-a040-f8275f8c8d29-kube-api-access-hbqmv\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.434568 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f6cx\" (UniqueName: \"kubernetes.io/projected/e08375df-ebe2-42ed-a2ac-19c6365b9f87-kube-api-access-9f6cx\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.434589 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e08375df-ebe2-42ed-a2ac-19c6365b9f87-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.434597 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.434605 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.434613 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e08375df-ebe2-42ed-a2ac-19c6365b9f87-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.438421 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod 
"4dbdb950-0501-41c3-a040-f8275f8c8d29" (UID: "4dbdb950-0501-41c3-a040-f8275f8c8d29"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.438453 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4dbdb950-0501-41c3-a040-f8275f8c8d29" (UID: "4dbdb950-0501-41c3-a040-f8275f8c8d29"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.459301 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-config-data" (OuterVolumeSpecName: "config-data") pod "4dbdb950-0501-41c3-a040-f8275f8c8d29" (UID: "4dbdb950-0501-41c3-a040-f8275f8c8d29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.465039 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dbdb950-0501-41c3-a040-f8275f8c8d29" (UID: "4dbdb950-0501-41c3-a040-f8275f8c8d29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.535601 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.535639 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.535650 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:29 crc kubenswrapper[4747]: I1205 21:01:29.535658 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dbdb950-0501-41c3-a040-f8275f8c8d29-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.119131 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" event={"ID":"5a9a5293-0f0b-43ea-823b-08e7e817434c","Type":"ContainerStarted","Data":"1400c9ca9e8699c7631b63ea746b660171bff6a00ee4c1db61e1474f89ecae39"} Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.119440 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.127640 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a8378d3-98bb-4713-8bc6-9527680b5b5e","Type":"ContainerStarted","Data":"178e0c49d4b5f8087dd3574658f56ce8a0b94f216fb8803f8d1cee7c9d6b7c00"} Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.130334 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"88372402-4727-4c78-a3a9-f7d518868cc9","Type":"ContainerStarted","Data":"02b97a1906d9a5645412cd730a1ff13070429d0f5a6b98fab03814b0724053f1"} Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.132315 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b64kk" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.132976 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ec8ccaa3-51d7-49d1-871a-ca430d2b545f","Type":"ContainerStarted","Data":"7720c471d97208b26641982a8a4d3ba37f5ceee5e74759e9638e9ccf7d4a4420"} Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.133125 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-x4x47" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.142904 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" podStartSLOduration=7.142874647 podStartE2EDuration="7.142874647s" podCreationTimestamp="2025-12-05 21:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:30.138546531 +0000 UTC m=+1160.605854029" watchObservedRunningTime="2025-12-05 21:01:30.142874647 +0000 UTC m=+1160.610182155" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.421877 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-79977d44cb-v5vtt"] Dec 05 21:01:30 crc kubenswrapper[4747]: E1205 21:01:30.422515 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08375df-ebe2-42ed-a2ac-19c6365b9f87" containerName="placement-db-sync" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.422533 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08375df-ebe2-42ed-a2ac-19c6365b9f87" containerName="placement-db-sync" Dec 05 21:01:30 crc kubenswrapper[4747]: E1205 21:01:30.422558 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbdb950-0501-41c3-a040-f8275f8c8d29" containerName="keystone-bootstrap" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.422565 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbdb950-0501-41c3-a040-f8275f8c8d29" containerName="keystone-bootstrap" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.422764 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dbdb950-0501-41c3-a040-f8275f8c8d29" containerName="keystone-bootstrap" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.422791 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e08375df-ebe2-42ed-a2ac-19c6365b9f87" containerName="placement-db-sync" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.423431 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.435885 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-646c94b678-5975b"]
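Two bookkeeping details in this stretch are worth decoding. pod_startup_latency_tracker.go:104 reports podStartSLOduration=7.142874647 for the dnsmasq pod: the time from podCreationTimestamp (21:01:23) to watchObservedRunningTime (21:01:30.142874647), with the zeroed firstStartedPulling/lastFinishedPulling showing that no image pull contributed; the m=+1160.6... suffixes are Go's monotonic-clock offsets since the kubelet process started. The cpu_manager/memory_manager "RemoveStaleState" lines immediately after show the resource managers purging per-container allocation state left by the deleted placement-db-sync and keystone-bootstrap containers before the replacement keystone and placement pods are admitted. Checking the SLO arithmetic with a standalone snippet (not kubelet code):

    // Recompute podStartE2EDuration from the two timestamps in the log line.
    package main

    import (
        "fmt"
        "time"
    )

    func must(t time.Time, err error) time.Time {
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        // time.Parse accepts fractional seconds in the input even when the
        // layout omits them.
        const layout = "2006-01-02 15:04:05 -0700 MST"
        created := must(time.Parse(layout, "2025-12-05 21:01:23 +0000 UTC"))
        running := must(time.Parse(layout, "2025-12-05 21:01:30.142874647 +0000 UTC"))
        fmt.Println(running.Sub(created)) // 7.142874647s, matching podStartE2EDuration
    }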
Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.437551 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.438704 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.438739 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.438781 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.438865 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.438953 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-75t4l" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.439235 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.459120 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.459435 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.459648 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-t544n" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.459858 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.460043 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.460186 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-79977d44cb-v5vtt"] Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.461112 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-config-data\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.461186 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-public-tls-certs\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.461226 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-public-tls-certs\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.461250 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjnzm\" (UniqueName: 
\"kubernetes.io/projected/466afd16-6e7d-42fd-bd82-cabab660b344-kube-api-access-vjnzm\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.461291 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-fernet-keys\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.461334 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-combined-ca-bundle\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.461362 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-scripts\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.461395 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-combined-ca-bundle\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.461428 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/466afd16-6e7d-42fd-bd82-cabab660b344-logs\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.461450 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-internal-tls-certs\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.461491 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-scripts\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.461516 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-config-data\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.461541 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xkgnx\" (UniqueName: \"kubernetes.io/projected/8be50198-f9c5-4e90-bfa0-d33b502278b7-kube-api-access-xkgnx\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.461618 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-internal-tls-certs\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.461648 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-credential-keys\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.476155 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-646c94b678-5975b"] Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.562727 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-config-data\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.562771 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-public-tls-certs\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.562795 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-public-tls-certs\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.562810 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjnzm\" (UniqueName: \"kubernetes.io/projected/466afd16-6e7d-42fd-bd82-cabab660b344-kube-api-access-vjnzm\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.562839 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-fernet-keys\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.562866 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-combined-ca-bundle\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b"
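Before these mounts could start, the reflector.go:368 "Caches populated for *v1.Secret" lines above showed the kubelet bringing up a watch for every Secret the new keystone and placement pods reference (config data, scripts, TLS certificates, dockercfg pull secrets); secret-backed volumes are then materialized from that local cache, and the kube-api-access-* volumes are the projected service-account token volumes that give each pod its API credentials. As a standalone illustration of where those Secret references live in a pod spec, the following uses the public k8s.io/api types (it mirrors what the kubelet must resolve before MountVolume can run; it is not kubelet source):

    // List the Secrets a pod's volumes pull in.
    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func secretsReferenced(pod *corev1.Pod) []string {
        var names []string
        for _, v := range pod.Spec.Volumes {
            switch {
            case v.Secret != nil: // plain secret volume, e.g. "config-data"
                names = append(names, v.Secret.SecretName)
            case v.Projected != nil: // e.g. kube-api-access-* token volumes
                for _, src := range v.Projected.Sources {
                    if src.Secret != nil {
                        names = append(names, src.Secret.Name)
                    }
                }
            }
        }
        return names
    }

    func main() {
        pod := &corev1.Pod{Spec: corev1.PodSpec{Volumes: []corev1.Volume{
            {Name: "config-data", VolumeSource: corev1.VolumeSource{
                Secret: &corev1.SecretVolumeSource{SecretName: "keystone-config-data"}}},
        }}}
        fmt.Println(secretsReferenced(pod)) // [keystone-config-data]
    }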
Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.562883 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-scripts\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.562907 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-combined-ca-bundle\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.562930 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/466afd16-6e7d-42fd-bd82-cabab660b344-logs\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.562947 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-internal-tls-certs\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.562980 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-scripts\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.563001 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-config-data\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.563023 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkgnx\" (UniqueName: \"kubernetes.io/projected/8be50198-f9c5-4e90-bfa0-d33b502278b7-kube-api-access-xkgnx\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.563054 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-internal-tls-certs\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.563070 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-credential-keys\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.570033 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-credential-keys\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.572456 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-public-tls-certs\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.573023 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/466afd16-6e7d-42fd-bd82-cabab660b344-logs\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.575165 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-config-data\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.579608 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-scripts\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.579662 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-public-tls-certs\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.579632 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-scripts\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.580911 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-combined-ca-bundle\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.580965 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-config-data\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.583042 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-combined-ca-bundle\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b"
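For anyone scripting against this journal: each line is journald's "Dec 05 21:01:30 crc kubenswrapper[4747]:" prefix followed by a klog header, a severity letter (I/W/E) fused with the MMDD date, the wall-clock time, the emitting PID, and the source file:line, and then a structured key=value message. A sketch of a parser for the header shape seen in this capture follows; klog output is configurable, so treat the expression as an assumption rather than a universal format.

    // Parse the klog header visible in the surrounding lines.
    package main

    import (
        "fmt"
        "regexp"
    )

    var klogLine = regexp.MustCompile(
        `([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+)\s+([\w./-]+:\d+)\]\s+(.*)`)

    func main() {
        line := `I1205 21:01:30.584183 4747 operation_generator.go:637] "MountVolume.SetUp succeeded" pod="openstack/keystone-79977d44cb-v5vtt"`
        m := klogLine.FindStringSubmatch(line)
        if m == nil {
            fmt.Println("no match")
            return
        }
        fmt.Printf("severity=%s date=%s time=%s pid=%s source=%s\nmsg=%s\n",
            m[1], m[2], m[3], m[4], m[5], m[6])
    }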
Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.584183 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-internal-tls-certs\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.584196 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-fernet-keys\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.597128 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-internal-tls-certs\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.600215 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkgnx\" (UniqueName: \"kubernetes.io/projected/8be50198-f9c5-4e90-bfa0-d33b502278b7-kube-api-access-xkgnx\") pod \"keystone-79977d44cb-v5vtt\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.602492 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjnzm\" (UniqueName: \"kubernetes.io/projected/466afd16-6e7d-42fd-bd82-cabab660b344-kube-api-access-vjnzm\") pod \"placement-646c94b678-5975b\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.705383 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fvckl" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.757670 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.765508 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-config\") pod \"f3e26fb4-e5eb-4a54-abaa-b33101f25f61\" (UID: \"f3e26fb4-e5eb-4a54-abaa-b33101f25f61\") " Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.765559 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcw6l\" (UniqueName: \"kubernetes.io/projected/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-kube-api-access-zcw6l\") pod \"f3e26fb4-e5eb-4a54-abaa-b33101f25f61\" (UID: \"f3e26fb4-e5eb-4a54-abaa-b33101f25f61\") " Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.765661 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-combined-ca-bundle\") pod \"f3e26fb4-e5eb-4a54-abaa-b33101f25f61\" (UID: \"f3e26fb4-e5eb-4a54-abaa-b33101f25f61\") " Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.773172 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.783772 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-kube-api-access-zcw6l" (OuterVolumeSpecName: "kube-api-access-zcw6l") pod "f3e26fb4-e5eb-4a54-abaa-b33101f25f61" (UID: "f3e26fb4-e5eb-4a54-abaa-b33101f25f61"). InnerVolumeSpecName "kube-api-access-zcw6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.805418 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3e26fb4-e5eb-4a54-abaa-b33101f25f61" (UID: "f3e26fb4-e5eb-4a54-abaa-b33101f25f61"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.812357 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-config" (OuterVolumeSpecName: "config") pod "f3e26fb4-e5eb-4a54-abaa-b33101f25f61" (UID: "f3e26fb4-e5eb-4a54-abaa-b33101f25f61"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.867886 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcw6l\" (UniqueName: \"kubernetes.io/projected/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-kube-api-access-zcw6l\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.867917 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:30 crc kubenswrapper[4747]: I1205 21:01:30.867933 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3e26fb4-e5eb-4a54-abaa-b33101f25f61-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.154642 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ec8ccaa3-51d7-49d1-871a-ca430d2b545f","Type":"ContainerStarted","Data":"b2fb3ef458020ba4c30ea96d3669b81fe06a52f6c90e0b53621ddd409146b7e8"} Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.155056 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ec8ccaa3-51d7-49d1-871a-ca430d2b545f" containerName="glance-log" containerID="cri-o://7720c471d97208b26641982a8a4d3ba37f5ceee5e74759e9638e9ccf7d4a4420" gracePeriod=30 Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.155594 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ec8ccaa3-51d7-49d1-871a-ca430d2b545f" containerName="glance-httpd" containerID="cri-o://b2fb3ef458020ba4c30ea96d3669b81fe06a52f6c90e0b53621ddd409146b7e8" gracePeriod=30 Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.158612 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88372402-4727-4c78-a3a9-f7d518868cc9","Type":"ContainerStarted","Data":"01699af8c54f987c952dd84f81de28cb848d8c3c4bf27c490eb6157cd8a6dd43"} Dec 05 
21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.158688 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="88372402-4727-4c78-a3a9-f7d518868cc9" containerName="glance-log" containerID="cri-o://02b97a1906d9a5645412cd730a1ff13070429d0f5a6b98fab03814b0724053f1" gracePeriod=30 Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.158746 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="88372402-4727-4c78-a3a9-f7d518868cc9" containerName="glance-httpd" containerID="cri-o://01699af8c54f987c952dd84f81de28cb848d8c3c4bf27c490eb6157cd8a6dd43" gracePeriod=30 Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.162224 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fvckl" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.162296 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fvckl" event={"ID":"f3e26fb4-e5eb-4a54-abaa-b33101f25f61","Type":"ContainerDied","Data":"628ee021d59538c9c8af51a2f7fe4359eb54cfb9c2582d8d24ab8d336f438197"} Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.162315 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="628ee021d59538c9c8af51a2f7fe4359eb54cfb9c2582d8d24ab8d336f438197" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.189136 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.18912169 podStartE2EDuration="8.18912169s" podCreationTimestamp="2025-12-05 21:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:31.179348027 +0000 UTC m=+1161.646655515" watchObservedRunningTime="2025-12-05 21:01:31.18912169 +0000 UTC m=+1161.656429178" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.219500 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.219474284 podStartE2EDuration="8.219474284s" podCreationTimestamp="2025-12-05 21:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:31.20723347 +0000 UTC m=+1161.674540958" watchObservedRunningTime="2025-12-05 21:01:31.219474284 +0000 UTC m=+1161.686781782" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.324050 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-6c54h"] Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.346650 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-vtz47"] Dec 05 21:01:31 crc kubenswrapper[4747]: E1205 21:01:31.347156 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e26fb4-e5eb-4a54-abaa-b33101f25f61" containerName="neutron-db-sync" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.347173 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e26fb4-e5eb-4a54-abaa-b33101f25f61" containerName="neutron-db-sync" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.347371 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3e26fb4-e5eb-4a54-abaa-b33101f25f61" containerName="neutron-db-sync" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.348766 4747 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.356520 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-79977d44cb-v5vtt"] Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.364409 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-646c94b678-5975b"] Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.374784 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-vtz47"] Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.386712 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5d57b6c6f6-xw99h"] Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.388492 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.393542 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.393890 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.394041 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.394165 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vr8d8" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.423219 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d57b6c6f6-xw99h"] Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.482797 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-ovsdbserver-sb\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.482916 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmp4z\" (UniqueName: \"kubernetes.io/projected/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-kube-api-access-vmp4z\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.483029 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-dns-svc\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.483073 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4f87\" (UniqueName: \"kubernetes.io/projected/9cd492b4-4e42-4aab-973b-95e1a363af96-kube-api-access-l4f87\") pod \"neutron-5d57b6c6f6-xw99h\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.483168 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-config\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.483199 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-combined-ca-bundle\") pod \"neutron-5d57b6c6f6-xw99h\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.483314 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-ovsdbserver-nb\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.483439 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-config\") pod \"neutron-5d57b6c6f6-xw99h\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.483543 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-dns-swift-storage-0\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.483569 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-httpd-config\") pod \"neutron-5d57b6c6f6-xw99h\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.483643 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-ovndb-tls-certs\") pod \"neutron-5d57b6c6f6-xw99h\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.586114 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-dns-swift-storage-0\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.586668 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-httpd-config\") pod \"neutron-5d57b6c6f6-xw99h\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.586749 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-ovndb-tls-certs\") pod \"neutron-5d57b6c6f6-xw99h\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.586794 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-ovsdbserver-sb\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.586846 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmp4z\" (UniqueName: \"kubernetes.io/projected/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-kube-api-access-vmp4z\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.586918 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-dns-svc\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.586954 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4f87\" (UniqueName: \"kubernetes.io/projected/9cd492b4-4e42-4aab-973b-95e1a363af96-kube-api-access-l4f87\") pod \"neutron-5d57b6c6f6-xw99h\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.587035 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-config\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.587091 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-combined-ca-bundle\") pod \"neutron-5d57b6c6f6-xw99h\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.587145 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-ovsdbserver-nb\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.587264 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-config\") pod \"neutron-5d57b6c6f6-xw99h\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.589984 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-dns-swift-storage-0\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.590182 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-dns-svc\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.591925 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-ovsdbserver-nb\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.592420 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-ovsdbserver-sb\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.593209 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-config\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.595007 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-combined-ca-bundle\") pod \"neutron-5d57b6c6f6-xw99h\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.595407 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-httpd-config\") pod \"neutron-5d57b6c6f6-xw99h\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.596073 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-config\") pod \"neutron-5d57b6c6f6-xw99h\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.599003 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-ovndb-tls-certs\") pod \"neutron-5d57b6c6f6-xw99h\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.607105 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4f87\" (UniqueName: \"kubernetes.io/projected/9cd492b4-4e42-4aab-973b-95e1a363af96-kube-api-access-l4f87\") pod \"neutron-5d57b6c6f6-xw99h\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " 
pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.611852 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmp4z\" (UniqueName: \"kubernetes.io/projected/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-kube-api-access-vmp4z\") pod \"dnsmasq-dns-849ff95dc5-vtz47\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.827685 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:31 crc kubenswrapper[4747]: I1205 21:01:31.841281 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.170382 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-646c94b678-5975b" event={"ID":"466afd16-6e7d-42fd-bd82-cabab660b344","Type":"ContainerStarted","Data":"44bc0bd3c6f0a47cd14c827e342132010b773d5ce7c6430fcd671f4d1d9ea58a"} Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.170645 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-646c94b678-5975b" event={"ID":"466afd16-6e7d-42fd-bd82-cabab660b344","Type":"ContainerStarted","Data":"2007e611b3a8556fb4e9c90464ab69dd6ee17db9a61483225035a082c5539bb7"} Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.171763 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79977d44cb-v5vtt" event={"ID":"8be50198-f9c5-4e90-bfa0-d33b502278b7","Type":"ContainerStarted","Data":"2331b1b6a8ba25cc12686ca0340fe77d732e59d87aa1e0ab064d2f3d4b62be8f"} Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.171783 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79977d44cb-v5vtt" event={"ID":"8be50198-f9c5-4e90-bfa0-d33b502278b7","Type":"ContainerStarted","Data":"0a1d8bfdb33b3e004235e179e6fdb6dced1c5448f7a86096b352ca7094155c6a"} Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.172725 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.175391 4747 generic.go:334] "Generic (PLEG): container finished" podID="ec8ccaa3-51d7-49d1-871a-ca430d2b545f" containerID="b2fb3ef458020ba4c30ea96d3669b81fe06a52f6c90e0b53621ddd409146b7e8" exitCode=0 Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.175411 4747 generic.go:334] "Generic (PLEG): container finished" podID="ec8ccaa3-51d7-49d1-871a-ca430d2b545f" containerID="7720c471d97208b26641982a8a4d3ba37f5ceee5e74759e9638e9ccf7d4a4420" exitCode=143 Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.175452 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ec8ccaa3-51d7-49d1-871a-ca430d2b545f","Type":"ContainerDied","Data":"b2fb3ef458020ba4c30ea96d3669b81fe06a52f6c90e0b53621ddd409146b7e8"} Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.175475 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ec8ccaa3-51d7-49d1-871a-ca430d2b545f","Type":"ContainerDied","Data":"7720c471d97208b26641982a8a4d3ba37f5ceee5e74759e9638e9ccf7d4a4420"} Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.178472 4747 generic.go:334] "Generic (PLEG): container finished" podID="88372402-4727-4c78-a3a9-f7d518868cc9" 
containerID="01699af8c54f987c952dd84f81de28cb848d8c3c4bf27c490eb6157cd8a6dd43" exitCode=0 Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.178498 4747 generic.go:334] "Generic (PLEG): container finished" podID="88372402-4727-4c78-a3a9-f7d518868cc9" containerID="02b97a1906d9a5645412cd730a1ff13070429d0f5a6b98fab03814b0724053f1" exitCode=143 Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.178600 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88372402-4727-4c78-a3a9-f7d518868cc9","Type":"ContainerDied","Data":"01699af8c54f987c952dd84f81de28cb848d8c3c4bf27c490eb6157cd8a6dd43"} Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.178627 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88372402-4727-4c78-a3a9-f7d518868cc9","Type":"ContainerDied","Data":"02b97a1906d9a5645412cd730a1ff13070429d0f5a6b98fab03814b0724053f1"} Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.178715 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" podUID="5a9a5293-0f0b-43ea-823b-08e7e817434c" containerName="dnsmasq-dns" containerID="cri-o://1400c9ca9e8699c7631b63ea746b660171bff6a00ee4c1db61e1474f89ecae39" gracePeriod=10 Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.212919 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-79977d44cb-v5vtt" podStartSLOduration=2.212899834 podStartE2EDuration="2.212899834s" podCreationTimestamp="2025-12-05 21:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:32.188476517 +0000 UTC m=+1162.655784005" watchObservedRunningTime="2025-12-05 21:01:32.212899834 +0000 UTC m=+1162.680207322" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.446730 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-vtz47"] Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.500812 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d57b6c6f6-xw99h"] Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.588502 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.645868 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjrv8\" (UniqueName: \"kubernetes.io/projected/88372402-4727-4c78-a3a9-f7d518868cc9-kube-api-access-wjrv8\") pod \"88372402-4727-4c78-a3a9-f7d518868cc9\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.645965 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-combined-ca-bundle\") pod \"88372402-4727-4c78-a3a9-f7d518868cc9\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.645992 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"88372402-4727-4c78-a3a9-f7d518868cc9\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.646112 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88372402-4727-4c78-a3a9-f7d518868cc9-httpd-run\") pod \"88372402-4727-4c78-a3a9-f7d518868cc9\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.646138 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-scripts\") pod \"88372402-4727-4c78-a3a9-f7d518868cc9\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.646151 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88372402-4727-4c78-a3a9-f7d518868cc9-logs\") pod \"88372402-4727-4c78-a3a9-f7d518868cc9\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.646176 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-config-data\") pod \"88372402-4727-4c78-a3a9-f7d518868cc9\" (UID: \"88372402-4727-4c78-a3a9-f7d518868cc9\") " Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.648596 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88372402-4727-4c78-a3a9-f7d518868cc9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "88372402-4727-4c78-a3a9-f7d518868cc9" (UID: "88372402-4727-4c78-a3a9-f7d518868cc9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.650948 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88372402-4727-4c78-a3a9-f7d518868cc9-logs" (OuterVolumeSpecName: "logs") pod "88372402-4727-4c78-a3a9-f7d518868cc9" (UID: "88372402-4727-4c78-a3a9-f7d518868cc9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.658668 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88372402-4727-4c78-a3a9-f7d518868cc9-kube-api-access-wjrv8" (OuterVolumeSpecName: "kube-api-access-wjrv8") pod "88372402-4727-4c78-a3a9-f7d518868cc9" (UID: "88372402-4727-4c78-a3a9-f7d518868cc9"). InnerVolumeSpecName "kube-api-access-wjrv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.671186 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "88372402-4727-4c78-a3a9-f7d518868cc9" (UID: "88372402-4727-4c78-a3a9-f7d518868cc9"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.671887 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-scripts" (OuterVolumeSpecName: "scripts") pod "88372402-4727-4c78-a3a9-f7d518868cc9" (UID: "88372402-4727-4c78-a3a9-f7d518868cc9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.695199 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88372402-4727-4c78-a3a9-f7d518868cc9" (UID: "88372402-4727-4c78-a3a9-f7d518868cc9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.735648 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.747507 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-ovsdbserver-sb\") pod \"5a9a5293-0f0b-43ea-823b-08e7e817434c\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.747788 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-config\") pod \"5a9a5293-0f0b-43ea-823b-08e7e817434c\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.747912 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-dns-swift-storage-0\") pod \"5a9a5293-0f0b-43ea-823b-08e7e817434c\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.747957 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-ovsdbserver-nb\") pod \"5a9a5293-0f0b-43ea-823b-08e7e817434c\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.748088 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg9j4\" (UniqueName: \"kubernetes.io/projected/5a9a5293-0f0b-43ea-823b-08e7e817434c-kube-api-access-gg9j4\") pod \"5a9a5293-0f0b-43ea-823b-08e7e817434c\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.748277 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-dns-svc\") pod \"5a9a5293-0f0b-43ea-823b-08e7e817434c\" (UID: \"5a9a5293-0f0b-43ea-823b-08e7e817434c\") " Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.748811 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjrv8\" (UniqueName: \"kubernetes.io/projected/88372402-4727-4c78-a3a9-f7d518868cc9-kube-api-access-wjrv8\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.749130 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.749153 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.749164 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88372402-4727-4c78-a3a9-f7d518868cc9-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.749172 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 
21:01:32.749181 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88372402-4727-4c78-a3a9-f7d518868cc9-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.774122 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-config-data" (OuterVolumeSpecName: "config-data") pod "88372402-4727-4c78-a3a9-f7d518868cc9" (UID: "88372402-4727-4c78-a3a9-f7d518868cc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.787376 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a9a5293-0f0b-43ea-823b-08e7e817434c-kube-api-access-gg9j4" (OuterVolumeSpecName: "kube-api-access-gg9j4") pod "5a9a5293-0f0b-43ea-823b-08e7e817434c" (UID: "5a9a5293-0f0b-43ea-823b-08e7e817434c"). InnerVolumeSpecName "kube-api-access-gg9j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.812008 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.826181 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5a9a5293-0f0b-43ea-823b-08e7e817434c" (UID: "5a9a5293-0f0b-43ea-823b-08e7e817434c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.827702 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5a9a5293-0f0b-43ea-823b-08e7e817434c" (UID: "5a9a5293-0f0b-43ea-823b-08e7e817434c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.850605 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.852519 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg9j4\" (UniqueName: \"kubernetes.io/projected/5a9a5293-0f0b-43ea-823b-08e7e817434c-kube-api-access-gg9j4\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.853079 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.853150 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.853213 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88372402-4727-4c78-a3a9-f7d518868cc9-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.855263 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5a9a5293-0f0b-43ea-823b-08e7e817434c" (UID: "5a9a5293-0f0b-43ea-823b-08e7e817434c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.857473 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-config" (OuterVolumeSpecName: "config") pod "5a9a5293-0f0b-43ea-823b-08e7e817434c" (UID: "5a9a5293-0f0b-43ea-823b-08e7e817434c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.875972 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5a9a5293-0f0b-43ea-823b-08e7e817434c" (UID: "5a9a5293-0f0b-43ea-823b-08e7e817434c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.955197 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.955229 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:32 crc kubenswrapper[4747]: I1205 21:01:32.955239 4747 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a9a5293-0f0b-43ea-823b-08e7e817434c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.189038 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88372402-4727-4c78-a3a9-f7d518868cc9","Type":"ContainerDied","Data":"54ed20f2f40072c51e32d02d5a7d1f93e031ac8f5c6cfa8ce7185583fffd3899"} Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.189099 4747 scope.go:117] "RemoveContainer" containerID="01699af8c54f987c952dd84f81de28cb848d8c3c4bf27c490eb6157cd8a6dd43" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.189099 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.190819 4747 generic.go:334] "Generic (PLEG): container finished" podID="a4ed80ef-e29b-4aeb-b154-18989e8cbb86" containerID="7ec2654bb9711a7c3fcacdb7600067676e28c64c0c70175cddc7b0d8f51aad16" exitCode=0 Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.190855 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" event={"ID":"a4ed80ef-e29b-4aeb-b154-18989e8cbb86","Type":"ContainerDied","Data":"7ec2654bb9711a7c3fcacdb7600067676e28c64c0c70175cddc7b0d8f51aad16"} Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.190871 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" event={"ID":"a4ed80ef-e29b-4aeb-b154-18989e8cbb86","Type":"ContainerStarted","Data":"e6b75b9c89211447332faf82ccd90a85e43584bd394d719256b90b9bc226c23a"} Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.198041 4747 generic.go:334] "Generic (PLEG): container finished" podID="5a9a5293-0f0b-43ea-823b-08e7e817434c" containerID="1400c9ca9e8699c7631b63ea746b660171bff6a00ee4c1db61e1474f89ecae39" exitCode=0 Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.198107 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" event={"ID":"5a9a5293-0f0b-43ea-823b-08e7e817434c","Type":"ContainerDied","Data":"1400c9ca9e8699c7631b63ea746b660171bff6a00ee4c1db61e1474f89ecae39"} Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.198135 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" event={"ID":"5a9a5293-0f0b-43ea-823b-08e7e817434c","Type":"ContainerDied","Data":"6af5e69a49204dedca4b38ece23a4b2bedf4a00fe05cad2a83bcedaa2bda2f17"} Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.198204 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74fd8b655f-6c54h" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.203074 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ncncg" event={"ID":"07b59f33-a929-4857-9d4d-d15a58667ba2","Type":"ContainerStarted","Data":"c35bd7af758906f50a3ef95441de063d132d5f4fbd1e409678069723a5f290ad"} Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.204978 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d57b6c6f6-xw99h" event={"ID":"9cd492b4-4e42-4aab-973b-95e1a363af96","Type":"ContainerStarted","Data":"2209c7fde8e5a3a066d04f834d3a0fb8499763a0ab0c2f3c958e72e0c0fd9ae4"} Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.204999 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d57b6c6f6-xw99h" event={"ID":"9cd492b4-4e42-4aab-973b-95e1a363af96","Type":"ContainerStarted","Data":"aad5cdd236db03ed4a202e85483d55b8de480645e9eb0898981b2270a8f5d212"} Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.205008 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d57b6c6f6-xw99h" event={"ID":"9cd492b4-4e42-4aab-973b-95e1a363af96","Type":"ContainerStarted","Data":"800d6aa5407b264351815e04e7751f094f96b1e2f1ef764dddcb632a1d335700"} Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.205408 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.210328 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-646c94b678-5975b" event={"ID":"466afd16-6e7d-42fd-bd82-cabab660b344","Type":"ContainerStarted","Data":"450bad02541ca9f2045559c849365b5204f10379fb98f7f20ee63d82eaedc639"} Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.210750 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.210866 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-646c94b678-5975b" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.221907 4747 scope.go:117] "RemoveContainer" containerID="02b97a1906d9a5645412cd730a1ff13070429d0f5a6b98fab03814b0724053f1" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.239465 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5d57b6c6f6-xw99h" podStartSLOduration=2.239446877 podStartE2EDuration="2.239446877s" podCreationTimestamp="2025-12-05 21:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:33.235978571 +0000 UTC m=+1163.703286079" watchObservedRunningTime="2025-12-05 21:01:33.239446877 +0000 UTC m=+1163.706754365" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.260206 4747 scope.go:117] "RemoveContainer" containerID="1400c9ca9e8699c7631b63ea746b660171bff6a00ee4c1db61e1474f89ecae39" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.267664 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-ncncg" podStartSLOduration=2.090220236 podStartE2EDuration="33.267643638s" podCreationTimestamp="2025-12-05 21:01:00 +0000 UTC" firstStartedPulling="2025-12-05 21:01:01.187087169 +0000 UTC m=+1131.654394657" lastFinishedPulling="2025-12-05 21:01:32.364510571 +0000 UTC m=+1162.831818059" 
observedRunningTime="2025-12-05 21:01:33.255354072 +0000 UTC m=+1163.722661580" watchObservedRunningTime="2025-12-05 21:01:33.267643638 +0000 UTC m=+1163.734951126" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.282069 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-646c94b678-5975b" podStartSLOduration=3.282047415 podStartE2EDuration="3.282047415s" podCreationTimestamp="2025-12-05 21:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:33.280850326 +0000 UTC m=+1163.748157834" watchObservedRunningTime="2025-12-05 21:01:33.282047415 +0000 UTC m=+1163.749354913" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.335987 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.347289 4747 scope.go:117] "RemoveContainer" containerID="33ae11b91c010cdf022acbea3229aa07a69fd3291675a10ddd780efb5a99b78d" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.365931 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.400107 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-6c54h"] Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.421327 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-6c54h"] Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.462713 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:01:33 crc kubenswrapper[4747]: E1205 21:01:33.463183 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9a5293-0f0b-43ea-823b-08e7e817434c" containerName="dnsmasq-dns" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.463204 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9a5293-0f0b-43ea-823b-08e7e817434c" containerName="dnsmasq-dns" Dec 05 21:01:33 crc kubenswrapper[4747]: E1205 21:01:33.463231 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9a5293-0f0b-43ea-823b-08e7e817434c" containerName="init" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.463238 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9a5293-0f0b-43ea-823b-08e7e817434c" containerName="init" Dec 05 21:01:33 crc kubenswrapper[4747]: E1205 21:01:33.463256 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88372402-4727-4c78-a3a9-f7d518868cc9" containerName="glance-log" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.463267 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="88372402-4727-4c78-a3a9-f7d518868cc9" containerName="glance-log" Dec 05 21:01:33 crc kubenswrapper[4747]: E1205 21:01:33.463276 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88372402-4727-4c78-a3a9-f7d518868cc9" containerName="glance-httpd" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.463286 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="88372402-4727-4c78-a3a9-f7d518868cc9" containerName="glance-httpd" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.463645 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="88372402-4727-4c78-a3a9-f7d518868cc9" containerName="glance-httpd" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.463667 4747 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5a9a5293-0f0b-43ea-823b-08e7e817434c" containerName="dnsmasq-dns" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.463679 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="88372402-4727-4c78-a3a9-f7d518868cc9" containerName="glance-log" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.468451 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.472807 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.473019 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.473804 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.484123 4747 scope.go:117] "RemoveContainer" containerID="1400c9ca9e8699c7631b63ea746b660171bff6a00ee4c1db61e1474f89ecae39" Dec 05 21:01:33 crc kubenswrapper[4747]: E1205 21:01:33.485310 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1400c9ca9e8699c7631b63ea746b660171bff6a00ee4c1db61e1474f89ecae39\": container with ID starting with 1400c9ca9e8699c7631b63ea746b660171bff6a00ee4c1db61e1474f89ecae39 not found: ID does not exist" containerID="1400c9ca9e8699c7631b63ea746b660171bff6a00ee4c1db61e1474f89ecae39" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.485539 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1400c9ca9e8699c7631b63ea746b660171bff6a00ee4c1db61e1474f89ecae39"} err="failed to get container status \"1400c9ca9e8699c7631b63ea746b660171bff6a00ee4c1db61e1474f89ecae39\": rpc error: code = NotFound desc = could not find container \"1400c9ca9e8699c7631b63ea746b660171bff6a00ee4c1db61e1474f89ecae39\": container with ID starting with 1400c9ca9e8699c7631b63ea746b660171bff6a00ee4c1db61e1474f89ecae39 not found: ID does not exist" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.485567 4747 scope.go:117] "RemoveContainer" containerID="33ae11b91c010cdf022acbea3229aa07a69fd3291675a10ddd780efb5a99b78d" Dec 05 21:01:33 crc kubenswrapper[4747]: E1205 21:01:33.497648 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33ae11b91c010cdf022acbea3229aa07a69fd3291675a10ddd780efb5a99b78d\": container with ID starting with 33ae11b91c010cdf022acbea3229aa07a69fd3291675a10ddd780efb5a99b78d not found: ID does not exist" containerID="33ae11b91c010cdf022acbea3229aa07a69fd3291675a10ddd780efb5a99b78d" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.497687 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ae11b91c010cdf022acbea3229aa07a69fd3291675a10ddd780efb5a99b78d"} err="failed to get container status \"33ae11b91c010cdf022acbea3229aa07a69fd3291675a10ddd780efb5a99b78d\": rpc error: code = NotFound desc = could not find container \"33ae11b91c010cdf022acbea3229aa07a69fd3291675a10ddd780efb5a99b78d\": container with ID starting with 33ae11b91c010cdf022acbea3229aa07a69fd3291675a10ddd780efb5a99b78d not found: ID does not exist" Dec 05 21:01:33 crc kubenswrapper[4747]: 
I1205 21:01:33.570185 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.570400 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-logs\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.570519 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.570636 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.570763 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.571101 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.571662 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.571860 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x8mr\" (UniqueName: \"kubernetes.io/projected/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-kube-api-access-4x8mr\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.678292 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.681409 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.681493 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.681550 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.681634 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.681673 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.681801 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x8mr\" (UniqueName: \"kubernetes.io/projected/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-kube-api-access-4x8mr\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.681875 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.681899 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-logs\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.682333 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc 
kubenswrapper[4747]: I1205 21:01:33.682651 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.683179 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-logs\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.687195 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.687872 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.688766 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.692310 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.713966 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x8mr\" (UniqueName: \"kubernetes.io/projected/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-kube-api-access-4x8mr\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.722785 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.782636 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-httpd-run\") pod \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.782730 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.782757 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-logs\") pod \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.782786 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-combined-ca-bundle\") pod \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.782816 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-scripts\") pod \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.782904 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p67db\" (UniqueName: \"kubernetes.io/projected/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-kube-api-access-p67db\") pod \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.782927 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-config-data\") pod \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\" (UID: \"ec8ccaa3-51d7-49d1-871a-ca430d2b545f\") " Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.783567 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-logs" (OuterVolumeSpecName: "logs") pod "ec8ccaa3-51d7-49d1-871a-ca430d2b545f" (UID: "ec8ccaa3-51d7-49d1-871a-ca430d2b545f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.783927 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ec8ccaa3-51d7-49d1-871a-ca430d2b545f" (UID: "ec8ccaa3-51d7-49d1-871a-ca430d2b545f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.787410 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-kube-api-access-p67db" (OuterVolumeSpecName: "kube-api-access-p67db") pod "ec8ccaa3-51d7-49d1-871a-ca430d2b545f" (UID: "ec8ccaa3-51d7-49d1-871a-ca430d2b545f"). InnerVolumeSpecName "kube-api-access-p67db". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.787563 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-scripts" (OuterVolumeSpecName: "scripts") pod "ec8ccaa3-51d7-49d1-871a-ca430d2b545f" (UID: "ec8ccaa3-51d7-49d1-871a-ca430d2b545f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.790002 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "ec8ccaa3-51d7-49d1-871a-ca430d2b545f" (UID: "ec8ccaa3-51d7-49d1-871a-ca430d2b545f"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.815744 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec8ccaa3-51d7-49d1-871a-ca430d2b545f" (UID: "ec8ccaa3-51d7-49d1-871a-ca430d2b545f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.838222 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-config-data" (OuterVolumeSpecName: "config-data") pod "ec8ccaa3-51d7-49d1-871a-ca430d2b545f" (UID: "ec8ccaa3-51d7-49d1-871a-ca430d2b545f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.849636 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a9a5293-0f0b-43ea-823b-08e7e817434c" path="/var/lib/kubelet/pods/5a9a5293-0f0b-43ea-823b-08e7e817434c/volumes" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.850444 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88372402-4727-4c78-a3a9-f7d518868cc9" path="/var/lib/kubelet/pods/88372402-4727-4c78-a3a9-f7d518868cc9/volumes" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.886725 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p67db\" (UniqueName: \"kubernetes.io/projected/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-kube-api-access-p67db\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.886756 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.886766 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.886786 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.886796 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.886805 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.886814 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/ec8ccaa3-51d7-49d1-871a-ca430d2b545f-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.913501 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.960484 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:33 crc kubenswrapper[4747]: I1205 21:01:33.990641 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.222327 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ec8ccaa3-51d7-49d1-871a-ca430d2b545f","Type":"ContainerDied","Data":"f21ad9f555bd0a5d69acd022234aa47740b4dec34eba9fd392810e44fca1ac04"} Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.222668 4747 scope.go:117] "RemoveContainer" containerID="b2fb3ef458020ba4c30ea96d3669b81fe06a52f6c90e0b53621ddd409146b7e8" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.222937 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.249975 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.250471 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" event={"ID":"a4ed80ef-e29b-4aeb-b154-18989e8cbb86","Type":"ContainerStarted","Data":"55006e75d9192758affd5a99f2fe8b6d4ac1ab814eccf45fc1d2586a59637ccf"} Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.252777 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.265731 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.278597 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:01:34 crc kubenswrapper[4747]: E1205 21:01:34.278995 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8ccaa3-51d7-49d1-871a-ca430d2b545f" containerName="glance-log" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.279011 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8ccaa3-51d7-49d1-871a-ca430d2b545f" containerName="glance-log" Dec 05 21:01:34 crc kubenswrapper[4747]: E1205 21:01:34.279026 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8ccaa3-51d7-49d1-871a-ca430d2b545f" containerName="glance-httpd" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.279032 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8ccaa3-51d7-49d1-871a-ca430d2b545f" containerName="glance-httpd" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.279225 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8ccaa3-51d7-49d1-871a-ca430d2b545f" containerName="glance-log" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.279239 4747 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ec8ccaa3-51d7-49d1-871a-ca430d2b545f" containerName="glance-httpd" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.280337 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.281340 4747 scope.go:117] "RemoveContainer" containerID="7720c471d97208b26641982a8a4d3ba37f5ceee5e74759e9638e9ccf7d4a4420" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.282898 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.284970 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.296676 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.304864 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" podStartSLOduration=3.304849054 podStartE2EDuration="3.304849054s" podCreationTimestamp="2025-12-05 21:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:34.288452387 +0000 UTC m=+1164.755759875" watchObservedRunningTime="2025-12-05 21:01:34.304849054 +0000 UTC m=+1164.772156533" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.399755 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5btx\" (UniqueName: \"kubernetes.io/projected/33aa1714-b696-47d8-99b9-60429eea3dec-kube-api-access-f5btx\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.399830 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-config-data\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.399856 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.399907 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-scripts\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.399961 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33aa1714-b696-47d8-99b9-60429eea3dec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " 
pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.399984 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33aa1714-b696-47d8-99b9-60429eea3dec-logs\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.400052 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.400082 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.501240 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-scripts\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.501320 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33aa1714-b696-47d8-99b9-60429eea3dec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.501342 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33aa1714-b696-47d8-99b9-60429eea3dec-logs\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.501407 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.501429 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.501492 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5btx\" (UniqueName: \"kubernetes.io/projected/33aa1714-b696-47d8-99b9-60429eea3dec-kube-api-access-f5btx\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc 
kubenswrapper[4747]: I1205 21:01:34.501534 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-config-data\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.501555 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.501604 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.502019 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33aa1714-b696-47d8-99b9-60429eea3dec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.508069 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33aa1714-b696-47d8-99b9-60429eea3dec-logs\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.509071 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-scripts\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.511184 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.511726 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.513809 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-config-data\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.521952 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-f5btx\" (UniqueName: \"kubernetes.io/projected/33aa1714-b696-47d8-99b9-60429eea3dec-kube-api-access-f5btx\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.525538 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.553977 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.656386 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-855575d477-mmmm7"] Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.657939 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.659534 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.661169 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.667646 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.670453 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-855575d477-mmmm7"] Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.818399 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbvq5\" (UniqueName: \"kubernetes.io/projected/8ded45d9-c7e1-4429-982b-4f7c10e43473-kube-api-access-gbvq5\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.818452 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-httpd-config\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.818489 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-internal-tls-certs\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.819094 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-public-tls-certs\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.819186 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-combined-ca-bundle\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.819255 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-ovndb-tls-certs\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.819371 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-config\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.921627 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbvq5\" (UniqueName: \"kubernetes.io/projected/8ded45d9-c7e1-4429-982b-4f7c10e43473-kube-api-access-gbvq5\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.921676 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-httpd-config\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.921722 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-internal-tls-certs\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.921768 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-public-tls-certs\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.921804 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-combined-ca-bundle\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.921837 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-ovndb-tls-certs\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.921871 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-config\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.925848 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-ovndb-tls-certs\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.937761 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-public-tls-certs\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.937803 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-httpd-config\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.937964 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-internal-tls-certs\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.940618 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-combined-ca-bundle\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.943929 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-config\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:34 crc kubenswrapper[4747]: I1205 21:01:34.954343 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbvq5\" (UniqueName: \"kubernetes.io/projected/8ded45d9-c7e1-4429-982b-4f7c10e43473-kube-api-access-gbvq5\") pod \"neutron-855575d477-mmmm7\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:35 crc kubenswrapper[4747]: I1205 21:01:35.045672 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:35 crc kubenswrapper[4747]: I1205 21:01:35.875430 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec8ccaa3-51d7-49d1-871a-ca430d2b545f" path="/var/lib/kubelet/pods/ec8ccaa3-51d7-49d1-871a-ca430d2b545f/volumes" Dec 05 21:01:36 crc kubenswrapper[4747]: I1205 21:01:36.221759 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:01:36 crc kubenswrapper[4747]: I1205 21:01:36.221820 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:01:36 crc kubenswrapper[4747]: I1205 21:01:36.221861 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 21:01:36 crc kubenswrapper[4747]: I1205 21:01:36.222511 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7110e5080e3d96f1039156c86380e9d5d08ce0c06b6df7c2aa14c96f4b79a9a1"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 21:01:36 crc kubenswrapper[4747]: I1205 21:01:36.222565 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://7110e5080e3d96f1039156c86380e9d5d08ce0c06b6df7c2aa14c96f4b79a9a1" gracePeriod=600 Dec 05 21:01:36 crc kubenswrapper[4747]: I1205 21:01:36.295551 4747 generic.go:334] "Generic (PLEG): container finished" podID="07b59f33-a929-4857-9d4d-d15a58667ba2" containerID="c35bd7af758906f50a3ef95441de063d132d5f4fbd1e409678069723a5f290ad" exitCode=0 Dec 05 21:01:36 crc kubenswrapper[4747]: I1205 21:01:36.295723 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ncncg" event={"ID":"07b59f33-a929-4857-9d4d-d15a58667ba2","Type":"ContainerDied","Data":"c35bd7af758906f50a3ef95441de063d132d5f4fbd1e409678069723a5f290ad"} Dec 05 21:01:37 crc kubenswrapper[4747]: I1205 21:01:37.308983 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="7110e5080e3d96f1039156c86380e9d5d08ce0c06b6df7c2aa14c96f4b79a9a1" exitCode=0 Dec 05 21:01:37 crc kubenswrapper[4747]: I1205 21:01:37.309054 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"7110e5080e3d96f1039156c86380e9d5d08ce0c06b6df7c2aa14c96f4b79a9a1"} Dec 05 21:01:37 crc kubenswrapper[4747]: I1205 21:01:37.309336 4747 scope.go:117] "RemoveContainer" containerID="7f4bd613004b53750fa0f6da7c8719b897b6bb46a34063c5d37309224cc6de70" Dec 05 21:01:38 crc kubenswrapper[4747]: I1205 21:01:38.507193 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ncncg" Dec 05 21:01:38 crc kubenswrapper[4747]: I1205 21:01:38.687794 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07b59f33-a929-4857-9d4d-d15a58667ba2-combined-ca-bundle\") pod \"07b59f33-a929-4857-9d4d-d15a58667ba2\" (UID: \"07b59f33-a929-4857-9d4d-d15a58667ba2\") " Dec 05 21:01:38 crc kubenswrapper[4747]: I1205 21:01:38.687912 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07b59f33-a929-4857-9d4d-d15a58667ba2-db-sync-config-data\") pod \"07b59f33-a929-4857-9d4d-d15a58667ba2\" (UID: \"07b59f33-a929-4857-9d4d-d15a58667ba2\") " Dec 05 21:01:38 crc kubenswrapper[4747]: I1205 21:01:38.687965 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp6d2\" (UniqueName: \"kubernetes.io/projected/07b59f33-a929-4857-9d4d-d15a58667ba2-kube-api-access-cp6d2\") pod \"07b59f33-a929-4857-9d4d-d15a58667ba2\" (UID: \"07b59f33-a929-4857-9d4d-d15a58667ba2\") " Dec 05 21:01:38 crc kubenswrapper[4747]: I1205 21:01:38.694063 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07b59f33-a929-4857-9d4d-d15a58667ba2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "07b59f33-a929-4857-9d4d-d15a58667ba2" (UID: "07b59f33-a929-4857-9d4d-d15a58667ba2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:38 crc kubenswrapper[4747]: I1205 21:01:38.694280 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07b59f33-a929-4857-9d4d-d15a58667ba2-kube-api-access-cp6d2" (OuterVolumeSpecName: "kube-api-access-cp6d2") pod "07b59f33-a929-4857-9d4d-d15a58667ba2" (UID: "07b59f33-a929-4857-9d4d-d15a58667ba2"). InnerVolumeSpecName "kube-api-access-cp6d2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:38 crc kubenswrapper[4747]: I1205 21:01:38.722162 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07b59f33-a929-4857-9d4d-d15a58667ba2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07b59f33-a929-4857-9d4d-d15a58667ba2" (UID: "07b59f33-a929-4857-9d4d-d15a58667ba2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:38 crc kubenswrapper[4747]: I1205 21:01:38.790157 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07b59f33-a929-4857-9d4d-d15a58667ba2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:38 crc kubenswrapper[4747]: I1205 21:01:38.790195 4747 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07b59f33-a929-4857-9d4d-d15a58667ba2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:38 crc kubenswrapper[4747]: I1205 21:01:38.790209 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp6d2\" (UniqueName: \"kubernetes.io/projected/07b59f33-a929-4857-9d4d-d15a58667ba2-kube-api-access-cp6d2\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.371526 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"27a8861ae720657d5c1aabed46d192906d0631fb9e46de40cae1199d706d1642"} Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.373995 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ncncg" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.373999 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ncncg" event={"ID":"07b59f33-a929-4857-9d4d-d15a58667ba2","Type":"ContainerDied","Data":"800e1c99235c922a189cf12f43979100978d11516460d424c384fbb4bb3e52a8"} Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.374060 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="800e1c99235c922a189cf12f43979100978d11516460d424c384fbb4bb3e52a8" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.377114 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8c1d2de-aaf5-4519-a4b3-481cdb81d657","Type":"ContainerStarted","Data":"cf85c72eb41757b6fcde0d1faf5b7440eb7b79c0ecfdd77d182427a7f1cafd3c"} Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.587141 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.725811 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-855575d477-mmmm7"] Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.803354 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5c486969dd-92bj4"] Dec 05 21:01:39 crc kubenswrapper[4747]: E1205 21:01:39.803793 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b59f33-a929-4857-9d4d-d15a58667ba2" containerName="barbican-db-sync" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.803805 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b59f33-a929-4857-9d4d-d15a58667ba2" containerName="barbican-db-sync" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.803990 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="07b59f33-a929-4857-9d4d-d15a58667ba2" containerName="barbican-db-sync" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.809245 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.824242 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nklx8" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.824472 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.824781 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.833479 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-config-data-custom\") pod \"barbican-keystone-listener-5c486969dd-92bj4\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.833539 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bblx8\" (UniqueName: \"kubernetes.io/projected/1d350f3b-2497-4941-a006-84a503604020-kube-api-access-bblx8\") pod \"barbican-keystone-listener-5c486969dd-92bj4\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.847576 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d350f3b-2497-4941-a006-84a503604020-logs\") pod \"barbican-keystone-listener-5c486969dd-92bj4\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.847857 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-config-data\") pod \"barbican-keystone-listener-5c486969dd-92bj4\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.847993 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-combined-ca-bundle\") pod \"barbican-keystone-listener-5c486969dd-92bj4\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.886240 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5c486969dd-92bj4"] Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.886291 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-c8ddf7b47-hbwzt"] Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.900130 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-c8ddf7b47-hbwzt"] Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.900173 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-vtz47"] Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.900417 4747 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" podUID="a4ed80ef-e29b-4aeb-b154-18989e8cbb86" containerName="dnsmasq-dns" containerID="cri-o://55006e75d9192758affd5a99f2fe8b6d4ac1ab814eccf45fc1d2586a59637ccf" gracePeriod=10 Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.905634 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.910768 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.915952 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.950609 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-config-data-custom\") pod \"barbican-keystone-listener-5c486969dd-92bj4\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.950669 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bblx8\" (UniqueName: \"kubernetes.io/projected/1d350f3b-2497-4941-a006-84a503604020-kube-api-access-bblx8\") pod \"barbican-keystone-listener-5c486969dd-92bj4\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.950712 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d350f3b-2497-4941-a006-84a503604020-logs\") pod \"barbican-keystone-listener-5c486969dd-92bj4\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.950789 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-config-data\") pod \"barbican-keystone-listener-5c486969dd-92bj4\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.950845 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-combined-ca-bundle\") pod \"barbican-keystone-listener-5c486969dd-92bj4\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.953426 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d350f3b-2497-4941-a006-84a503604020-logs\") pod \"barbican-keystone-listener-5c486969dd-92bj4\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.954671 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65dd957765-c6rkl"] Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.957124 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-config-data-custom\") pod \"barbican-keystone-listener-5c486969dd-92bj4\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.957180 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-config-data\") pod \"barbican-keystone-listener-5c486969dd-92bj4\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.957740 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.970313 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-combined-ca-bundle\") pod \"barbican-keystone-listener-5c486969dd-92bj4\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.977151 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65dd957765-c6rkl"] Dec 05 21:01:39 crc kubenswrapper[4747]: I1205 21:01:39.983922 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bblx8\" (UniqueName: \"kubernetes.io/projected/1d350f3b-2497-4941-a006-84a503604020-kube-api-access-bblx8\") pod \"barbican-keystone-listener-5c486969dd-92bj4\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.053198 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-logs\") pod \"barbican-worker-c8ddf7b47-hbwzt\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.053334 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt2pr\" (UniqueName: \"kubernetes.io/projected/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-kube-api-access-nt2pr\") pod \"barbican-worker-c8ddf7b47-hbwzt\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.053354 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-dns-swift-storage-0\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.053375 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt49m\" (UniqueName: \"kubernetes.io/projected/617e0144-2c3d-4b9d-9fef-d976b607b1cc-kube-api-access-qt49m\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") 
" pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.053454 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-ovsdbserver-nb\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.053507 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-dns-svc\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.053527 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-combined-ca-bundle\") pod \"barbican-worker-c8ddf7b47-hbwzt\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.053632 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-config-data\") pod \"barbican-worker-c8ddf7b47-hbwzt\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.053649 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-ovsdbserver-sb\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.055394 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-config-data-custom\") pod \"barbican-worker-c8ddf7b47-hbwzt\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.055481 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-config\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.067014 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-64876474-nwc47"] Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.075141 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.077498 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.107827 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-64876474-nwc47"] Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.157398 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-ovsdbserver-nb\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.157706 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-dns-svc\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.157733 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-combined-ca-bundle\") pod \"barbican-worker-c8ddf7b47-hbwzt\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.157771 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnndr\" (UniqueName: \"kubernetes.io/projected/8539d901-6027-49b3-9018-f18f763260c7-kube-api-access-tnndr\") pod \"barbican-api-64876474-nwc47\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.157797 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-config-data\") pod \"barbican-worker-c8ddf7b47-hbwzt\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.157814 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-ovsdbserver-sb\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.157835 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-config-data\") pod \"barbican-api-64876474-nwc47\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.157870 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-combined-ca-bundle\") pod \"barbican-api-64876474-nwc47\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " 
pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.157898 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-config-data-custom\") pod \"barbican-worker-c8ddf7b47-hbwzt\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.157923 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-config\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.157939 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8539d901-6027-49b3-9018-f18f763260c7-logs\") pod \"barbican-api-64876474-nwc47\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.157958 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-logs\") pod \"barbican-worker-c8ddf7b47-hbwzt\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.157993 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt2pr\" (UniqueName: \"kubernetes.io/projected/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-kube-api-access-nt2pr\") pod \"barbican-worker-c8ddf7b47-hbwzt\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.158007 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-dns-swift-storage-0\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.158023 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-config-data-custom\") pod \"barbican-api-64876474-nwc47\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.158041 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt49m\" (UniqueName: \"kubernetes.io/projected/617e0144-2c3d-4b9d-9fef-d976b607b1cc-kube-api-access-qt49m\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.159052 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-ovsdbserver-nb\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " 
pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.161629 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-logs\") pod \"barbican-worker-c8ddf7b47-hbwzt\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.162345 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-config\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.163305 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-ovsdbserver-sb\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.166691 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-config-data\") pod \"barbican-worker-c8ddf7b47-hbwzt\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.167872 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-combined-ca-bundle\") pod \"barbican-worker-c8ddf7b47-hbwzt\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.167888 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-config-data-custom\") pod \"barbican-worker-c8ddf7b47-hbwzt\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.169229 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-dns-swift-storage-0\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.169508 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-dns-svc\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.189497 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.201088 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt2pr\" (UniqueName: \"kubernetes.io/projected/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-kube-api-access-nt2pr\") pod \"barbican-worker-c8ddf7b47-hbwzt\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.202882 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt49m\" (UniqueName: \"kubernetes.io/projected/617e0144-2c3d-4b9d-9fef-d976b607b1cc-kube-api-access-qt49m\") pod \"dnsmasq-dns-65dd957765-c6rkl\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.260136 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnndr\" (UniqueName: \"kubernetes.io/projected/8539d901-6027-49b3-9018-f18f763260c7-kube-api-access-tnndr\") pod \"barbican-api-64876474-nwc47\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.260226 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-config-data\") pod \"barbican-api-64876474-nwc47\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.260261 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-combined-ca-bundle\") pod \"barbican-api-64876474-nwc47\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.260444 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8539d901-6027-49b3-9018-f18f763260c7-logs\") pod \"barbican-api-64876474-nwc47\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.261171 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8539d901-6027-49b3-9018-f18f763260c7-logs\") pod \"barbican-api-64876474-nwc47\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.264448 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-config-data\") pod \"barbican-api-64876474-nwc47\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.264514 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-combined-ca-bundle\") pod \"barbican-api-64876474-nwc47\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:40 crc 
Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.266173 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-config-data-custom\") pod \"barbican-api-64876474-nwc47\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " pod="openstack/barbican-api-64876474-nwc47"
Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.267942 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-config-data-custom\") pod \"barbican-api-64876474-nwc47\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " pod="openstack/barbican-api-64876474-nwc47"
Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.268356 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-c8ddf7b47-hbwzt"
Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.277086 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnndr\" (UniqueName: \"kubernetes.io/projected/8539d901-6027-49b3-9018-f18f763260c7-kube-api-access-tnndr\") pod \"barbican-api-64876474-nwc47\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " pod="openstack/barbican-api-64876474-nwc47"
Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.367217 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65dd957765-c6rkl"
Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.389265 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a8378d3-98bb-4713-8bc6-9527680b5b5e","Type":"ContainerStarted","Data":"8f21c3af66e379814297876887f7fee09964fc9de3a405943c509dc99294b2b6"}
Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.389461 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="ceilometer-central-agent" containerID="cri-o://5a441ed0ee436b0d27b11463e276218199b16dbbd90388a25f0d85997a54495e" gracePeriod=30
Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.389903 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.389925 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="proxy-httpd" containerID="cri-o://8f21c3af66e379814297876887f7fee09964fc9de3a405943c509dc99294b2b6" gracePeriod=30
Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.389975 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="sg-core" containerID="cri-o://178e0c49d4b5f8087dd3574658f56ce8a0b94f216fb8803f8d1cee7c9d6b7c00" gracePeriod=30
Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.390010 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="ceilometer-notification-agent" containerID="cri-o://08feed3cbd23e0d682c531ac5b5ba22dfd47382048f20ec458d32143f1f8557c" gracePeriod=30
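The four "Killing container with a grace period" entries above (gracePeriod=30) start a graceful stop of the old ceilometer-0 pod: the runtime delivers SIGTERM and escalates to SIGKILL only if the container outlives the grace period. In the ContainerDied events that follow, proxy-httpd and the ceilometer agents exit 0 while sg-core exits 2, the difference between a handled and an unhandled shutdown. A minimal handler shape, assuming a Go process; this is illustrative, not code from any image in this log.

// Sketch of what the 30-second grace period means for the process inside
// the container: trap SIGTERM, flush, and exit cleanly before the runtime's
// SIGKILL deadline. A clean exit surfaces as ContainerDied exitCode=0.
package main

import (
	"fmt"
	"os"
	"os/signal"
	"syscall"
)

func main() {
	term := make(chan os.Signal, 1)
	signal.Notify(term, syscall.SIGTERM)
	<-term // the runtime's graceful-stop signal arrives here
	fmt.Println("SIGTERM received: flushing state before the grace period ends")
	os.Exit(0) // reported as ContainerDied with exitCode=0 in the PLEG events below
}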
event={"ID":"8ded45d9-c7e1-4429-982b-4f7c10e43473","Type":"ContainerStarted","Data":"b290fe5e98fd92428554809605fd233802c8cb8813e0e46ca4e34212ca87b615"} Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.398541 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"33aa1714-b696-47d8-99b9-60429eea3dec","Type":"ContainerStarted","Data":"9a5a1188d5e43c272ab75bc5698b72f8920a3117869d7317ef6a360745a4d24f"} Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.443443 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.695196492 podStartE2EDuration="40.443425914s" podCreationTimestamp="2025-12-05 21:01:00 +0000 UTC" firstStartedPulling="2025-12-05 21:01:01.358318773 +0000 UTC m=+1131.825626261" lastFinishedPulling="2025-12-05 21:01:39.106548195 +0000 UTC m=+1169.573855683" observedRunningTime="2025-12-05 21:01:40.420954271 +0000 UTC m=+1170.888261759" watchObservedRunningTime="2025-12-05 21:01:40.443425914 +0000 UTC m=+1170.910733402" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.461800 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.726188 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-c8ddf7b47-hbwzt"] Dec 05 21:01:40 crc kubenswrapper[4747]: W1205 21:01:40.731375 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6b073c9_c8ff_4de9_b2ee_605d37c94ffb.slice/crio-82a38a3e532801720fed70d38f5f7c1838153783bf5f0683b4cd4f511cebaabf WatchSource:0}: Error finding container 82a38a3e532801720fed70d38f5f7c1838153783bf5f0683b4cd4f511cebaabf: Status 404 returned error can't find the container with id 82a38a3e532801720fed70d38f5f7c1838153783bf5f0683b4cd4f511cebaabf Dec 05 21:01:40 crc kubenswrapper[4747]: I1205 21:01:40.803945 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5c486969dd-92bj4"] Dec 05 21:01:40 crc kubenswrapper[4747]: W1205 21:01:40.834311 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d350f3b_2497_4941_a006_84a503604020.slice/crio-562083f425c5155fb9a9e6c69bd422cff4b04b2901e08f1511753317903e550f WatchSource:0}: Error finding container 562083f425c5155fb9a9e6c69bd422cff4b04b2901e08f1511753317903e550f: Status 404 returned error can't find the container with id 562083f425c5155fb9a9e6c69bd422cff4b04b2901e08f1511753317903e550f Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.031283 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65dd957765-c6rkl"] Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.141565 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-64876474-nwc47"] Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.444479 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8c1d2de-aaf5-4519-a4b3-481cdb81d657","Type":"ContainerStarted","Data":"e36ce53718babfbaadff85bd5ef1243205c808b63219a4bc8372e29dd1ab380b"} Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.491672 4747 generic.go:334] "Generic (PLEG): container finished" podID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" 
containerID="8f21c3af66e379814297876887f7fee09964fc9de3a405943c509dc99294b2b6" exitCode=0 Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.492019 4747 generic.go:334] "Generic (PLEG): container finished" podID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerID="178e0c49d4b5f8087dd3574658f56ce8a0b94f216fb8803f8d1cee7c9d6b7c00" exitCode=2 Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.492030 4747 generic.go:334] "Generic (PLEG): container finished" podID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerID="08feed3cbd23e0d682c531ac5b5ba22dfd47382048f20ec458d32143f1f8557c" exitCode=0 Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.491722 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a8378d3-98bb-4713-8bc6-9527680b5b5e","Type":"ContainerDied","Data":"8f21c3af66e379814297876887f7fee09964fc9de3a405943c509dc99294b2b6"} Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.492136 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a8378d3-98bb-4713-8bc6-9527680b5b5e","Type":"ContainerDied","Data":"178e0c49d4b5f8087dd3574658f56ce8a0b94f216fb8803f8d1cee7c9d6b7c00"} Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.492151 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a8378d3-98bb-4713-8bc6-9527680b5b5e","Type":"ContainerDied","Data":"08feed3cbd23e0d682c531ac5b5ba22dfd47382048f20ec458d32143f1f8557c"} Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.535815 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c8ddf7b47-hbwzt" event={"ID":"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb","Type":"ContainerStarted","Data":"82a38a3e532801720fed70d38f5f7c1838153783bf5f0683b4cd4f511cebaabf"} Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.548883 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65dd957765-c6rkl" event={"ID":"617e0144-2c3d-4b9d-9fef-d976b607b1cc","Type":"ContainerStarted","Data":"19d820c6974c081c4241ba50aac020db3334cfa3d1f7378efb90dfc1bde3fe82"} Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.561133 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64876474-nwc47" event={"ID":"8539d901-6027-49b3-9018-f18f763260c7","Type":"ContainerStarted","Data":"994581ed5cbd13efaeffcc2fe4f3895016f0f787129e307baeaabde214fe2fb0"} Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.574032 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"33aa1714-b696-47d8-99b9-60429eea3dec","Type":"ContainerStarted","Data":"dbd2de635701fa4c450abf7351bc8495845da2c8c64d8a5519675791e5da279a"} Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.579457 4747 generic.go:334] "Generic (PLEG): container finished" podID="a4ed80ef-e29b-4aeb-b154-18989e8cbb86" containerID="55006e75d9192758affd5a99f2fe8b6d4ac1ab814eccf45fc1d2586a59637ccf" exitCode=0 Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.579535 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" event={"ID":"a4ed80ef-e29b-4aeb-b154-18989e8cbb86","Type":"ContainerDied","Data":"55006e75d9192758affd5a99f2fe8b6d4ac1ab814eccf45fc1d2586a59637ccf"} Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.581144 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" 
event={"ID":"1d350f3b-2497-4941-a006-84a503604020","Type":"ContainerStarted","Data":"562083f425c5155fb9a9e6c69bd422cff4b04b2901e08f1511753317903e550f"} Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.584544 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rmcvl" event={"ID":"24ec3037-277a-454c-b807-ceb5e626e724","Type":"ContainerStarted","Data":"579b667a2f6822473f1c5251f59efb929a78e15bbb88f0ffcdc18baa8dad0012"} Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.610283 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-rmcvl" podStartSLOduration=3.7164886 podStartE2EDuration="41.610266207s" podCreationTimestamp="2025-12-05 21:01:00 +0000 UTC" firstStartedPulling="2025-12-05 21:01:01.209190168 +0000 UTC m=+1131.676497656" lastFinishedPulling="2025-12-05 21:01:39.102967755 +0000 UTC m=+1169.570275263" observedRunningTime="2025-12-05 21:01:41.607816205 +0000 UTC m=+1172.075123693" watchObservedRunningTime="2025-12-05 21:01:41.610266207 +0000 UTC m=+1172.077573695" Dec 05 21:01:41 crc kubenswrapper[4747]: I1205 21:01:41.834057 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" podUID="a4ed80ef-e29b-4aeb-b154-18989e8cbb86" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: connect: connection refused" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.471993 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.593627 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.659543 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a8378d3-98bb-4713-8bc6-9527680b5b5e-log-httpd\") pod \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.659628 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a8378d3-98bb-4713-8bc6-9527680b5b5e-run-httpd\") pod \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.659662 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-dns-svc\") pod \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.659687 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-ovsdbserver-nb\") pod \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.659773 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-combined-ca-bundle\") pod \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 
21:01:42.659799 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-sg-core-conf-yaml\") pod \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.659850 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8bvp\" (UniqueName: \"kubernetes.io/projected/6a8378d3-98bb-4713-8bc6-9527680b5b5e-kube-api-access-r8bvp\") pod \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.659881 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-ovsdbserver-sb\") pod \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.659959 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmp4z\" (UniqueName: \"kubernetes.io/projected/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-kube-api-access-vmp4z\") pod \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.659987 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-config\") pod \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.660014 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-config-data\") pod \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.660033 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-scripts\") pod \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\" (UID: \"6a8378d3-98bb-4713-8bc6-9527680b5b5e\") " Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.660055 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-dns-swift-storage-0\") pod \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\" (UID: \"a4ed80ef-e29b-4aeb-b154-18989e8cbb86\") " Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.662899 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" event={"ID":"a4ed80ef-e29b-4aeb-b154-18989e8cbb86","Type":"ContainerDied","Data":"e6b75b9c89211447332faf82ccd90a85e43584bd394d719256b90b9bc226c23a"} Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.663011 4747 scope.go:117] "RemoveContainer" containerID="55006e75d9192758affd5a99f2fe8b6d4ac1ab814eccf45fc1d2586a59637ccf" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.663210 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-849ff95dc5-vtz47" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.666134 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a8378d3-98bb-4713-8bc6-9527680b5b5e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6a8378d3-98bb-4713-8bc6-9527680b5b5e" (UID: "6a8378d3-98bb-4713-8bc6-9527680b5b5e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.666529 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a8378d3-98bb-4713-8bc6-9527680b5b5e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6a8378d3-98bb-4713-8bc6-9527680b5b5e" (UID: "6a8378d3-98bb-4713-8bc6-9527680b5b5e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.687768 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-scripts" (OuterVolumeSpecName: "scripts") pod "6a8378d3-98bb-4713-8bc6-9527680b5b5e" (UID: "6a8378d3-98bb-4713-8bc6-9527680b5b5e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.689837 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a8378d3-98bb-4713-8bc6-9527680b5b5e-kube-api-access-r8bvp" (OuterVolumeSpecName: "kube-api-access-r8bvp") pod "6a8378d3-98bb-4713-8bc6-9527680b5b5e" (UID: "6a8378d3-98bb-4713-8bc6-9527680b5b5e"). InnerVolumeSpecName "kube-api-access-r8bvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.690916 4747 generic.go:334] "Generic (PLEG): container finished" podID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerID="5a441ed0ee436b0d27b11463e276218199b16dbbd90388a25f0d85997a54495e" exitCode=0 Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.690975 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a8378d3-98bb-4713-8bc6-9527680b5b5e","Type":"ContainerDied","Data":"5a441ed0ee436b0d27b11463e276218199b16dbbd90388a25f0d85997a54495e"} Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.691050 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a8378d3-98bb-4713-8bc6-9527680b5b5e","Type":"ContainerDied","Data":"8f37c8ab1af233a362ed1afc3b7d649d605d18c90b596edd18566586818a9e23"} Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.691129 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.695280 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-855575d477-mmmm7" event={"ID":"8ded45d9-c7e1-4429-982b-4f7c10e43473","Type":"ContainerStarted","Data":"5ff576908ae6828159afb6725bcd8f5262dbf45a7c1d981f6bf781737d654133"} Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.695318 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-855575d477-mmmm7" event={"ID":"8ded45d9-c7e1-4429-982b-4f7c10e43473","Type":"ContainerStarted","Data":"0828c77a89d68fa7ed5a9d98a5ec301c313e92a36be991d033d8a0ce476a2939"} Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.695846 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.697202 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-kube-api-access-vmp4z" (OuterVolumeSpecName: "kube-api-access-vmp4z") pod "a4ed80ef-e29b-4aeb-b154-18989e8cbb86" (UID: "a4ed80ef-e29b-4aeb-b154-18989e8cbb86"). InnerVolumeSpecName "kube-api-access-vmp4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.723926 4747 generic.go:334] "Generic (PLEG): container finished" podID="617e0144-2c3d-4b9d-9fef-d976b607b1cc" containerID="0ea48b1283cad72e00dc77a8390ee5a66b0c45bfc28e53a617d6cc8c8642958a" exitCode=0 Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.724041 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65dd957765-c6rkl" event={"ID":"617e0144-2c3d-4b9d-9fef-d976b607b1cc","Type":"ContainerDied","Data":"0ea48b1283cad72e00dc77a8390ee5a66b0c45bfc28e53a617d6cc8c8642958a"} Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.727992 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64876474-nwc47" event={"ID":"8539d901-6027-49b3-9018-f18f763260c7","Type":"ContainerStarted","Data":"79c9846b3460b979f45bb6dd58d4e45c8f431c03eb72863e97310a16cb1f51bc"} Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.728042 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64876474-nwc47" event={"ID":"8539d901-6027-49b3-9018-f18f763260c7","Type":"ContainerStarted","Data":"48d16915b14bf258e2e2ea0934a538dc18c8aaa5a1ad5fd7f78335e7afa917f1"} Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.728146 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.728176 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.728379 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6a8378d3-98bb-4713-8bc6-9527680b5b5e" (UID: "6a8378d3-98bb-4713-8bc6-9527680b5b5e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.732741 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-855575d477-mmmm7" podStartSLOduration=8.732725217 podStartE2EDuration="8.732725217s" podCreationTimestamp="2025-12-05 21:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:42.72885719 +0000 UTC m=+1173.196164688" watchObservedRunningTime="2025-12-05 21:01:42.732725217 +0000 UTC m=+1173.200032705" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.753159 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a4ed80ef-e29b-4aeb-b154-18989e8cbb86" (UID: "a4ed80ef-e29b-4aeb-b154-18989e8cbb86"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.763074 4747 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.763100 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8bvp\" (UniqueName: \"kubernetes.io/projected/6a8378d3-98bb-4713-8bc6-9527680b5b5e-kube-api-access-r8bvp\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.763109 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmp4z\" (UniqueName: \"kubernetes.io/projected/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-kube-api-access-vmp4z\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.763118 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.763126 4747 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a8378d3-98bb-4713-8bc6-9527680b5b5e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.763134 4747 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a8378d3-98bb-4713-8bc6-9527680b5b5e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.763142 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.795571 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-64876474-nwc47" podStartSLOduration=2.795552953 podStartE2EDuration="2.795552953s" podCreationTimestamp="2025-12-05 21:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:42.784965397 +0000 UTC m=+1173.252272905" watchObservedRunningTime="2025-12-05 21:01:42.795552953 +0000 UTC m=+1173.262860441" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 
21:01:42.805227 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a4ed80ef-e29b-4aeb-b154-18989e8cbb86" (UID: "a4ed80ef-e29b-4aeb-b154-18989e8cbb86"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.855412 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a4ed80ef-e29b-4aeb-b154-18989e8cbb86" (UID: "a4ed80ef-e29b-4aeb-b154-18989e8cbb86"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.855470 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-config" (OuterVolumeSpecName: "config") pod "a4ed80ef-e29b-4aeb-b154-18989e8cbb86" (UID: "a4ed80ef-e29b-4aeb-b154-18989e8cbb86"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.866989 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.867034 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.867044 4747 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.877615 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a4ed80ef-e29b-4aeb-b154-18989e8cbb86" (UID: "a4ed80ef-e29b-4aeb-b154-18989e8cbb86"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.900392 4747 scope.go:117] "RemoveContainer" containerID="7ec2654bb9711a7c3fcacdb7600067676e28c64c0c70175cddc7b0d8f51aad16" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.927131 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a8378d3-98bb-4713-8bc6-9527680b5b5e" (UID: "6a8378d3-98bb-4713-8bc6-9527680b5b5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.938812 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-config-data" (OuterVolumeSpecName: "config-data") pod "6a8378d3-98bb-4713-8bc6-9527680b5b5e" (UID: "6a8378d3-98bb-4713-8bc6-9527680b5b5e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.960840 4747 scope.go:117] "RemoveContainer" containerID="8f21c3af66e379814297876887f7fee09964fc9de3a405943c509dc99294b2b6" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.969066 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.969104 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4ed80ef-e29b-4aeb-b154-18989e8cbb86-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:42 crc kubenswrapper[4747]: I1205 21:01:42.969117 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a8378d3-98bb-4713-8bc6-9527680b5b5e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.060493 4747 scope.go:117] "RemoveContainer" containerID="178e0c49d4b5f8087dd3574658f56ce8a0b94f216fb8803f8d1cee7c9d6b7c00" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.072067 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-vtz47"] Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.095007 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-vtz47"] Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.113836 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.139758 4747 scope.go:117] "RemoveContainer" containerID="08feed3cbd23e0d682c531ac5b5ba22dfd47382048f20ec458d32143f1f8557c" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.149731 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.166056 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:01:43 crc kubenswrapper[4747]: E1205 21:01:43.166629 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="ceilometer-central-agent" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.166647 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="ceilometer-central-agent" Dec 05 21:01:43 crc kubenswrapper[4747]: E1205 21:01:43.166656 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="sg-core" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.166671 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="sg-core" Dec 05 21:01:43 crc kubenswrapper[4747]: E1205 21:01:43.166691 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ed80ef-e29b-4aeb-b154-18989e8cbb86" containerName="init" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.166698 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ed80ef-e29b-4aeb-b154-18989e8cbb86" containerName="init" Dec 05 21:01:43 crc kubenswrapper[4747]: E1205 21:01:43.166714 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ed80ef-e29b-4aeb-b154-18989e8cbb86" containerName="dnsmasq-dns" Dec 05 21:01:43 crc 
kubenswrapper[4747]: I1205 21:01:43.166720 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ed80ef-e29b-4aeb-b154-18989e8cbb86" containerName="dnsmasq-dns" Dec 05 21:01:43 crc kubenswrapper[4747]: E1205 21:01:43.166738 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="proxy-httpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.166746 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="proxy-httpd" Dec 05 21:01:43 crc kubenswrapper[4747]: E1205 21:01:43.166755 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="ceilometer-notification-agent" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.166761 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="ceilometer-notification-agent" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.166987 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="ceilometer-central-agent" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.167005 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="proxy-httpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.167016 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="sg-core" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.167066 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" containerName="ceilometer-notification-agent" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.167080 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ed80ef-e29b-4aeb-b154-18989e8cbb86" containerName="dnsmasq-dns" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.169089 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.173720 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.173961 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.198074 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.321623 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-scripts\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.321663 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-config-data\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.321732 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqrcj\" (UniqueName: \"kubernetes.io/projected/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-kube-api-access-rqrcj\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.321785 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-log-httpd\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.321807 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.321825 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-run-httpd\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.321858 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.344221 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-76c7dd97d4-lfdpd"] Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.355157 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.359866 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.364444 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.377045 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76c7dd97d4-lfdpd"] Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.425679 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-log-httpd\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.425926 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.425946 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-run-httpd\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.425984 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.426017 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-scripts\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.426031 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-config-data\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.426090 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqrcj\" (UniqueName: \"kubernetes.io/projected/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-kube-api-access-rqrcj\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.426903 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-log-httpd\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.428994 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-run-httpd\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.432908 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.437618 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-config-data\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.439333 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.440765 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-scripts\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.452961 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqrcj\" (UniqueName: \"kubernetes.io/projected/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-kube-api-access-rqrcj\") pod \"ceilometer-0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") " pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.494017 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.527725 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-internal-tls-certs\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.527775 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-public-tls-certs\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.527826 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8g9p\" (UniqueName: \"kubernetes.io/projected/7d3c569a-33c9-46a0-8461-e69315fbd20b-kube-api-access-k8g9p\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.527928 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-combined-ca-bundle\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.527955 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-config-data-custom\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.528007 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d3c569a-33c9-46a0-8461-e69315fbd20b-logs\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.528033 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-config-data\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.629192 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d3c569a-33c9-46a0-8461-e69315fbd20b-logs\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.629241 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-config-data\") pod 
\"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.629275 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-internal-tls-certs\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.629293 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-public-tls-certs\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.629333 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8g9p\" (UniqueName: \"kubernetes.io/projected/7d3c569a-33c9-46a0-8461-e69315fbd20b-kube-api-access-k8g9p\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.629389 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-combined-ca-bundle\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.629415 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-config-data-custom\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.630698 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d3c569a-33c9-46a0-8461-e69315fbd20b-logs\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.634276 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-config-data\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.634793 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-public-tls-certs\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.635100 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-internal-tls-certs\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: 
\"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.634244 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-config-data-custom\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.651190 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8g9p\" (UniqueName: \"kubernetes.io/projected/7d3c569a-33c9-46a0-8461-e69315fbd20b-kube-api-access-k8g9p\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.655199 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-combined-ca-bundle\") pod \"barbican-api-76c7dd97d4-lfdpd\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.686076 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.746856 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65dd957765-c6rkl" event={"ID":"617e0144-2c3d-4b9d-9fef-d976b607b1cc","Type":"ContainerStarted","Data":"df61ca7201a9de81e0bc350788b8b187ab8d6915dc828fd7135839c8a8fb466e"} Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.748047 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.750170 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"33aa1714-b696-47d8-99b9-60429eea3dec","Type":"ContainerStarted","Data":"6614e83c8dde1b5b152c507e4d21e58c0577b0a8e617f8e329992d641316a58b"} Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.754760 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8c1d2de-aaf5-4519-a4b3-481cdb81d657","Type":"ContainerStarted","Data":"7fb0031e94f15d7b0ab4d4136633b1face192cdaefa05d39733629c84dcc5f4c"} Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.765970 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65dd957765-c6rkl" podStartSLOduration=4.765951028 podStartE2EDuration="4.765951028s" podCreationTimestamp="2025-12-05 21:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:43.764964063 +0000 UTC m=+1174.232271551" watchObservedRunningTime="2025-12-05 21:01:43.765951028 +0000 UTC m=+1174.233258516" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.784701 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.784669958 podStartE2EDuration="9.784669958s" podCreationTimestamp="2025-12-05 21:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-05 21:01:43.781490758 +0000 UTC m=+1174.248798246" watchObservedRunningTime="2025-12-05 21:01:43.784669958 +0000 UTC m=+1174.251977446" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.806989 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.806971157 podStartE2EDuration="10.806971157s" podCreationTimestamp="2025-12-05 21:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:43.802973417 +0000 UTC m=+1174.270280905" watchObservedRunningTime="2025-12-05 21:01:43.806971157 +0000 UTC m=+1174.274278645" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.852061 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a8378d3-98bb-4713-8bc6-9527680b5b5e" path="/var/lib/kubelet/pods/6a8378d3-98bb-4713-8bc6-9527680b5b5e/volumes" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.852946 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ed80ef-e29b-4aeb-b154-18989e8cbb86" path="/var/lib/kubelet/pods/a4ed80ef-e29b-4aeb-b154-18989e8cbb86/volumes" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.960856 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:43 crc kubenswrapper[4747]: I1205 21:01:43.960900 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.007305 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.014561 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.477990 4747 scope.go:117] "RemoveContainer" containerID="5a441ed0ee436b0d27b11463e276218199b16dbbd90388a25f0d85997a54495e" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.554637 4747 scope.go:117] "RemoveContainer" containerID="8f21c3af66e379814297876887f7fee09964fc9de3a405943c509dc99294b2b6" Dec 05 21:01:44 crc kubenswrapper[4747]: E1205 21:01:44.559998 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f21c3af66e379814297876887f7fee09964fc9de3a405943c509dc99294b2b6\": container with ID starting with 8f21c3af66e379814297876887f7fee09964fc9de3a405943c509dc99294b2b6 not found: ID does not exist" containerID="8f21c3af66e379814297876887f7fee09964fc9de3a405943c509dc99294b2b6" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.560038 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f21c3af66e379814297876887f7fee09964fc9de3a405943c509dc99294b2b6"} err="failed to get container status \"8f21c3af66e379814297876887f7fee09964fc9de3a405943c509dc99294b2b6\": rpc error: code = NotFound desc = could not find container \"8f21c3af66e379814297876887f7fee09964fc9de3a405943c509dc99294b2b6\": container with ID starting with 8f21c3af66e379814297876887f7fee09964fc9de3a405943c509dc99294b2b6 not found: ID does not exist" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.560065 4747 scope.go:117] "RemoveContainer" 
containerID="178e0c49d4b5f8087dd3574658f56ce8a0b94f216fb8803f8d1cee7c9d6b7c00" Dec 05 21:01:44 crc kubenswrapper[4747]: E1205 21:01:44.560596 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"178e0c49d4b5f8087dd3574658f56ce8a0b94f216fb8803f8d1cee7c9d6b7c00\": container with ID starting with 178e0c49d4b5f8087dd3574658f56ce8a0b94f216fb8803f8d1cee7c9d6b7c00 not found: ID does not exist" containerID="178e0c49d4b5f8087dd3574658f56ce8a0b94f216fb8803f8d1cee7c9d6b7c00" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.560638 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178e0c49d4b5f8087dd3574658f56ce8a0b94f216fb8803f8d1cee7c9d6b7c00"} err="failed to get container status \"178e0c49d4b5f8087dd3574658f56ce8a0b94f216fb8803f8d1cee7c9d6b7c00\": rpc error: code = NotFound desc = could not find container \"178e0c49d4b5f8087dd3574658f56ce8a0b94f216fb8803f8d1cee7c9d6b7c00\": container with ID starting with 178e0c49d4b5f8087dd3574658f56ce8a0b94f216fb8803f8d1cee7c9d6b7c00 not found: ID does not exist" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.560662 4747 scope.go:117] "RemoveContainer" containerID="08feed3cbd23e0d682c531ac5b5ba22dfd47382048f20ec458d32143f1f8557c" Dec 05 21:01:44 crc kubenswrapper[4747]: E1205 21:01:44.560944 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08feed3cbd23e0d682c531ac5b5ba22dfd47382048f20ec458d32143f1f8557c\": container with ID starting with 08feed3cbd23e0d682c531ac5b5ba22dfd47382048f20ec458d32143f1f8557c not found: ID does not exist" containerID="08feed3cbd23e0d682c531ac5b5ba22dfd47382048f20ec458d32143f1f8557c" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.560975 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08feed3cbd23e0d682c531ac5b5ba22dfd47382048f20ec458d32143f1f8557c"} err="failed to get container status \"08feed3cbd23e0d682c531ac5b5ba22dfd47382048f20ec458d32143f1f8557c\": rpc error: code = NotFound desc = could not find container \"08feed3cbd23e0d682c531ac5b5ba22dfd47382048f20ec458d32143f1f8557c\": container with ID starting with 08feed3cbd23e0d682c531ac5b5ba22dfd47382048f20ec458d32143f1f8557c not found: ID does not exist" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.560993 4747 scope.go:117] "RemoveContainer" containerID="5a441ed0ee436b0d27b11463e276218199b16dbbd90388a25f0d85997a54495e" Dec 05 21:01:44 crc kubenswrapper[4747]: E1205 21:01:44.561981 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a441ed0ee436b0d27b11463e276218199b16dbbd90388a25f0d85997a54495e\": container with ID starting with 5a441ed0ee436b0d27b11463e276218199b16dbbd90388a25f0d85997a54495e not found: ID does not exist" containerID="5a441ed0ee436b0d27b11463e276218199b16dbbd90388a25f0d85997a54495e" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.562004 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a441ed0ee436b0d27b11463e276218199b16dbbd90388a25f0d85997a54495e"} err="failed to get container status \"5a441ed0ee436b0d27b11463e276218199b16dbbd90388a25f0d85997a54495e\": rpc error: code = NotFound desc = could not find container \"5a441ed0ee436b0d27b11463e276218199b16dbbd90388a25f0d85997a54495e\": container with ID starting with 
5a441ed0ee436b0d27b11463e276218199b16dbbd90388a25f0d85997a54495e not found: ID does not exist" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.668340 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.668609 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.717280 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.753331 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.775496 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.775537 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.782662 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 21:01:44 crc kubenswrapper[4747]: I1205 21:01:44.782726 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 21:01:45 crc kubenswrapper[4747]: I1205 21:01:45.093408 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:01:45 crc kubenswrapper[4747]: W1205 21:01:45.097414 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c2c67f2_cc1d_453c_b475_bd421b33d0e0.slice/crio-998762c3458651b9a90433374f7548f54b64e045c2ff940051461b7075bf057b WatchSource:0}: Error finding container 998762c3458651b9a90433374f7548f54b64e045c2ff940051461b7075bf057b: Status 404 returned error can't find the container with id 998762c3458651b9a90433374f7548f54b64e045c2ff940051461b7075bf057b Dec 05 21:01:45 crc kubenswrapper[4747]: I1205 21:01:45.106028 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76c7dd97d4-lfdpd"] Dec 05 21:01:45 crc kubenswrapper[4747]: I1205 21:01:45.802855 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" event={"ID":"1d350f3b-2497-4941-a006-84a503604020","Type":"ContainerStarted","Data":"0aaabbe10c0f61cef65c493fabec256d8c68d50a96544ca2c165bc1b095b1cad"} Dec 05 21:01:45 crc kubenswrapper[4747]: I1205 21:01:45.803167 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" event={"ID":"1d350f3b-2497-4941-a006-84a503604020","Type":"ContainerStarted","Data":"0bf651a4e254735c050f3d0a785a6647dbfeab955170d1824e08ea2a7638c8eb"} Dec 05 21:01:45 crc kubenswrapper[4747]: I1205 21:01:45.806684 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c8ddf7b47-hbwzt" event={"ID":"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb","Type":"ContainerStarted","Data":"0ee3d3f658fc75a010904f01ce363f8a9cb5878b73525714b1b5443773182ede"} Dec 05 21:01:45 crc kubenswrapper[4747]: I1205 21:01:45.806718 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c8ddf7b47-hbwzt" 
event={"ID":"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb","Type":"ContainerStarted","Data":"fc1b59248e10c987bf66294f5fd98a70e8177f8a073667376335d7e63c392cde"} Dec 05 21:01:45 crc kubenswrapper[4747]: I1205 21:01:45.809133 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c2c67f2-cc1d-453c-b475-bd421b33d0e0","Type":"ContainerStarted","Data":"998762c3458651b9a90433374f7548f54b64e045c2ff940051461b7075bf057b"} Dec 05 21:01:45 crc kubenswrapper[4747]: I1205 21:01:45.814073 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c7dd97d4-lfdpd" event={"ID":"7d3c569a-33c9-46a0-8461-e69315fbd20b","Type":"ContainerStarted","Data":"da5fc8501fba7bc707db385bce6aba964f09ddb107d30488526ab95f96cb2e21"} Dec 05 21:01:45 crc kubenswrapper[4747]: I1205 21:01:45.814113 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c7dd97d4-lfdpd" event={"ID":"7d3c569a-33c9-46a0-8461-e69315fbd20b","Type":"ContainerStarted","Data":"e81f5c9fbda1eb9ab9778805ee52bf5015c30d3532093b0e7bc22a3a44f23a55"} Dec 05 21:01:45 crc kubenswrapper[4747]: I1205 21:01:45.814124 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c7dd97d4-lfdpd" event={"ID":"7d3c569a-33c9-46a0-8461-e69315fbd20b","Type":"ContainerStarted","Data":"421b7283cd0cebcf9d4eddc8c84a78ffccc619d963196165ccc2c347799a9b4c"} Dec 05 21:01:45 crc kubenswrapper[4747]: I1205 21:01:45.814971 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:45 crc kubenswrapper[4747]: I1205 21:01:45.815090 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:45 crc kubenswrapper[4747]: I1205 21:01:45.822713 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" podStartSLOduration=3.084766572 podStartE2EDuration="6.822701446s" podCreationTimestamp="2025-12-05 21:01:39 +0000 UTC" firstStartedPulling="2025-12-05 21:01:40.837020289 +0000 UTC m=+1171.304327777" lastFinishedPulling="2025-12-05 21:01:44.574955153 +0000 UTC m=+1175.042262651" observedRunningTime="2025-12-05 21:01:45.821928847 +0000 UTC m=+1176.289236335" watchObservedRunningTime="2025-12-05 21:01:45.822701446 +0000 UTC m=+1176.290008934" Dec 05 21:01:45 crc kubenswrapper[4747]: I1205 21:01:45.864180 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-76c7dd97d4-lfdpd" podStartSLOduration=2.864160916 podStartE2EDuration="2.864160916s" podCreationTimestamp="2025-12-05 21:01:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:45.853855348 +0000 UTC m=+1176.321162836" watchObservedRunningTime="2025-12-05 21:01:45.864160916 +0000 UTC m=+1176.331468404" Dec 05 21:01:46 crc kubenswrapper[4747]: I1205 21:01:46.829691 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c2c67f2-cc1d-453c-b475-bd421b33d0e0","Type":"ContainerStarted","Data":"4dfc90b1a548dce591161e0499ca0a5c0bf6c2c54a96bca69d5b7f7c8f7124d9"} Dec 05 21:01:46 crc kubenswrapper[4747]: I1205 21:01:46.830375 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8c2c67f2-cc1d-453c-b475-bd421b33d0e0","Type":"ContainerStarted","Data":"159e15cf0f4bae5b6e4953923e37d9c95cd1de3fe1ff3af2797a98447a2358c8"} Dec 05 21:01:47 crc kubenswrapper[4747]: I1205 21:01:47.840345 4747 generic.go:334] "Generic (PLEG): container finished" podID="24ec3037-277a-454c-b807-ceb5e626e724" containerID="579b667a2f6822473f1c5251f59efb929a78e15bbb88f0ffcdc18baa8dad0012" exitCode=0 Dec 05 21:01:47 crc kubenswrapper[4747]: I1205 21:01:47.873923 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rmcvl" event={"ID":"24ec3037-277a-454c-b807-ceb5e626e724","Type":"ContainerDied","Data":"579b667a2f6822473f1c5251f59efb929a78e15bbb88f0ffcdc18baa8dad0012"} Dec 05 21:01:47 crc kubenswrapper[4747]: I1205 21:01:47.873969 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c2c67f2-cc1d-453c-b475-bd421b33d0e0","Type":"ContainerStarted","Data":"8da4a5e81776bc3fdbb9ca26640f1c27486505b16bd5fabadf6d2b96c6aadb0d"} Dec 05 21:01:47 crc kubenswrapper[4747]: I1205 21:01:47.918647 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-c8ddf7b47-hbwzt" podStartSLOduration=5.073673629 podStartE2EDuration="8.918629008s" podCreationTimestamp="2025-12-05 21:01:39 +0000 UTC" firstStartedPulling="2025-12-05 21:01:40.735366029 +0000 UTC m=+1171.202673517" lastFinishedPulling="2025-12-05 21:01:44.580321368 +0000 UTC m=+1175.047628896" observedRunningTime="2025-12-05 21:01:45.873933852 +0000 UTC m=+1176.341241340" watchObservedRunningTime="2025-12-05 21:01:47.918629008 +0000 UTC m=+1178.385936506" Dec 05 21:01:48 crc kubenswrapper[4747]: I1205 21:01:48.857294 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c2c67f2-cc1d-453c-b475-bd421b33d0e0","Type":"ContainerStarted","Data":"3da8fe2161e2c649bc0e9d49150d3f9f8eee33a84e00f171b9e318d6bb0a574c"} Dec 05 21:01:48 crc kubenswrapper[4747]: I1205 21:01:48.880860 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.448711304 podStartE2EDuration="5.880843477s" podCreationTimestamp="2025-12-05 21:01:43 +0000 UTC" firstStartedPulling="2025-12-05 21:01:45.099942974 +0000 UTC m=+1175.567250462" lastFinishedPulling="2025-12-05 21:01:48.532075147 +0000 UTC m=+1178.999382635" observedRunningTime="2025-12-05 21:01:48.874673782 +0000 UTC m=+1179.341981280" watchObservedRunningTime="2025-12-05 21:01:48.880843477 +0000 UTC m=+1179.348150965" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.057212 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.298528 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.385865 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-db-sync-config-data\") pod \"24ec3037-277a-454c-b807-ceb5e626e724\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.385961 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-scripts\") pod \"24ec3037-277a-454c-b807-ceb5e626e724\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.386098 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24ec3037-277a-454c-b807-ceb5e626e724-etc-machine-id\") pod \"24ec3037-277a-454c-b807-ceb5e626e724\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.386130 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-config-data\") pod \"24ec3037-277a-454c-b807-ceb5e626e724\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.386214 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24ec3037-277a-454c-b807-ceb5e626e724-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "24ec3037-277a-454c-b807-ceb5e626e724" (UID: "24ec3037-277a-454c-b807-ceb5e626e724"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.386348 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-combined-ca-bundle\") pod \"24ec3037-277a-454c-b807-ceb5e626e724\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.386405 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdtnr\" (UniqueName: \"kubernetes.io/projected/24ec3037-277a-454c-b807-ceb5e626e724-kube-api-access-tdtnr\") pod \"24ec3037-277a-454c-b807-ceb5e626e724\" (UID: \"24ec3037-277a-454c-b807-ceb5e626e724\") " Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.387531 4747 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24ec3037-277a-454c-b807-ceb5e626e724-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.392488 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "24ec3037-277a-454c-b807-ceb5e626e724" (UID: "24ec3037-277a-454c-b807-ceb5e626e724"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.392693 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24ec3037-277a-454c-b807-ceb5e626e724-kube-api-access-tdtnr" (OuterVolumeSpecName: "kube-api-access-tdtnr") pod "24ec3037-277a-454c-b807-ceb5e626e724" (UID: "24ec3037-277a-454c-b807-ceb5e626e724"). InnerVolumeSpecName "kube-api-access-tdtnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.404770 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-scripts" (OuterVolumeSpecName: "scripts") pod "24ec3037-277a-454c-b807-ceb5e626e724" (UID: "24ec3037-277a-454c-b807-ceb5e626e724"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.424399 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24ec3037-277a-454c-b807-ceb5e626e724" (UID: "24ec3037-277a-454c-b807-ceb5e626e724"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.431317 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-config-data" (OuterVolumeSpecName: "config-data") pod "24ec3037-277a-454c-b807-ceb5e626e724" (UID: "24ec3037-277a-454c-b807-ceb5e626e724"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.490717 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.490753 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.490763 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.490777 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdtnr\" (UniqueName: \"kubernetes.io/projected/24ec3037-277a-454c-b807-ceb5e626e724-kube-api-access-tdtnr\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.490789 4747 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24ec3037-277a-454c-b807-ceb5e626e724-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.869624 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rmcvl" event={"ID":"24ec3037-277a-454c-b807-ceb5e626e724","Type":"ContainerDied","Data":"3c1d282941cae44b2658300e4c1257f7c3f6d5cdcab1e62ef316f4879260c8c3"} Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.869673 4747 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="3c1d282941cae44b2658300e4c1257f7c3f6d5cdcab1e62ef316f4879260c8c3" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.869668 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rmcvl" Dec 05 21:01:49 crc kubenswrapper[4747]: I1205 21:01:49.869905 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.245542 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 21:01:50 crc kubenswrapper[4747]: E1205 21:01:50.246214 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24ec3037-277a-454c-b807-ceb5e626e724" containerName="cinder-db-sync" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.246226 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="24ec3037-277a-454c-b807-ceb5e626e724" containerName="cinder-db-sync" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.246449 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="24ec3037-277a-454c-b807-ceb5e626e724" containerName="cinder-db-sync" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.248193 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.252275 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.252536 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fck4g" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.252805 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.252924 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.257724 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65dd957765-c6rkl"] Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.258953 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65dd957765-c6rkl" podUID="617e0144-2c3d-4b9d-9fef-d976b607b1cc" containerName="dnsmasq-dns" containerID="cri-o://df61ca7201a9de81e0bc350788b8b187ab8d6915dc828fd7135839c8a8fb466e" gracePeriod=10 Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.269760 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.288931 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.307863 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.307910 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-config-data\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.307998 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g45x6\" (UniqueName: \"kubernetes.io/projected/1ca0ec26-7cb4-4a0b-83ad-82392245d162-kube-api-access-g45x6\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.308042 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.308082 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ca0ec26-7cb4-4a0b-83ad-82392245d162-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.308100 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-scripts\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.341750 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-rtlsm"] Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.345045 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.371707 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-rtlsm"] Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.375713 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-65dd957765-c6rkl" podUID="617e0144-2c3d-4b9d-9fef-d976b607b1cc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.156:5353: connect: connection refused" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.409221 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.409262 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-ovsdbserver-nb\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.409290 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-config-data\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.409305 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-ovsdbserver-sb\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.409401 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g45x6\" (UniqueName: \"kubernetes.io/projected/1ca0ec26-7cb4-4a0b-83ad-82392245d162-kube-api-access-g45x6\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.409422 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.409447 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6dzj\" (UniqueName: \"kubernetes.io/projected/b96dbb40-2f15-49dc-afc0-82c16301d001-kube-api-access-q6dzj\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.409479 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-dns-swift-storage-0\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.409512 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-config\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.409533 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ca0ec26-7cb4-4a0b-83ad-82392245d162-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.409563 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-scripts\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.414056 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-dns-svc\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.414823 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ca0ec26-7cb4-4a0b-83ad-82392245d162-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.422429 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-scripts\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.423395 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-config-data\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.446261 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g45x6\" (UniqueName: \"kubernetes.io/projected/1ca0ec26-7cb4-4a0b-83ad-82392245d162-kube-api-access-g45x6\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.451383 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc 
kubenswrapper[4747]: I1205 21:01:50.455076 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.520116 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-dns-swift-storage-0\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.520175 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-config\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.520209 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-dns-svc\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.520254 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-ovsdbserver-nb\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.520281 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-ovsdbserver-sb\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.520372 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6dzj\" (UniqueName: \"kubernetes.io/projected/b96dbb40-2f15-49dc-afc0-82c16301d001-kube-api-access-q6dzj\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.522782 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-dns-swift-storage-0\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.522825 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-dns-svc\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.523521 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-ovsdbserver-sb\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.525050 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-ovsdbserver-nb\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.548827 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-config\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.570050 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.583197 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.584710 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6dzj\" (UniqueName: \"kubernetes.io/projected/b96dbb40-2f15-49dc-afc0-82c16301d001-kube-api-access-q6dzj\") pod \"dnsmasq-dns-5c77d8b67c-rtlsm\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.624119 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.625055 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.634688 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.739923 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92dfd\" (UniqueName: \"kubernetes.io/projected/1f7e01f5-9081-41be-a57c-538a0fce4a0a-kube-api-access-92dfd\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.739970 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-config-data\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.739998 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f7e01f5-9081-41be-a57c-538a0fce4a0a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.740023 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-config-data-custom\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.740069 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.740096 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-scripts\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.740152 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f7e01f5-9081-41be-a57c-538a0fce4a0a-logs\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.848767 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92dfd\" (UniqueName: \"kubernetes.io/projected/1f7e01f5-9081-41be-a57c-538a0fce4a0a-kube-api-access-92dfd\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.848817 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-config-data\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 
crc kubenswrapper[4747]: I1205 21:01:50.848854 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f7e01f5-9081-41be-a57c-538a0fce4a0a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.848887 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-config-data-custom\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.848907 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.848926 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-scripts\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.848961 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f7e01f5-9081-41be-a57c-538a0fce4a0a-logs\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.849386 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f7e01f5-9081-41be-a57c-538a0fce4a0a-logs\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.849687 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f7e01f5-9081-41be-a57c-538a0fce4a0a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.853238 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-scripts\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.854004 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.857067 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-config-data-custom\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.861465 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-config-data\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.871135 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92dfd\" (UniqueName: \"kubernetes.io/projected/1f7e01f5-9081-41be-a57c-538a0fce4a0a-kube-api-access-92dfd\") pod \"cinder-api-0\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.882966 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.952841 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.965791 4747 generic.go:334] "Generic (PLEG): container finished" podID="617e0144-2c3d-4b9d-9fef-d976b607b1cc" containerID="df61ca7201a9de81e0bc350788b8b187ab8d6915dc828fd7135839c8a8fb466e" exitCode=0 Dec 05 21:01:50 crc kubenswrapper[4747]: I1205 21:01:50.966226 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65dd957765-c6rkl" event={"ID":"617e0144-2c3d-4b9d-9fef-d976b607b1cc","Type":"ContainerDied","Data":"df61ca7201a9de81e0bc350788b8b187ab8d6915dc828fd7135839c8a8fb466e"} Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.168841 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.267159 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-ovsdbserver-sb\") pod \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.267231 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt49m\" (UniqueName: \"kubernetes.io/projected/617e0144-2c3d-4b9d-9fef-d976b607b1cc-kube-api-access-qt49m\") pod \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.267315 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-config\") pod \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.267352 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-dns-svc\") pod \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.267389 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-ovsdbserver-nb\") pod \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 
21:01:51.267439 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-dns-swift-storage-0\") pod \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\" (UID: \"617e0144-2c3d-4b9d-9fef-d976b607b1cc\") " Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.274014 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/617e0144-2c3d-4b9d-9fef-d976b607b1cc-kube-api-access-qt49m" (OuterVolumeSpecName: "kube-api-access-qt49m") pod "617e0144-2c3d-4b9d-9fef-d976b607b1cc" (UID: "617e0144-2c3d-4b9d-9fef-d976b607b1cc"). InnerVolumeSpecName "kube-api-access-qt49m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.334441 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-config" (OuterVolumeSpecName: "config") pod "617e0144-2c3d-4b9d-9fef-d976b607b1cc" (UID: "617e0144-2c3d-4b9d-9fef-d976b607b1cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.335518 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "617e0144-2c3d-4b9d-9fef-d976b607b1cc" (UID: "617e0144-2c3d-4b9d-9fef-d976b607b1cc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.364118 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "617e0144-2c3d-4b9d-9fef-d976b607b1cc" (UID: "617e0144-2c3d-4b9d-9fef-d976b607b1cc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.367038 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "617e0144-2c3d-4b9d-9fef-d976b607b1cc" (UID: "617e0144-2c3d-4b9d-9fef-d976b607b1cc"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.369399 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt49m\" (UniqueName: \"kubernetes.io/projected/617e0144-2c3d-4b9d-9fef-d976b607b1cc-kube-api-access-qt49m\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.369415 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.369424 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.369432 4747 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.369441 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.377101 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "617e0144-2c3d-4b9d-9fef-d976b607b1cc" (UID: "617e0144-2c3d-4b9d-9fef-d976b607b1cc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.391015 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.471729 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/617e0144-2c3d-4b9d-9fef-d976b607b1cc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.585111 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 21:01:51 crc kubenswrapper[4747]: W1205 21:01:51.589168 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f7e01f5_9081_41be_a57c_538a0fce4a0a.slice/crio-8a0ceb0052ab85d06874f72c1267cca9b7be167f1eca5a332d44a9b8c0e36ab4 WatchSource:0}: Error finding container 8a0ceb0052ab85d06874f72c1267cca9b7be167f1eca5a332d44a9b8c0e36ab4: Status 404 returned error can't find the container with id 8a0ceb0052ab85d06874f72c1267cca9b7be167f1eca5a332d44a9b8c0e36ab4 Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.734065 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-rtlsm"] Dec 05 21:01:51 crc kubenswrapper[4747]: I1205 21:01:51.858927 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 21:01:52 crc kubenswrapper[4747]: I1205 21:01:52.007770 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"1ca0ec26-7cb4-4a0b-83ad-82392245d162","Type":"ContainerStarted","Data":"6e6feba458a64b2ad57e9a62a08a216c25f87330c4daed332beb311ba73cb344"} Dec 05 21:01:52 crc kubenswrapper[4747]: I1205 21:01:52.015837 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65dd957765-c6rkl" event={"ID":"617e0144-2c3d-4b9d-9fef-d976b607b1cc","Type":"ContainerDied","Data":"19d820c6974c081c4241ba50aac020db3334cfa3d1f7378efb90dfc1bde3fe82"} Dec 05 21:01:52 crc kubenswrapper[4747]: I1205 21:01:52.015888 4747 scope.go:117] "RemoveContainer" containerID="df61ca7201a9de81e0bc350788b8b187ab8d6915dc828fd7135839c8a8fb466e" Dec 05 21:01:52 crc kubenswrapper[4747]: I1205 21:01:52.016013 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65dd957765-c6rkl" Dec 05 21:01:52 crc kubenswrapper[4747]: I1205 21:01:52.026034 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f7e01f5-9081-41be-a57c-538a0fce4a0a","Type":"ContainerStarted","Data":"8a0ceb0052ab85d06874f72c1267cca9b7be167f1eca5a332d44a9b8c0e36ab4"} Dec 05 21:01:52 crc kubenswrapper[4747]: I1205 21:01:52.046227 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65dd957765-c6rkl"] Dec 05 21:01:52 crc kubenswrapper[4747]: I1205 21:01:52.048462 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" event={"ID":"b96dbb40-2f15-49dc-afc0-82c16301d001","Type":"ContainerStarted","Data":"9a5f11613bb63c2b64be019d8de1a3cf9fe38ac202de3dd2cd4ec75cc8d5763c"} Dec 05 21:01:52 crc kubenswrapper[4747]: I1205 21:01:52.062098 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65dd957765-c6rkl"] Dec 05 21:01:52 crc kubenswrapper[4747]: I1205 21:01:52.179426 4747 scope.go:117] "RemoveContainer" containerID="0ea48b1283cad72e00dc77a8390ee5a66b0c45bfc28e53a617d6cc8c8642958a" Dec 05 21:01:53 crc kubenswrapper[4747]: I1205 21:01:53.073754 4747 generic.go:334] "Generic (PLEG): container finished" podID="b96dbb40-2f15-49dc-afc0-82c16301d001" containerID="549bbca35b68b669a2a63c7af2ef32e95b5355f7d9072b9c515f186a60923d82" exitCode=0 Dec 05 21:01:53 crc kubenswrapper[4747]: I1205 21:01:53.074236 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" event={"ID":"b96dbb40-2f15-49dc-afc0-82c16301d001","Type":"ContainerDied","Data":"549bbca35b68b669a2a63c7af2ef32e95b5355f7d9072b9c515f186a60923d82"} Dec 05 21:01:53 crc kubenswrapper[4747]: I1205 21:01:53.119292 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f7e01f5-9081-41be-a57c-538a0fce4a0a","Type":"ContainerStarted","Data":"268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506"} Dec 05 21:01:53 crc kubenswrapper[4747]: I1205 21:01:53.272153 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:53 crc kubenswrapper[4747]: I1205 21:01:53.286166 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 21:01:53 crc kubenswrapper[4747]: I1205 21:01:53.329716 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:53 crc kubenswrapper[4747]: I1205 21:01:53.865681 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="617e0144-2c3d-4b9d-9fef-d976b607b1cc" 
path="/var/lib/kubelet/pods/617e0144-2c3d-4b9d-9fef-d976b607b1cc/volumes"
Dec 05 21:01:54 crc kubenswrapper[4747]: I1205 21:01:54.146455 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" event={"ID":"b96dbb40-2f15-49dc-afc0-82c16301d001","Type":"ContainerStarted","Data":"3d5965befac25c9837670863a3890d1db8b15184f520c9da7c335a08876ca1dd"}
Dec 05 21:01:54 crc kubenswrapper[4747]: I1205 21:01:54.147796 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm"
Dec 05 21:01:54 crc kubenswrapper[4747]: I1205 21:01:54.149052 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1ca0ec26-7cb4-4a0b-83ad-82392245d162","Type":"ContainerStarted","Data":"957dfd91459e98ff16cab71b4e5854ca0457438d8b79ef2c2c457ed49238cd60"}
Dec 05 21:01:54 crc kubenswrapper[4747]: I1205 21:01:54.154501 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1f7e01f5-9081-41be-a57c-538a0fce4a0a" containerName="cinder-api-log" containerID="cri-o://268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506" gracePeriod=30
Dec 05 21:01:54 crc kubenswrapper[4747]: I1205 21:01:54.154713 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f7e01f5-9081-41be-a57c-538a0fce4a0a","Type":"ContainerStarted","Data":"a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897"}
Dec 05 21:01:54 crc kubenswrapper[4747]: I1205 21:01:54.154755 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Dec 05 21:01:54 crc kubenswrapper[4747]: I1205 21:01:54.154786 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1f7e01f5-9081-41be-a57c-538a0fce4a0a" containerName="cinder-api" containerID="cri-o://a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897" gracePeriod=30
Dec 05 21:01:54 crc kubenswrapper[4747]: I1205 21:01:54.200799 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" podStartSLOduration=4.200775321 podStartE2EDuration="4.200775321s" podCreationTimestamp="2025-12-05 21:01:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:54.16926379 +0000 UTC m=+1184.636571278" watchObservedRunningTime="2025-12-05 21:01:54.200775321 +0000 UTC m=+1184.668082819"
Dec 05 21:01:54 crc kubenswrapper[4747]: I1205 21:01:54.218192 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.218169007 podStartE2EDuration="4.218169007s" podCreationTimestamp="2025-12-05 21:01:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:54.198681278 +0000 UTC m=+1184.665988776" watchObservedRunningTime="2025-12-05 21:01:54.218169007 +0000 UTC m=+1184.685476495"
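
Note: the two pod_startup_latency_tracker records above make the SLO arithmetic visible. podStartSLOduration is simply observedRunningTime minus podCreationTimestamp (both pods were created at 21:01:50 and seen running at 21:01:54); the zero-valued firstStartedPulling/lastFinishedPulling mean no image pull was charged against the SLO; and the m=+1184.6... suffixes are Go monotonic-clock readings, so a difference of two of them is immune to wall-clock jumps. A small sketch, standard library only, reproducing the cinder-api-0 figure:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Values copied from the "Observed pod startup duration" record
	// for openstack/cinder-api-0 above.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-12-05 21:01:50 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-12-05 21:01:54.218169007 +0000 UTC")
	if err != nil {
		panic(err)
	}

	// Prints 4.218169007s, matching podStartSLOduration=4.218169007.
	fmt.Println("startup duration:", observed.Sub(created))

	// The m=+ monotonic offsets subtract the same way; from the two
	// watchObservedRunningTime readings above: 0.017393676s apart.
	fmt.Printf("between records: %.9fs\n", 1184.685476495-1184.668082819)
}
```
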
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.071684 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.179421 4747 generic.go:334] "Generic (PLEG): container finished" podID="1f7e01f5-9081-41be-a57c-538a0fce4a0a" containerID="a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897" exitCode=0
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.179460 4747 generic.go:334] "Generic (PLEG): container finished" podID="1f7e01f5-9081-41be-a57c-538a0fce4a0a" containerID="268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506" exitCode=143
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.179478 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.179518 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f7e01f5-9081-41be-a57c-538a0fce4a0a","Type":"ContainerDied","Data":"a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897"}
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.179565 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f7e01f5-9081-41be-a57c-538a0fce4a0a","Type":"ContainerDied","Data":"268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506"}
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.179607 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1f7e01f5-9081-41be-a57c-538a0fce4a0a","Type":"ContainerDied","Data":"8a0ceb0052ab85d06874f72c1267cca9b7be167f1eca5a332d44a9b8c0e36ab4"}
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.179629 4747 scope.go:117] "RemoveContainer" containerID="a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.182356 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1ca0ec26-7cb4-4a0b-83ad-82392245d162","Type":"ContainerStarted","Data":"3a64b019db1e8660e005cd260d7129728a274c253b3f5c5de0359de57acfa85f"}
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.184476 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-combined-ca-bundle\") pod \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") "
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.184574 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f7e01f5-9081-41be-a57c-538a0fce4a0a-logs\") pod \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") "
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.184636 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-config-data\") pod \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") "
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.184735 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92dfd\" (UniqueName: \"kubernetes.io/projected/1f7e01f5-9081-41be-a57c-538a0fce4a0a-kube-api-access-92dfd\") pod \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") "
Dec 05 21:01:55 crc
kubenswrapper[4747]: I1205 21:01:55.184775 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-config-data-custom\") pod \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.184857 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f7e01f5-9081-41be-a57c-538a0fce4a0a-etc-machine-id\") pod \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.184887 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-scripts\") pod \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\" (UID: \"1f7e01f5-9081-41be-a57c-538a0fce4a0a\") " Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.185015 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7e01f5-9081-41be-a57c-538a0fce4a0a-logs" (OuterVolumeSpecName: "logs") pod "1f7e01f5-9081-41be-a57c-538a0fce4a0a" (UID: "1f7e01f5-9081-41be-a57c-538a0fce4a0a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.185244 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f7e01f5-9081-41be-a57c-538a0fce4a0a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1f7e01f5-9081-41be-a57c-538a0fce4a0a" (UID: "1f7e01f5-9081-41be-a57c-538a0fce4a0a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.185331 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f7e01f5-9081-41be-a57c-538a0fce4a0a-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.185351 4747 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f7e01f5-9081-41be-a57c-538a0fce4a0a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.191844 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7e01f5-9081-41be-a57c-538a0fce4a0a-kube-api-access-92dfd" (OuterVolumeSpecName: "kube-api-access-92dfd") pod "1f7e01f5-9081-41be-a57c-538a0fce4a0a" (UID: "1f7e01f5-9081-41be-a57c-538a0fce4a0a"). InnerVolumeSpecName "kube-api-access-92dfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.200729 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-scripts" (OuterVolumeSpecName: "scripts") pod "1f7e01f5-9081-41be-a57c-538a0fce4a0a" (UID: "1f7e01f5-9081-41be-a57c-538a0fce4a0a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.222821 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1f7e01f5-9081-41be-a57c-538a0fce4a0a" (UID: "1f7e01f5-9081-41be-a57c-538a0fce4a0a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.258681 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-config-data" (OuterVolumeSpecName: "config-data") pod "1f7e01f5-9081-41be-a57c-538a0fce4a0a" (UID: "1f7e01f5-9081-41be-a57c-538a0fce4a0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.265298 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f7e01f5-9081-41be-a57c-538a0fce4a0a" (UID: "1f7e01f5-9081-41be-a57c-538a0fce4a0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.286760 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.286794 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.286805 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.286813 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92dfd\" (UniqueName: \"kubernetes.io/projected/1f7e01f5-9081-41be-a57c-538a0fce4a0a-kube-api-access-92dfd\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.286821 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f7e01f5-9081-41be-a57c-538a0fce4a0a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.306018 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.331122 4747 scope.go:117] "RemoveContainer" containerID="268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.343451 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.462374204 podStartE2EDuration="5.343433237s" podCreationTimestamp="2025-12-05 21:01:50 +0000 UTC" firstStartedPulling="2025-12-05 21:01:51.396742945 +0000 UTC m=+1181.864050433" lastFinishedPulling="2025-12-05 21:01:52.277801978 +0000 UTC m=+1182.745109466" 
observedRunningTime="2025-12-05 21:01:55.211916207 +0000 UTC m=+1185.679223695" watchObservedRunningTime="2025-12-05 21:01:55.343433237 +0000 UTC m=+1185.810740725"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.352299 4747 scope.go:117] "RemoveContainer" containerID="a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897"
Dec 05 21:01:55 crc kubenswrapper[4747]: E1205 21:01:55.352802 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897\": container with ID starting with a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897 not found: ID does not exist" containerID="a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.352907 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897"} err="failed to get container status \"a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897\": rpc error: code = NotFound desc = could not find container \"a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897\": container with ID starting with a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897 not found: ID does not exist"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.352984 4747 scope.go:117] "RemoveContainer" containerID="268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506"
Dec 05 21:01:55 crc kubenswrapper[4747]: E1205 21:01:55.356032 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506\": container with ID starting with 268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506 not found: ID does not exist" containerID="268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.356078 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506"} err="failed to get container status \"268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506\": rpc error: code = NotFound desc = could not find container \"268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506\": container with ID starting with 268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506 not found: ID does not exist"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.356106 4747 scope.go:117] "RemoveContainer" containerID="a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.356459 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897"} err="failed to get container status \"a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897\": rpc error: code = NotFound desc = could not find container \"a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897\": container with ID starting with a9bcda7c91dde3e7789fb714d6a73e11567b5aa5ae009e998a4d2e1890a8b897 not found: ID does not exist"
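
Note: the RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" sequence above is noisy but benign: CRI-O had already removed the container, so the status probe comes back NotFound and the deletion can be treated as already done. A sketch of that idempotent-cleanup pattern, assuming the standard gRPC status and codes packages; this is an illustration, not the kubelet's actual code:

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer stands in for a CRI RemoveContainer call; here it always
// answers NotFound, like the cri-o responses logged above.
func removeContainer(id string) error {
	return status.Errorf(codes.NotFound,
		"could not find container %q: ID does not exist", id)
}

// cleanup treats NotFound as success: the container is already gone, and a
// retry loop that insisted on a non-error answer would never terminate.
func cleanup(id string) error {
	err := removeContainer(id)
	if err != nil && status.Code(err) != codes.NotFound {
		return err // a real failure; keep the container on the worklist
	}
	fmt.Println("removed (or already gone):", id)
	return nil
}

func main() {
	if err := cleanup("268a5cdd6001"); err != nil {
		panic(err)
	}
}
```
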
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.356500 4747 scope.go:117] "RemoveContainer" containerID="268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.356850 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506"} err="failed to get container status \"268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506\": rpc error: code = NotFound desc = could not find container \"268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506\": container with ID starting with 268a5cdd60011742a1ba705828bce322b59cf08010b2d8211e5cae3992bb3506 not found: ID does not exist"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.522319 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.530608 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.554783 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Dec 05 21:01:55 crc kubenswrapper[4747]: E1205 21:01:55.555160 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7e01f5-9081-41be-a57c-538a0fce4a0a" containerName="cinder-api-log"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.555176 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7e01f5-9081-41be-a57c-538a0fce4a0a" containerName="cinder-api-log"
Dec 05 21:01:55 crc kubenswrapper[4747]: E1205 21:01:55.555192 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617e0144-2c3d-4b9d-9fef-d976b607b1cc" containerName="init"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.555198 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="617e0144-2c3d-4b9d-9fef-d976b607b1cc" containerName="init"
Dec 05 21:01:55 crc kubenswrapper[4747]: E1205 21:01:55.555213 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617e0144-2c3d-4b9d-9fef-d976b607b1cc" containerName="dnsmasq-dns"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.555218 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="617e0144-2c3d-4b9d-9fef-d976b607b1cc" containerName="dnsmasq-dns"
Dec 05 21:01:55 crc kubenswrapper[4747]: E1205 21:01:55.555234 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7e01f5-9081-41be-a57c-538a0fce4a0a" containerName="cinder-api"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.555240 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7e01f5-9081-41be-a57c-538a0fce4a0a" containerName="cinder-api"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.555412 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7e01f5-9081-41be-a57c-538a0fce4a0a" containerName="cinder-api"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.555429 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7e01f5-9081-41be-a57c-538a0fce4a0a" containerName="cinder-api-log"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.555440 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="617e0144-2c3d-4b9d-9fef-d976b607b1cc" containerName="dnsmasq-dns"
Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.556350 4747 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.561812 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.562047 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.562174 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.574216 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.623107 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.626077 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.671341 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-64876474-nwc47"] Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.671599 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-64876474-nwc47" podUID="8539d901-6027-49b3-9018-f18f763260c7" containerName="barbican-api-log" containerID="cri-o://48d16915b14bf258e2e2ea0934a538dc18c8aaa5a1ad5fd7f78335e7afa917f1" gracePeriod=30 Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.672044 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-64876474-nwc47" podUID="8539d901-6027-49b3-9018-f18f763260c7" containerName="barbican-api" containerID="cri-o://79c9846b3460b979f45bb6dd58d4e45c8f431c03eb72863e97310a16cb1f51bc" gracePeriod=30 Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.702516 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v67vp\" (UniqueName: \"kubernetes.io/projected/62399e6a-577a-4d10-b057-49c4bae7a172-kube-api-access-v67vp\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.702934 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.702958 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-public-tls-certs\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.702980 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62399e6a-577a-4d10-b057-49c4bae7a172-logs\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.703023 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-scripts\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.703039 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-config-data-custom\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.703078 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.703102 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62399e6a-577a-4d10-b057-49c4bae7a172-etc-machine-id\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.703152 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-config-data\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.815144 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-config-data\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.815212 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v67vp\" (UniqueName: \"kubernetes.io/projected/62399e6a-577a-4d10-b057-49c4bae7a172-kube-api-access-v67vp\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.815249 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.815264 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-public-tls-certs\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.815284 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62399e6a-577a-4d10-b057-49c4bae7a172-logs\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 
21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.815318 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-scripts\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.815335 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-config-data-custom\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.815371 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.815398 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62399e6a-577a-4d10-b057-49c4bae7a172-etc-machine-id\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.815494 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62399e6a-577a-4d10-b057-49c4bae7a172-etc-machine-id\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.817190 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62399e6a-577a-4d10-b057-49c4bae7a172-logs\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.820630 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.821488 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-config-data\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.821998 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-public-tls-certs\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.824686 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.826278 
4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-scripts\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.829526 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-config-data-custom\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.837949 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v67vp\" (UniqueName: \"kubernetes.io/projected/62399e6a-577a-4d10-b057-49c4bae7a172-kube-api-access-v67vp\") pod \"cinder-api-0\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " pod="openstack/cinder-api-0" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.861563 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f7e01f5-9081-41be-a57c-538a0fce4a0a" path="/var/lib/kubelet/pods/1f7e01f5-9081-41be-a57c-538a0fce4a0a/volumes" Dec 05 21:01:55 crc kubenswrapper[4747]: I1205 21:01:55.874108 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 21:01:56 crc kubenswrapper[4747]: I1205 21:01:56.203713 4747 generic.go:334] "Generic (PLEG): container finished" podID="8539d901-6027-49b3-9018-f18f763260c7" containerID="48d16915b14bf258e2e2ea0934a538dc18c8aaa5a1ad5fd7f78335e7afa917f1" exitCode=143 Dec 05 21:01:56 crc kubenswrapper[4747]: I1205 21:01:56.203961 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64876474-nwc47" event={"ID":"8539d901-6027-49b3-9018-f18f763260c7","Type":"ContainerDied","Data":"48d16915b14bf258e2e2ea0934a538dc18c8aaa5a1ad5fd7f78335e7afa917f1"} Dec 05 21:01:56 crc kubenswrapper[4747]: I1205 21:01:56.378665 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 21:01:56 crc kubenswrapper[4747]: W1205 21:01:56.379567 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62399e6a_577a_4d10_b057_49c4bae7a172.slice/crio-d0780bb0583af844986dba554f2c952c6bf113147c39f365faca7c60be6fb605 WatchSource:0}: Error finding container d0780bb0583af844986dba554f2c952c6bf113147c39f365faca7c60be6fb605: Status 404 returned error can't find the container with id d0780bb0583af844986dba554f2c952c6bf113147c39f365faca7c60be6fb605 Dec 05 21:01:57 crc kubenswrapper[4747]: I1205 21:01:57.229870 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"62399e6a-577a-4d10-b057-49c4bae7a172","Type":"ContainerStarted","Data":"454cb64b4de071bc5408ffd45978d6472ff63dcb85495e7423c6ead0953090f8"} Dec 05 21:01:57 crc kubenswrapper[4747]: I1205 21:01:57.230211 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"62399e6a-577a-4d10-b057-49c4bae7a172","Type":"ContainerStarted","Data":"d0780bb0583af844986dba554f2c952c6bf113147c39f365faca7c60be6fb605"} Dec 05 21:01:58 crc kubenswrapper[4747]: I1205 21:01:58.241628 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"62399e6a-577a-4d10-b057-49c4bae7a172","Type":"ContainerStarted","Data":"7ad7169da0407cc9fa1fb2412f29f1fddcb02d5a765a42193125782a41cbb169"} Dec 05 21:01:58 crc kubenswrapper[4747]: I1205 21:01:58.242124 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 21:01:58 crc kubenswrapper[4747]: I1205 21:01:58.264422 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.264403417 podStartE2EDuration="3.264403417s" podCreationTimestamp="2025-12-05 21:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:01:58.258700164 +0000 UTC m=+1188.726007652" watchObservedRunningTime="2025-12-05 21:01:58.264403417 +0000 UTC m=+1188.731710905" Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.256348 4747 generic.go:334] "Generic (PLEG): container finished" podID="8539d901-6027-49b3-9018-f18f763260c7" containerID="79c9846b3460b979f45bb6dd58d4e45c8f431c03eb72863e97310a16cb1f51bc" exitCode=0 Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.256535 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64876474-nwc47" event={"ID":"8539d901-6027-49b3-9018-f18f763260c7","Type":"ContainerDied","Data":"79c9846b3460b979f45bb6dd58d4e45c8f431c03eb72863e97310a16cb1f51bc"} Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.256842 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64876474-nwc47" event={"ID":"8539d901-6027-49b3-9018-f18f763260c7","Type":"ContainerDied","Data":"994581ed5cbd13efaeffcc2fe4f3895016f0f787129e307baeaabde214fe2fb0"} Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.256884 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="994581ed5cbd13efaeffcc2fe4f3895016f0f787129e307baeaabde214fe2fb0" Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.317329 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.394268 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-config-data-custom\") pod \"8539d901-6027-49b3-9018-f18f763260c7\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.394362 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnndr\" (UniqueName: \"kubernetes.io/projected/8539d901-6027-49b3-9018-f18f763260c7-kube-api-access-tnndr\") pod \"8539d901-6027-49b3-9018-f18f763260c7\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.394393 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8539d901-6027-49b3-9018-f18f763260c7-logs\") pod \"8539d901-6027-49b3-9018-f18f763260c7\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.394450 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-config-data\") pod \"8539d901-6027-49b3-9018-f18f763260c7\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.394491 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-combined-ca-bundle\") pod \"8539d901-6027-49b3-9018-f18f763260c7\" (UID: \"8539d901-6027-49b3-9018-f18f763260c7\") " Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.396556 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8539d901-6027-49b3-9018-f18f763260c7-logs" (OuterVolumeSpecName: "logs") pod "8539d901-6027-49b3-9018-f18f763260c7" (UID: "8539d901-6027-49b3-9018-f18f763260c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.402826 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8539d901-6027-49b3-9018-f18f763260c7" (UID: "8539d901-6027-49b3-9018-f18f763260c7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.413721 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8539d901-6027-49b3-9018-f18f763260c7-kube-api-access-tnndr" (OuterVolumeSpecName: "kube-api-access-tnndr") pod "8539d901-6027-49b3-9018-f18f763260c7" (UID: "8539d901-6027-49b3-9018-f18f763260c7"). InnerVolumeSpecName "kube-api-access-tnndr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.425340 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8539d901-6027-49b3-9018-f18f763260c7" (UID: "8539d901-6027-49b3-9018-f18f763260c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.470750 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-config-data" (OuterVolumeSpecName: "config-data") pod "8539d901-6027-49b3-9018-f18f763260c7" (UID: "8539d901-6027-49b3-9018-f18f763260c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.496728 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.496772 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnndr\" (UniqueName: \"kubernetes.io/projected/8539d901-6027-49b3-9018-f18f763260c7-kube-api-access-tnndr\") on node \"crc\" DevicePath \"\""
Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.496791 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8539d901-6027-49b3-9018-f18f763260c7-logs\") on node \"crc\" DevicePath \"\""
Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.496807 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 21:01:59 crc kubenswrapper[4747]: I1205 21:01:59.496822 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8539d901-6027-49b3-9018-f18f763260c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
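
Note: "Killing container with a grace period" in the records that follow is the standard two-step stop: the runtime delivers SIGTERM (after any preStop hook), waits up to gracePeriod seconds (10 for the dnsmasq-dns container, 30 for cinder-scheduler), then escalates to SIGKILL. That is also why the log-tailer containers earlier in this journal (cinder-api-log, barbican-api-log) finish with exitCode=143, which is 128+15, termination by SIGTERM, while processes that shut down cleanly report exitCode=0. A minimal Unix-only sketch of the same pattern; illustrative, not the kubelet's implementation:

```go
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

func main() {
	// Stand-in workload; sleep does not trap SIGTERM, so it dies on the
	// first signal, like the containers that exit 143 in this journal.
	cmd := exec.Command("sleep", "300")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	// Step 1: the polite request, mirroring "Killing container with a
	// grace period".
	_ = cmd.Process.Signal(syscall.SIGTERM)

	// Step 2: escalate once the grace period expires (gracePeriod=10 here).
	select {
	case <-done:
	case <-time.After(10 * time.Second):
		_ = cmd.Process.Kill()
		<-done
	}

	ws := cmd.ProcessState.Sys().(syscall.WaitStatus)
	if ws.Signaled() {
		// SIGTERM is 15, so the container-style exit code is 128+15 = 143.
		fmt.Println("exit code:", 128+int(ws.Signal()))
	}
}
```
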
Need to start a new one" pod="openstack/barbican-api-64876474-nwc47" Dec 05 21:02:00 crc kubenswrapper[4747]: I1205 21:02:00.287457 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-64876474-nwc47"] Dec 05 21:02:00 crc kubenswrapper[4747]: I1205 21:02:00.293951 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-64876474-nwc47"] Dec 05 21:02:00 crc kubenswrapper[4747]: I1205 21:02:00.885267 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:02:00 crc kubenswrapper[4747]: I1205 21:02:00.947442 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74cd4f877c-wznpr"] Dec 05 21:02:00 crc kubenswrapper[4747]: I1205 21:02:00.947722 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" podUID="7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd" containerName="dnsmasq-dns" containerID="cri-o://87416f48ac873d0bcb0bd97477550cbdcc3fa51bee5a65cf47096cc9f529f7e7" gracePeriod=10 Dec 05 21:02:00 crc kubenswrapper[4747]: I1205 21:02:00.989435 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.027609 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.280658 4747 generic.go:334] "Generic (PLEG): container finished" podID="7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd" containerID="87416f48ac873d0bcb0bd97477550cbdcc3fa51bee5a65cf47096cc9f529f7e7" exitCode=0 Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.281153 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1ca0ec26-7cb4-4a0b-83ad-82392245d162" containerName="cinder-scheduler" containerID="cri-o://957dfd91459e98ff16cab71b4e5854ca0457438d8b79ef2c2c457ed49238cd60" gracePeriod=30 Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.280747 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" event={"ID":"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd","Type":"ContainerDied","Data":"87416f48ac873d0bcb0bd97477550cbdcc3fa51bee5a65cf47096cc9f529f7e7"} Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.281567 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1ca0ec26-7cb4-4a0b-83ad-82392245d162" containerName="probe" containerID="cri-o://3a64b019db1e8660e005cd260d7129728a274c253b3f5c5de0359de57acfa85f" gracePeriod=30 Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.530611 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.640322 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-config\") pod \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.640432 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-dns-svc\") pod \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.640549 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-dns-swift-storage-0\") pod \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.640594 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz72d\" (UniqueName: \"kubernetes.io/projected/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-kube-api-access-wz72d\") pod \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.640665 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-ovsdbserver-nb\") pod \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.640767 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-ovsdbserver-sb\") pod \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\" (UID: \"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd\") " Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.658735 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-kube-api-access-wz72d" (OuterVolumeSpecName: "kube-api-access-wz72d") pod "7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd" (UID: "7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd"). InnerVolumeSpecName "kube-api-access-wz72d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.701794 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd" (UID: "7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.711640 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-config" (OuterVolumeSpecName: "config") pod "7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd" (UID: "7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.732482 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd" (UID: "7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.744613 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz72d\" (UniqueName: \"kubernetes.io/projected/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-kube-api-access-wz72d\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.744797 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.744855 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.744908 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.751838 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd" (UID: "7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.751869 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd" (UID: "7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.845748 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.846108 4747 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.849399 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8539d901-6027-49b3-9018-f18f763260c7" path="/var/lib/kubelet/pods/8539d901-6027-49b3-9018-f18f763260c7/volumes" Dec 05 21:02:01 crc kubenswrapper[4747]: I1205 21:02:01.862455 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:02:02 crc kubenswrapper[4747]: I1205 21:02:02.074686 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-646c94b678-5975b" Dec 05 21:02:02 crc kubenswrapper[4747]: I1205 21:02:02.129856 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-646c94b678-5975b" Dec 05 21:02:02 crc kubenswrapper[4747]: I1205 21:02:02.290644 4747 generic.go:334] "Generic (PLEG): container finished" podID="1ca0ec26-7cb4-4a0b-83ad-82392245d162" containerID="3a64b019db1e8660e005cd260d7129728a274c253b3f5c5de0359de57acfa85f" exitCode=0 Dec 05 21:02:02 crc kubenswrapper[4747]: I1205 21:02:02.290703 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1ca0ec26-7cb4-4a0b-83ad-82392245d162","Type":"ContainerDied","Data":"3a64b019db1e8660e005cd260d7129728a274c253b3f5c5de0359de57acfa85f"} Dec 05 21:02:02 crc kubenswrapper[4747]: I1205 21:02:02.292565 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" event={"ID":"7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd","Type":"ContainerDied","Data":"e6d1b4dbe39b407287a37548543439ba028b5c4f965386970332d2e050fa8d44"} Dec 05 21:02:02 crc kubenswrapper[4747]: I1205 21:02:02.292730 4747 scope.go:117] "RemoveContainer" containerID="87416f48ac873d0bcb0bd97477550cbdcc3fa51bee5a65cf47096cc9f529f7e7" Dec 05 21:02:02 crc kubenswrapper[4747]: I1205 21:02:02.292603 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74cd4f877c-wznpr" Dec 05 21:02:02 crc kubenswrapper[4747]: I1205 21:02:02.320856 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74cd4f877c-wznpr"] Dec 05 21:02:02 crc kubenswrapper[4747]: I1205 21:02:02.324701 4747 scope.go:117] "RemoveContainer" containerID="b8f28097f0504830cc1291babd60862d87ea1eabd52b29dc6474b4080c8c4480" Dec 05 21:02:02 crc kubenswrapper[4747]: I1205 21:02:02.329883 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74cd4f877c-wznpr"] Dec 05 21:02:02 crc kubenswrapper[4747]: I1205 21:02:02.618370 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:02:03 crc kubenswrapper[4747]: I1205 21:02:03.852594 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd" path="/var/lib/kubelet/pods/7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd/volumes" Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.066009 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.135021 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d57b6c6f6-xw99h"] Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.135447 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d57b6c6f6-xw99h" podUID="9cd492b4-4e42-4aab-973b-95e1a363af96" containerName="neutron-api" containerID="cri-o://aad5cdd236db03ed4a202e85483d55b8de480645e9eb0898981b2270a8f5d212" gracePeriod=30 Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.136011 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d57b6c6f6-xw99h" podUID="9cd492b4-4e42-4aab-973b-95e1a363af96" containerName="neutron-httpd" containerID="cri-o://2209c7fde8e5a3a066d04f834d3a0fb8499763a0ab0c2f3c958e72e0c0fd9ae4" gracePeriod=30 Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.357783 4747 generic.go:334] "Generic (PLEG): container finished" podID="1ca0ec26-7cb4-4a0b-83ad-82392245d162" containerID="957dfd91459e98ff16cab71b4e5854ca0457438d8b79ef2c2c457ed49238cd60" exitCode=0 Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.357823 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1ca0ec26-7cb4-4a0b-83ad-82392245d162","Type":"ContainerDied","Data":"957dfd91459e98ff16cab71b4e5854ca0457438d8b79ef2c2c457ed49238cd60"} Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.791340 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.856022 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-scripts\") pod \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.856091 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g45x6\" (UniqueName: \"kubernetes.io/projected/1ca0ec26-7cb4-4a0b-83ad-82392245d162-kube-api-access-g45x6\") pod \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.856120 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-combined-ca-bundle\") pod \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.856148 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ca0ec26-7cb4-4a0b-83ad-82392245d162-etc-machine-id\") pod \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.856166 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-config-data\") pod \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.856270 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-config-data-custom\") pod \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\" (UID: \"1ca0ec26-7cb4-4a0b-83ad-82392245d162\") " Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.861049 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ca0ec26-7cb4-4a0b-83ad-82392245d162-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1ca0ec26-7cb4-4a0b-83ad-82392245d162" (UID: "1ca0ec26-7cb4-4a0b-83ad-82392245d162"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.865938 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-scripts" (OuterVolumeSpecName: "scripts") pod "1ca0ec26-7cb4-4a0b-83ad-82392245d162" (UID: "1ca0ec26-7cb4-4a0b-83ad-82392245d162"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.866057 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca0ec26-7cb4-4a0b-83ad-82392245d162-kube-api-access-g45x6" (OuterVolumeSpecName: "kube-api-access-g45x6") pod "1ca0ec26-7cb4-4a0b-83ad-82392245d162" (UID: "1ca0ec26-7cb4-4a0b-83ad-82392245d162"). InnerVolumeSpecName "kube-api-access-g45x6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.868832 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1ca0ec26-7cb4-4a0b-83ad-82392245d162" (UID: "1ca0ec26-7cb4-4a0b-83ad-82392245d162"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.947637 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ca0ec26-7cb4-4a0b-83ad-82392245d162" (UID: "1ca0ec26-7cb4-4a0b-83ad-82392245d162"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.958990 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.959021 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.959032 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g45x6\" (UniqueName: \"kubernetes.io/projected/1ca0ec26-7cb4-4a0b-83ad-82392245d162-kube-api-access-g45x6\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.959043 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:05 crc kubenswrapper[4747]: I1205 21:02:05.959051 4747 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ca0ec26-7cb4-4a0b-83ad-82392245d162-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.002222 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-config-data" (OuterVolumeSpecName: "config-data") pod "1ca0ec26-7cb4-4a0b-83ad-82392245d162" (UID: "1ca0ec26-7cb4-4a0b-83ad-82392245d162"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.061071 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca0ec26-7cb4-4a0b-83ad-82392245d162-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.370919 4747 generic.go:334] "Generic (PLEG): container finished" podID="9cd492b4-4e42-4aab-973b-95e1a363af96" containerID="2209c7fde8e5a3a066d04f834d3a0fb8499763a0ab0c2f3c958e72e0c0fd9ae4" exitCode=0 Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.370993 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d57b6c6f6-xw99h" event={"ID":"9cd492b4-4e42-4aab-973b-95e1a363af96","Type":"ContainerDied","Data":"2209c7fde8e5a3a066d04f834d3a0fb8499763a0ab0c2f3c958e72e0c0fd9ae4"} Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.373380 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1ca0ec26-7cb4-4a0b-83ad-82392245d162","Type":"ContainerDied","Data":"6e6feba458a64b2ad57e9a62a08a216c25f87330c4daed332beb311ba73cb344"} Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.373438 4747 scope.go:117] "RemoveContainer" containerID="3a64b019db1e8660e005cd260d7129728a274c253b3f5c5de0359de57acfa85f" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.373442 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.410773 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.411513 4747 scope.go:117] "RemoveContainer" containerID="957dfd91459e98ff16cab71b4e5854ca0457438d8b79ef2c2c457ed49238cd60" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.424707 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.459642 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 21:02:06 crc kubenswrapper[4747]: E1205 21:02:06.460077 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8539d901-6027-49b3-9018-f18f763260c7" containerName="barbican-api" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.460093 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8539d901-6027-49b3-9018-f18f763260c7" containerName="barbican-api" Dec 05 21:02:06 crc kubenswrapper[4747]: E1205 21:02:06.460113 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca0ec26-7cb4-4a0b-83ad-82392245d162" containerName="cinder-scheduler" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.460120 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca0ec26-7cb4-4a0b-83ad-82392245d162" containerName="cinder-scheduler" Dec 05 21:02:06 crc kubenswrapper[4747]: E1205 21:02:06.460134 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd" containerName="dnsmasq-dns" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.460140 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd" containerName="dnsmasq-dns" Dec 05 21:02:06 crc kubenswrapper[4747]: E1205 21:02:06.460162 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca0ec26-7cb4-4a0b-83ad-82392245d162" containerName="probe" Dec 
05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.460168 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca0ec26-7cb4-4a0b-83ad-82392245d162" containerName="probe" Dec 05 21:02:06 crc kubenswrapper[4747]: E1205 21:02:06.460180 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8539d901-6027-49b3-9018-f18f763260c7" containerName="barbican-api-log" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.460186 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8539d901-6027-49b3-9018-f18f763260c7" containerName="barbican-api-log" Dec 05 21:02:06 crc kubenswrapper[4747]: E1205 21:02:06.460196 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd" containerName="init" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.460201 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd" containerName="init" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.460364 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca0ec26-7cb4-4a0b-83ad-82392245d162" containerName="cinder-scheduler" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.460377 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca0ec26-7cb4-4a0b-83ad-82392245d162" containerName="probe" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.460383 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c1e5ae8-3895-4b9a-a9ee-8df5b9ae3bdd" containerName="dnsmasq-dns" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.460397 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8539d901-6027-49b3-9018-f18f763260c7" containerName="barbican-api-log" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.460405 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8539d901-6027-49b3-9018-f18f763260c7" containerName="barbican-api" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.461454 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.467498 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.467797 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.570971 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlgwj\" (UniqueName: \"kubernetes.io/projected/075d1135-1337-43d3-83e1-97b942a03786-kube-api-access-jlgwj\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.571051 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/075d1135-1337-43d3-83e1-97b942a03786-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.571079 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-config-data\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.571153 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.571187 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.571210 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-scripts\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.672817 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlgwj\" (UniqueName: \"kubernetes.io/projected/075d1135-1337-43d3-83e1-97b942a03786-kube-api-access-jlgwj\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.672875 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/075d1135-1337-43d3-83e1-97b942a03786-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.672896 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-config-data\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.672948 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.672969 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.672992 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-scripts\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.681033 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-scripts\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.684058 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/075d1135-1337-43d3-83e1-97b942a03786-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.684382 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-config-data\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.687194 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.689188 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.699749 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlgwj\" (UniqueName: \"kubernetes.io/projected/075d1135-1337-43d3-83e1-97b942a03786-kube-api-access-jlgwj\") pod \"cinder-scheduler-0\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " 
pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.794955 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.800203 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.801600 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.809260 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.809703 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.811279 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-cnbf8" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.818062 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.979333 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a37ca14-4557-4c5c-b8b8-be6776516c83-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " pod="openstack/openstackclient" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.979812 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9a37ca14-4557-4c5c-b8b8-be6776516c83-openstack-config-secret\") pod \"openstackclient\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " pod="openstack/openstackclient" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.979839 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpsc6\" (UniqueName: \"kubernetes.io/projected/9a37ca14-4557-4c5c-b8b8-be6776516c83-kube-api-access-dpsc6\") pod \"openstackclient\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " pod="openstack/openstackclient" Dec 05 21:02:06 crc kubenswrapper[4747]: I1205 21:02:06.979946 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9a37ca14-4557-4c5c-b8b8-be6776516c83-openstack-config\") pod \"openstackclient\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " pod="openstack/openstackclient" Dec 05 21:02:07 crc kubenswrapper[4747]: I1205 21:02:07.087537 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9a37ca14-4557-4c5c-b8b8-be6776516c83-openstack-config\") pod \"openstackclient\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " pod="openstack/openstackclient" Dec 05 21:02:07 crc kubenswrapper[4747]: I1205 21:02:07.088313 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9a37ca14-4557-4c5c-b8b8-be6776516c83-openstack-config\") pod \"openstackclient\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " pod="openstack/openstackclient" Dec 05 21:02:07 crc kubenswrapper[4747]: I1205 
21:02:07.088494 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a37ca14-4557-4c5c-b8b8-be6776516c83-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " pod="openstack/openstackclient" Dec 05 21:02:07 crc kubenswrapper[4747]: I1205 21:02:07.088601 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9a37ca14-4557-4c5c-b8b8-be6776516c83-openstack-config-secret\") pod \"openstackclient\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " pod="openstack/openstackclient" Dec 05 21:02:07 crc kubenswrapper[4747]: I1205 21:02:07.088634 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpsc6\" (UniqueName: \"kubernetes.io/projected/9a37ca14-4557-4c5c-b8b8-be6776516c83-kube-api-access-dpsc6\") pod \"openstackclient\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " pod="openstack/openstackclient" Dec 05 21:02:07 crc kubenswrapper[4747]: I1205 21:02:07.094684 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a37ca14-4557-4c5c-b8b8-be6776516c83-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " pod="openstack/openstackclient" Dec 05 21:02:07 crc kubenswrapper[4747]: I1205 21:02:07.100936 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9a37ca14-4557-4c5c-b8b8-be6776516c83-openstack-config-secret\") pod \"openstackclient\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " pod="openstack/openstackclient" Dec 05 21:02:07 crc kubenswrapper[4747]: I1205 21:02:07.110412 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpsc6\" (UniqueName: \"kubernetes.io/projected/9a37ca14-4557-4c5c-b8b8-be6776516c83-kube-api-access-dpsc6\") pod \"openstackclient\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " pod="openstack/openstackclient" Dec 05 21:02:07 crc kubenswrapper[4747]: I1205 21:02:07.162175 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 21:02:07 crc kubenswrapper[4747]: I1205 21:02:07.195949 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 21:02:07 crc kubenswrapper[4747]: I1205 21:02:07.344523 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 21:02:07 crc kubenswrapper[4747]: I1205 21:02:07.405301 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"075d1135-1337-43d3-83e1-97b942a03786","Type":"ContainerStarted","Data":"a9582eb062301399d5db3aba360512c046df4a6048b500b273acf878fa66b80a"} Dec 05 21:02:07 crc kubenswrapper[4747]: I1205 21:02:07.493253 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 21:02:07 crc kubenswrapper[4747]: I1205 21:02:07.757833 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 21:02:07 crc kubenswrapper[4747]: W1205 21:02:07.776838 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a37ca14_4557_4c5c_b8b8_be6776516c83.slice/crio-c9864f20d06da2f773fe032a7c76c5f447555c10de7602a6a5254ea0ee23cc94 WatchSource:0}: Error finding container c9864f20d06da2f773fe032a7c76c5f447555c10de7602a6a5254ea0ee23cc94: Status 404 returned error can't find the container with id c9864f20d06da2f773fe032a7c76c5f447555c10de7602a6a5254ea0ee23cc94 Dec 05 21:02:07 crc kubenswrapper[4747]: I1205 21:02:07.867916 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ca0ec26-7cb4-4a0b-83ad-82392245d162" path="/var/lib/kubelet/pods/1ca0ec26-7cb4-4a0b-83ad-82392245d162/volumes" Dec 05 21:02:08 crc kubenswrapper[4747]: I1205 21:02:08.470068 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 05 21:02:08 crc kubenswrapper[4747]: I1205 21:02:08.473694 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9a37ca14-4557-4c5c-b8b8-be6776516c83","Type":"ContainerStarted","Data":"c9864f20d06da2f773fe032a7c76c5f447555c10de7602a6a5254ea0ee23cc94"} Dec 05 21:02:08 crc kubenswrapper[4747]: I1205 21:02:08.476433 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"075d1135-1337-43d3-83e1-97b942a03786","Type":"ContainerStarted","Data":"f8bc67466c94b3c50d9c65f0be95a52aaf42b7cbc0b7be9125d7111ca8f375c9"} Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.150322 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.238209 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4f87\" (UniqueName: \"kubernetes.io/projected/9cd492b4-4e42-4aab-973b-95e1a363af96-kube-api-access-l4f87\") pod \"9cd492b4-4e42-4aab-973b-95e1a363af96\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.238345 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-ovndb-tls-certs\") pod \"9cd492b4-4e42-4aab-973b-95e1a363af96\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.238596 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-config\") pod \"9cd492b4-4e42-4aab-973b-95e1a363af96\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.238667 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-httpd-config\") pod \"9cd492b4-4e42-4aab-973b-95e1a363af96\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.238699 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-combined-ca-bundle\") pod \"9cd492b4-4e42-4aab-973b-95e1a363af96\" (UID: \"9cd492b4-4e42-4aab-973b-95e1a363af96\") " Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.249653 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd492b4-4e42-4aab-973b-95e1a363af96-kube-api-access-l4f87" (OuterVolumeSpecName: "kube-api-access-l4f87") pod "9cd492b4-4e42-4aab-973b-95e1a363af96" (UID: "9cd492b4-4e42-4aab-973b-95e1a363af96"). InnerVolumeSpecName "kube-api-access-l4f87". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.262753 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "9cd492b4-4e42-4aab-973b-95e1a363af96" (UID: "9cd492b4-4e42-4aab-973b-95e1a363af96"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.320484 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cd492b4-4e42-4aab-973b-95e1a363af96" (UID: "9cd492b4-4e42-4aab-973b-95e1a363af96"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.341827 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.341857 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.341869 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4f87\" (UniqueName: \"kubernetes.io/projected/9cd492b4-4e42-4aab-973b-95e1a363af96-kube-api-access-l4f87\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.360717 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "9cd492b4-4e42-4aab-973b-95e1a363af96" (UID: "9cd492b4-4e42-4aab-973b-95e1a363af96"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.404306 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-config" (OuterVolumeSpecName: "config") pod "9cd492b4-4e42-4aab-973b-95e1a363af96" (UID: "9cd492b4-4e42-4aab-973b-95e1a363af96"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.456408 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.456453 4747 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd492b4-4e42-4aab-973b-95e1a363af96-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.524098 4747 generic.go:334] "Generic (PLEG): container finished" podID="9cd492b4-4e42-4aab-973b-95e1a363af96" containerID="aad5cdd236db03ed4a202e85483d55b8de480645e9eb0898981b2270a8f5d212" exitCode=0 Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.524264 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d57b6c6f6-xw99h" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.524966 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d57b6c6f6-xw99h" event={"ID":"9cd492b4-4e42-4aab-973b-95e1a363af96","Type":"ContainerDied","Data":"aad5cdd236db03ed4a202e85483d55b8de480645e9eb0898981b2270a8f5d212"} Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.525017 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d57b6c6f6-xw99h" event={"ID":"9cd492b4-4e42-4aab-973b-95e1a363af96","Type":"ContainerDied","Data":"800d6aa5407b264351815e04e7751f094f96b1e2f1ef764dddcb632a1d335700"} Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.525034 4747 scope.go:117] "RemoveContainer" containerID="2209c7fde8e5a3a066d04f834d3a0fb8499763a0ab0c2f3c958e72e0c0fd9ae4" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.543390 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"075d1135-1337-43d3-83e1-97b942a03786","Type":"ContainerStarted","Data":"b7d08e884c92fb4e03551bbd9a3a91bac164d801a2b9b5ac7e804cc1a663aa21"} Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.579267 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d57b6c6f6-xw99h"] Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.579958 4747 scope.go:117] "RemoveContainer" containerID="aad5cdd236db03ed4a202e85483d55b8de480645e9eb0898981b2270a8f5d212" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.592481 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5d57b6c6f6-xw99h"] Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.612824 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.612806138 podStartE2EDuration="3.612806138s" podCreationTimestamp="2025-12-05 21:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:02:09.606325356 +0000 UTC m=+1200.073632834" watchObservedRunningTime="2025-12-05 21:02:09.612806138 +0000 UTC m=+1200.080113626" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.615318 4747 scope.go:117] "RemoveContainer" containerID="2209c7fde8e5a3a066d04f834d3a0fb8499763a0ab0c2f3c958e72e0c0fd9ae4" Dec 05 21:02:09 crc kubenswrapper[4747]: E1205 21:02:09.615917 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2209c7fde8e5a3a066d04f834d3a0fb8499763a0ab0c2f3c958e72e0c0fd9ae4\": container with ID starting with 2209c7fde8e5a3a066d04f834d3a0fb8499763a0ab0c2f3c958e72e0c0fd9ae4 not found: ID does not exist" containerID="2209c7fde8e5a3a066d04f834d3a0fb8499763a0ab0c2f3c958e72e0c0fd9ae4" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.615961 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2209c7fde8e5a3a066d04f834d3a0fb8499763a0ab0c2f3c958e72e0c0fd9ae4"} err="failed to get container status \"2209c7fde8e5a3a066d04f834d3a0fb8499763a0ab0c2f3c958e72e0c0fd9ae4\": rpc error: code = NotFound desc = could not find container \"2209c7fde8e5a3a066d04f834d3a0fb8499763a0ab0c2f3c958e72e0c0fd9ae4\": container with ID starting with 2209c7fde8e5a3a066d04f834d3a0fb8499763a0ab0c2f3c958e72e0c0fd9ae4 not found: ID does not exist" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.615982 4747 scope.go:117] 
"RemoveContainer" containerID="aad5cdd236db03ed4a202e85483d55b8de480645e9eb0898981b2270a8f5d212" Dec 05 21:02:09 crc kubenswrapper[4747]: E1205 21:02:09.616859 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aad5cdd236db03ed4a202e85483d55b8de480645e9eb0898981b2270a8f5d212\": container with ID starting with aad5cdd236db03ed4a202e85483d55b8de480645e9eb0898981b2270a8f5d212 not found: ID does not exist" containerID="aad5cdd236db03ed4a202e85483d55b8de480645e9eb0898981b2270a8f5d212" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.616880 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad5cdd236db03ed4a202e85483d55b8de480645e9eb0898981b2270a8f5d212"} err="failed to get container status \"aad5cdd236db03ed4a202e85483d55b8de480645e9eb0898981b2270a8f5d212\": rpc error: code = NotFound desc = could not find container \"aad5cdd236db03ed4a202e85483d55b8de480645e9eb0898981b2270a8f5d212\": container with ID starting with aad5cdd236db03ed4a202e85483d55b8de480645e9eb0898981b2270a8f5d212 not found: ID does not exist" Dec 05 21:02:09 crc kubenswrapper[4747]: I1205 21:02:09.856545 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cd492b4-4e42-4aab-973b-95e1a363af96" path="/var/lib/kubelet/pods/9cd492b4-4e42-4aab-973b-95e1a363af96/volumes" Dec 05 21:02:11 crc kubenswrapper[4747]: I1205 21:02:11.795414 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.124009 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-77d8776c8f-zkk2b"] Dec 05 21:02:12 crc kubenswrapper[4747]: E1205 21:02:12.124358 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd492b4-4e42-4aab-973b-95e1a363af96" containerName="neutron-httpd" Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.124376 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd492b4-4e42-4aab-973b-95e1a363af96" containerName="neutron-httpd" Dec 05 21:02:12 crc kubenswrapper[4747]: E1205 21:02:12.124399 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd492b4-4e42-4aab-973b-95e1a363af96" containerName="neutron-api" Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.124405 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd492b4-4e42-4aab-973b-95e1a363af96" containerName="neutron-api" Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.124653 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd492b4-4e42-4aab-973b-95e1a363af96" containerName="neutron-api" Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.124698 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd492b4-4e42-4aab-973b-95e1a363af96" containerName="neutron-httpd" Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.126717 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.130016 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.130135 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.130219 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.136688 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-77d8776c8f-zkk2b"]
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.219411 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.219713 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="ceilometer-central-agent" containerID="cri-o://159e15cf0f4bae5b6e4953923e37d9c95cd1de3fe1ff3af2797a98447a2358c8" gracePeriod=30
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.219830 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="sg-core" containerID="cri-o://8da4a5e81776bc3fdbb9ca26640f1c27486505b16bd5fabadf6d2b96c6aadb0d" gracePeriod=30
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.219846 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="ceilometer-notification-agent" containerID="cri-o://4dfc90b1a548dce591161e0499ca0a5c0bf6c2c54a96bca69d5b7f7c8f7124d9" gracePeriod=30
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.220018 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="proxy-httpd" containerID="cri-o://3da8fe2161e2c649bc0e9d49150d3f9f8eee33a84e00f171b9e318d6bb0a574c" gracePeriod=30
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.232351 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.328502 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-log-httpd\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.328602 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-run-httpd\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.328671 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-etc-swift\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.328704 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-config-data\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.328757 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hr4z\" (UniqueName: \"kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-kube-api-access-9hr4z\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.328806 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-combined-ca-bundle\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.328834 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-public-tls-certs\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.328878 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-internal-tls-certs\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.430593 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-config-data\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.430671 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hr4z\" (UniqueName: \"kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-kube-api-access-9hr4z\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.430705 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-combined-ca-bundle\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.430729 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-public-tls-certs\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.430770 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-internal-tls-certs\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.430839 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-log-httpd\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.430899 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-run-httpd\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.430947 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-etc-swift\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.432070 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-log-httpd\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.432549 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-run-httpd\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.437711 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-internal-tls-certs\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.440105 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-etc-swift\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.440203 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-combined-ca-bundle\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.442307 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-public-tls-certs\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.445443 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-config-data\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.454399 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hr4z\" (UniqueName: \"kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-kube-api-access-9hr4z\") pod \"swift-proxy-77d8776c8f-zkk2b\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.577348 4747 generic.go:334] "Generic (PLEG): container finished" podID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerID="3da8fe2161e2c649bc0e9d49150d3f9f8eee33a84e00f171b9e318d6bb0a574c" exitCode=0
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.577382 4747 generic.go:334] "Generic (PLEG): container finished" podID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerID="8da4a5e81776bc3fdbb9ca26640f1c27486505b16bd5fabadf6d2b96c6aadb0d" exitCode=2
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.577405 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c2c67f2-cc1d-453c-b475-bd421b33d0e0","Type":"ContainerDied","Data":"3da8fe2161e2c649bc0e9d49150d3f9f8eee33a84e00f171b9e318d6bb0a574c"}
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.577430 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c2c67f2-cc1d-453c-b475-bd421b33d0e0","Type":"ContainerDied","Data":"8da4a5e81776bc3fdbb9ca26640f1c27486505b16bd5fabadf6d2b96c6aadb0d"}
Dec 05 21:02:12 crc kubenswrapper[4747]: I1205 21:02:12.747114 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:13 crc kubenswrapper[4747]: I1205 21:02:13.346404 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-77d8776c8f-zkk2b"]
Dec 05 21:02:13 crc kubenswrapper[4747]: I1205 21:02:13.495313 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.158:3000/\": dial tcp 10.217.0.158:3000: connect: connection refused"
Dec 05 21:02:13 crc kubenswrapper[4747]: I1205 21:02:13.594184 4747 generic.go:334] "Generic (PLEG): container finished" podID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerID="159e15cf0f4bae5b6e4953923e37d9c95cd1de3fe1ff3af2797a98447a2358c8" exitCode=0
Dec 05 21:02:13 crc kubenswrapper[4747]: I1205 21:02:13.594234 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c2c67f2-cc1d-453c-b475-bd421b33d0e0","Type":"ContainerDied","Data":"159e15cf0f4bae5b6e4953923e37d9c95cd1de3fe1ff3af2797a98447a2358c8"}
Dec 05 21:02:15 crc kubenswrapper[4747]: I1205 21:02:15.617843 4747 generic.go:334] "Generic (PLEG): container finished" podID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerID="4dfc90b1a548dce591161e0499ca0a5c0bf6c2c54a96bca69d5b7f7c8f7124d9" exitCode=0
Dec 05 21:02:15 crc kubenswrapper[4747]: I1205 21:02:15.618005 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c2c67f2-cc1d-453c-b475-bd421b33d0e0","Type":"ContainerDied","Data":"4dfc90b1a548dce591161e0499ca0a5c0bf6c2c54a96bca69d5b7f7c8f7124d9"}
Dec 05 21:02:17 crc kubenswrapper[4747]: I1205 21:02:17.046776 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Dec 05 21:02:18 crc kubenswrapper[4747]: I1205 21:02:18.785867 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 05 21:02:18 crc kubenswrapper[4747]: I1205 21:02:18.786399 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="29db5b68-f488-4071-9794-a6634a0a301f" containerName="kube-state-metrics" containerID="cri-o://193b092b60f5335687151a89c75b1069ddaf792e8a326c4e3a9527fe7792c6fd" gracePeriod=30
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.572376 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.643896 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.658115 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77d8776c8f-zkk2b" event={"ID":"f5bfe363-0d37-4380-93a8-dc7ea1ad3392","Type":"ContainerStarted","Data":"19016e2b01b2316bb76c719117800a4ec68aa6923aec37cad94c18d149837111"}
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.658207 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77d8776c8f-zkk2b" event={"ID":"f5bfe363-0d37-4380-93a8-dc7ea1ad3392","Type":"ContainerStarted","Data":"aed25399bf16cd8b385ed6b41772900ad4514904a7bede7aa26a69da1fa56f94"}
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.660259 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c2c67f2-cc1d-453c-b475-bd421b33d0e0","Type":"ContainerDied","Data":"998762c3458651b9a90433374f7548f54b64e045c2ff940051461b7075bf057b"}
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.660293 4747 scope.go:117] "RemoveContainer" containerID="3da8fe2161e2c649bc0e9d49150d3f9f8eee33a84e00f171b9e318d6bb0a574c"
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.660399 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.661622 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv789\" (UniqueName: \"kubernetes.io/projected/29db5b68-f488-4071-9794-a6634a0a301f-kube-api-access-nv789\") pod \"29db5b68-f488-4071-9794-a6634a0a301f\" (UID: \"29db5b68-f488-4071-9794-a6634a0a301f\") "
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.662383 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9a37ca14-4557-4c5c-b8b8-be6776516c83","Type":"ContainerStarted","Data":"50b55da7432690c6f59d396246de6b3e8a89b3871ba09c858ae1541632412c92"}
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.664731 4747 generic.go:334] "Generic (PLEG): container finished" podID="29db5b68-f488-4071-9794-a6634a0a301f" containerID="193b092b60f5335687151a89c75b1069ddaf792e8a326c4e3a9527fe7792c6fd" exitCode=2
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.664765 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"29db5b68-f488-4071-9794-a6634a0a301f","Type":"ContainerDied","Data":"193b092b60f5335687151a89c75b1069ddaf792e8a326c4e3a9527fe7792c6fd"}
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.664787 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"29db5b68-f488-4071-9794-a6634a0a301f","Type":"ContainerDied","Data":"e1510622689c7f0e4277052337b1437057484928c8596d03b7e27789695a709c"}
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.664847 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.665191 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29db5b68-f488-4071-9794-a6634a0a301f-kube-api-access-nv789" (OuterVolumeSpecName: "kube-api-access-nv789") pod "29db5b68-f488-4071-9794-a6634a0a301f" (UID: "29db5b68-f488-4071-9794-a6634a0a301f"). InnerVolumeSpecName "kube-api-access-nv789". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.701662 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.314782455 podStartE2EDuration="13.701643792s" podCreationTimestamp="2025-12-05 21:02:06 +0000 UTC" firstStartedPulling="2025-12-05 21:02:07.788742898 +0000 UTC m=+1198.256050386" lastFinishedPulling="2025-12-05 21:02:19.175604235 +0000 UTC m=+1209.642911723" observedRunningTime="2025-12-05 21:02:19.699893758 +0000 UTC m=+1210.167201246" watchObservedRunningTime="2025-12-05 21:02:19.701643792 +0000 UTC m=+1210.168951280"
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.704317 4747 scope.go:117] "RemoveContainer" containerID="8da4a5e81776bc3fdbb9ca26640f1c27486505b16bd5fabadf6d2b96c6aadb0d"
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.741737 4747 scope.go:117] "RemoveContainer" containerID="4dfc90b1a548dce591161e0499ca0a5c0bf6c2c54a96bca69d5b7f7c8f7124d9"
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.762471 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-scripts\") pod \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") "
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.762622 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-log-httpd\") pod \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") "
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.762667 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-sg-core-conf-yaml\") pod \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") "
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.762696 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-run-httpd\") pod \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") "
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.762728 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-config-data\") pod \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") "
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.762781 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqrcj\" (UniqueName: \"kubernetes.io/projected/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-kube-api-access-rqrcj\") pod \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") "
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.762908 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-combined-ca-bundle\") pod \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\" (UID: \"8c2c67f2-cc1d-453c-b475-bd421b33d0e0\") "
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.763320 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv789\" (UniqueName: \"kubernetes.io/projected/29db5b68-f488-4071-9794-a6634a0a301f-kube-api-access-nv789\") on node \"crc\" DevicePath \"\""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.766378 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8c2c67f2-cc1d-453c-b475-bd421b33d0e0" (UID: "8c2c67f2-cc1d-453c-b475-bd421b33d0e0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.766643 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8c2c67f2-cc1d-453c-b475-bd421b33d0e0" (UID: "8c2c67f2-cc1d-453c-b475-bd421b33d0e0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.768170 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-kube-api-access-rqrcj" (OuterVolumeSpecName: "kube-api-access-rqrcj") pod "8c2c67f2-cc1d-453c-b475-bd421b33d0e0" (UID: "8c2c67f2-cc1d-453c-b475-bd421b33d0e0"). InnerVolumeSpecName "kube-api-access-rqrcj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.771015 4747 scope.go:117] "RemoveContainer" containerID="159e15cf0f4bae5b6e4953923e37d9c95cd1de3fe1ff3af2797a98447a2358c8"
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.771674 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-scripts" (OuterVolumeSpecName: "scripts") pod "8c2c67f2-cc1d-453c-b475-bd421b33d0e0" (UID: "8c2c67f2-cc1d-453c-b475-bd421b33d0e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.812715 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8c2c67f2-cc1d-453c-b475-bd421b33d0e0" (UID: "8c2c67f2-cc1d-453c-b475-bd421b33d0e0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.812780 4747 scope.go:117] "RemoveContainer" containerID="193b092b60f5335687151a89c75b1069ddaf792e8a326c4e3a9527fe7792c6fd"
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.834662 4747 scope.go:117] "RemoveContainer" containerID="193b092b60f5335687151a89c75b1069ddaf792e8a326c4e3a9527fe7792c6fd"
Dec 05 21:02:19 crc kubenswrapper[4747]: E1205 21:02:19.836503 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"193b092b60f5335687151a89c75b1069ddaf792e8a326c4e3a9527fe7792c6fd\": container with ID starting with 193b092b60f5335687151a89c75b1069ddaf792e8a326c4e3a9527fe7792c6fd not found: ID does not exist" containerID="193b092b60f5335687151a89c75b1069ddaf792e8a326c4e3a9527fe7792c6fd"
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.836564 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"193b092b60f5335687151a89c75b1069ddaf792e8a326c4e3a9527fe7792c6fd"} err="failed to get container status \"193b092b60f5335687151a89c75b1069ddaf792e8a326c4e3a9527fe7792c6fd\": rpc error: code = NotFound desc = could not find container \"193b092b60f5335687151a89c75b1069ddaf792e8a326c4e3a9527fe7792c6fd\": container with ID starting with 193b092b60f5335687151a89c75b1069ddaf792e8a326c4e3a9527fe7792c6fd not found: ID does not exist"
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.860520 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c2c67f2-cc1d-453c-b475-bd421b33d0e0" (UID: "8c2c67f2-cc1d-453c-b475-bd421b33d0e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.866183 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.872753 4747 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.872861 4747 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.872933 4747 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.873008 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqrcj\" (UniqueName: \"kubernetes.io/projected/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-kube-api-access-rqrcj\") on node \"crc\" DevicePath \"\""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.873067 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.906647 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-config-data" (OuterVolumeSpecName: "config-data") pod "8c2c67f2-cc1d-453c-b475-bd421b33d0e0" (UID: "8c2c67f2-cc1d-453c-b475-bd421b33d0e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.975098 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c2c67f2-cc1d-453c-b475-bd421b33d0e0-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.986783 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 05 21:02:19 crc kubenswrapper[4747]: I1205 21:02:19.994634 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.002353 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.016593 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 05 21:02:20 crc kubenswrapper[4747]: E1205 21:02:20.016986 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="sg-core"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.017003 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="sg-core"
Dec 05 21:02:20 crc kubenswrapper[4747]: E1205 21:02:20.017025 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="ceilometer-central-agent"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.017031 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="ceilometer-central-agent"
Dec 05 21:02:20 crc kubenswrapper[4747]: E1205 21:02:20.017043 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29db5b68-f488-4071-9794-a6634a0a301f" containerName="kube-state-metrics"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.017051 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="29db5b68-f488-4071-9794-a6634a0a301f" containerName="kube-state-metrics"
Dec 05 21:02:20 crc kubenswrapper[4747]: E1205 21:02:20.017072 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="proxy-httpd"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.017078 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="proxy-httpd"
Dec 05 21:02:20 crc kubenswrapper[4747]: E1205 21:02:20.017087 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="ceilometer-notification-agent"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.017092 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="ceilometer-notification-agent"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.017254 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="29db5b68-f488-4071-9794-a6634a0a301f" containerName="kube-state-metrics"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.017271 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="proxy-httpd"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.017281 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="ceilometer-central-agent"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.017295 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="sg-core"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.017303 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" containerName="ceilometer-notification-agent"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.017856 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.020086 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.020359 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-4ccc9"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.020766 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.029415 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.043557 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.074415 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.076490 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.076543 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") " pod="openstack/kube-state-metrics-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.076569 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsghz\" (UniqueName: \"kubernetes.io/projected/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-api-access-tsghz\") pod \"kube-state-metrics-0\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") " pod="openstack/kube-state-metrics-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.076662 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") " pod="openstack/kube-state-metrics-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.076716 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") " pod="openstack/kube-state-metrics-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.078988 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.079274 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.091859 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.178627 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bncq7\" (UniqueName: \"kubernetes.io/projected/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-kube-api-access-bncq7\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.179001 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") " pod="openstack/kube-state-metrics-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.179039 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.179126 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") " pod="openstack/kube-state-metrics-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.179239 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-config-data\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.179289 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-run-httpd\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.179325 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-scripts\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.179504 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.179567 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") " pod="openstack/kube-state-metrics-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.179610 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsghz\" (UniqueName: \"kubernetes.io/projected/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-api-access-tsghz\") pod \"kube-state-metrics-0\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") " pod="openstack/kube-state-metrics-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.179637 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-log-httpd\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.182731 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") " pod="openstack/kube-state-metrics-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.182762 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") " pod="openstack/kube-state-metrics-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.184604 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") " pod="openstack/kube-state-metrics-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.195119 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsghz\" (UniqueName: \"kubernetes.io/projected/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-api-access-tsghz\") pod \"kube-state-metrics-0\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") " pod="openstack/kube-state-metrics-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.281431 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.281516 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-log-httpd\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.281612 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bncq7\" (UniqueName: \"kubernetes.io/projected/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-kube-api-access-bncq7\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.281646 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.281750 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-config-data\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.281787 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-run-httpd\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.281820 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-scripts\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.282353 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-log-httpd\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.282529 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-run-httpd\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.285920 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.286395 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.287149 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-scripts\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.299365 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-config-data\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.302798 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bncq7\" (UniqueName: \"kubernetes.io/projected/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-kube-api-access-bncq7\") pod \"ceilometer-0\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.451253 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.464625 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.694139 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77d8776c8f-zkk2b" event={"ID":"f5bfe363-0d37-4380-93a8-dc7ea1ad3392","Type":"ContainerStarted","Data":"0c9cfa29c3405f6a5527f6d5e246534de9a5f12a953fd035914d4286c56ce569"}
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.694505 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.694525 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-77d8776c8f-zkk2b"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.966994 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-77d8776c8f-zkk2b" podStartSLOduration=8.966974295 podStartE2EDuration="8.966974295s" podCreationTimestamp="2025-12-05 21:02:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:02:20.720953853 +0000 UTC m=+1211.188261341" watchObservedRunningTime="2025-12-05 21:02:20.966974295 +0000 UTC m=+1211.434281783"
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.977595 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 05 21:02:20 crc kubenswrapper[4747]: W1205 21:02:20.980075 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod278246ea_04b9_4694_bb8c_5c67503c17e4.slice/crio-e2e4cecdbccdb96c0495fdf5142054930924e66fad5706bed33bed13173f721b WatchSource:0}: Error finding container e2e4cecdbccdb96c0495fdf5142054930924e66fad5706bed33bed13173f721b: Status 404 returned error can't find the container with id e2e4cecdbccdb96c0495fdf5142054930924e66fad5706bed33bed13173f721b
Dec 05 21:02:20 crc kubenswrapper[4747]: I1205 21:02:20.992088 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 21:02:21 crc kubenswrapper[4747]: W1205 21:02:21.004992 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86843270_f9ed_4e4f_97a5_d6a433f7d5fb.slice/crio-b455c133b9429830c4ed31266e765e10ac3a258608e3bd78968436c09dba1d18 WatchSource:0}: Error finding container b455c133b9429830c4ed31266e765e10ac3a258608e3bd78968436c09dba1d18: Status 404 returned error can't find the container with id b455c133b9429830c4ed31266e765e10ac3a258608e3bd78968436c09dba1d18
Dec 05 21:02:21 crc kubenswrapper[4747]: I1205 21:02:21.478122 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 21:02:21 crc kubenswrapper[4747]: I1205 21:02:21.712809 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"278246ea-04b9-4694-bb8c-5c67503c17e4","Type":"ContainerStarted","Data":"e2e4cecdbccdb96c0495fdf5142054930924e66fad5706bed33bed13173f721b"}
Dec 05 21:02:21 crc kubenswrapper[4747]: I1205 21:02:21.714344 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86843270-f9ed-4e4f-97a5-d6a433f7d5fb","Type":"ContainerStarted","Data":"b455c133b9429830c4ed31266e765e10ac3a258608e3bd78968436c09dba1d18"}
Dec 05 21:02:21 crc kubenswrapper[4747]: I1205 21:02:21.856277 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29db5b68-f488-4071-9794-a6634a0a301f" path="/var/lib/kubelet/pods/29db5b68-f488-4071-9794-a6634a0a301f/volumes"
Dec 05 21:02:21 crc kubenswrapper[4747]: I1205 21:02:21.856980 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c2c67f2-cc1d-453c-b475-bd421b33d0e0" path="/var/lib/kubelet/pods/8c2c67f2-cc1d-453c-b475-bd421b33d0e0/volumes"
Dec 05 21:02:23 crc kubenswrapper[4747]: I1205 21:02:23.730871 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"278246ea-04b9-4694-bb8c-5c67503c17e4","Type":"ContainerStarted","Data":"f2c7f412fda78eb2b77cfb8231215251635941d62951bcccf4806b1cdf12e6fe"}
Dec 05 21:02:23 crc kubenswrapper[4747]: I1205 21:02:23.731395 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 05 21:02:23 crc kubenswrapper[4747]: I1205 21:02:23.733198 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86843270-f9ed-4e4f-97a5-d6a433f7d5fb","Type":"ContainerStarted","Data":"f0f23a162a0a12496d2428ad511ccad3929d0be06d0ab35d62d1db5addc99899"}
Dec 05 21:02:23 crc kubenswrapper[4747]: I1205 21:02:23.751925 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.716701215 podStartE2EDuration="4.751903323s" podCreationTimestamp="2025-12-05 21:02:19 +0000 UTC" firstStartedPulling="2025-12-05 21:02:20.988064955 +0000 UTC m=+1211.455372433" lastFinishedPulling="2025-12-05 21:02:23.023267053 +0000 UTC m=+1213.490574541" observedRunningTime="2025-12-05 21:02:23.744459156 +0000 UTC m=+1214.211766654" watchObservedRunningTime="2025-12-05 21:02:23.751903323 +0000 UTC m=+1214.219210811"
Dec 05 21:02:24 crc kubenswrapper[4747]: I1205 21:02:24.745087 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86843270-f9ed-4e4f-97a5-d6a433f7d5fb","Type":"ContainerStarted","Data":"c8a85e63aaac5a2e0b1cd561359ab8b026136d11e69a5e4ff83eb273fadd4159"}
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.755155 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86843270-f9ed-4e4f-97a5-d6a433f7d5fb","Type":"ContainerStarted","Data":"005d6fd2b48fb212ab713eb292c2d2a209580c88b0dd8e57c1e79a0ccef00b76"}
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.756865 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-6gvhk"]
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.758242 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6gvhk"
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.771163 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6gvhk"]
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.861652 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-5kkvt"]
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.863033 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5kkvt"
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.877593 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5kkvt"]
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.890075 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nh2m\" (UniqueName: \"kubernetes.io/projected/362c3d2b-5405-46e3-aa99-90f7b010dfd3-kube-api-access-5nh2m\") pod \"nova-api-db-create-6gvhk\" (UID: \"362c3d2b-5405-46e3-aa99-90f7b010dfd3\") " pod="openstack/nova-api-db-create-6gvhk"
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.890130 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362c3d2b-5405-46e3-aa99-90f7b010dfd3-operator-scripts\") pod \"nova-api-db-create-6gvhk\" (UID: \"362c3d2b-5405-46e3-aa99-90f7b010dfd3\") " pod="openstack/nova-api-db-create-6gvhk"
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.893931 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-48b7-account-create-update-cf96q"]
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.895036 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-48b7-account-create-update-cf96q"
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.898411 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.919076 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-48b7-account-create-update-cf96q"]
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.981050 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-sv2s2"]
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.982287 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-sv2s2"
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.991509 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6hd8\" (UniqueName: \"kubernetes.io/projected/7ee21b51-8d3a-4942-b004-53d5337da918-kube-api-access-q6hd8\") pod \"nova-cell0-db-create-5kkvt\" (UID: \"7ee21b51-8d3a-4942-b004-53d5337da918\") " pod="openstack/nova-cell0-db-create-5kkvt"
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.991553 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nh2m\" (UniqueName: \"kubernetes.io/projected/362c3d2b-5405-46e3-aa99-90f7b010dfd3-kube-api-access-5nh2m\") pod \"nova-api-db-create-6gvhk\" (UID: \"362c3d2b-5405-46e3-aa99-90f7b010dfd3\") " pod="openstack/nova-api-db-create-6gvhk"
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.991574 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362c3d2b-5405-46e3-aa99-90f7b010dfd3-operator-scripts\") pod \"nova-api-db-create-6gvhk\" (UID: \"362c3d2b-5405-46e3-aa99-90f7b010dfd3\") " pod="openstack/nova-api-db-create-6gvhk"
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.991724 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw7tc\" (UniqueName: \"kubernetes.io/projected/c14b974a-62ea-4e9d-b0e2-9ab4031c35fa-kube-api-access-dw7tc\") pod \"nova-api-48b7-account-create-update-cf96q\" (UID: \"c14b974a-62ea-4e9d-b0e2-9ab4031c35fa\") " pod="openstack/nova-api-48b7-account-create-update-cf96q"
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.991745 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14b974a-62ea-4e9d-b0e2-9ab4031c35fa-operator-scripts\") pod \"nova-api-48b7-account-create-update-cf96q\" (UID: \"c14b974a-62ea-4e9d-b0e2-9ab4031c35fa\") " pod="openstack/nova-api-48b7-account-create-update-cf96q"
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.991777 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ee21b51-8d3a-4942-b004-53d5337da918-operator-scripts\") pod \"nova-cell0-db-create-5kkvt\" (UID: \"7ee21b51-8d3a-4942-b004-53d5337da918\") " pod="openstack/nova-cell0-db-create-5kkvt"
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.992341 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362c3d2b-5405-46e3-aa99-90f7b010dfd3-operator-scripts\") pod \"nova-api-db-create-6gvhk\" (UID: \"362c3d2b-5405-46e3-aa99-90f7b010dfd3\") " pod="openstack/nova-api-db-create-6gvhk"
Dec 05 21:02:25 crc kubenswrapper[4747]: I1205 21:02:25.999303 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-sv2s2"]
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.017254 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nh2m\" (UniqueName: \"kubernetes.io/projected/362c3d2b-5405-46e3-aa99-90f7b010dfd3-kube-api-access-5nh2m\") pod \"nova-api-db-create-6gvhk\" (UID: \"362c3d2b-5405-46e3-aa99-90f7b010dfd3\") " pod="openstack/nova-api-db-create-6gvhk"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.081044 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6gvhk"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.094564 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw7tc\" (UniqueName: \"kubernetes.io/projected/c14b974a-62ea-4e9d-b0e2-9ab4031c35fa-kube-api-access-dw7tc\") pod \"nova-api-48b7-account-create-update-cf96q\" (UID: \"c14b974a-62ea-4e9d-b0e2-9ab4031c35fa\") " pod="openstack/nova-api-48b7-account-create-update-cf96q"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.094627 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14b974a-62ea-4e9d-b0e2-9ab4031c35fa-operator-scripts\") pod \"nova-api-48b7-account-create-update-cf96q\" (UID: \"c14b974a-62ea-4e9d-b0e2-9ab4031c35fa\") " pod="openstack/nova-api-48b7-account-create-update-cf96q"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.094669 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgxwf\" (UniqueName: \"kubernetes.io/projected/39885f1c-db92-4f28-a10c-36998e6fcda8-kube-api-access-bgxwf\") pod \"nova-cell1-db-create-sv2s2\" (UID: \"39885f1c-db92-4f28-a10c-36998e6fcda8\") " pod="openstack/nova-cell1-db-create-sv2s2"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.094686 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ee21b51-8d3a-4942-b004-53d5337da918-operator-scripts\") pod \"nova-cell0-db-create-5kkvt\" (UID: \"7ee21b51-8d3a-4942-b004-53d5337da918\") " pod="openstack/nova-cell0-db-create-5kkvt"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.094716 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6hd8\" (UniqueName: \"kubernetes.io/projected/7ee21b51-8d3a-4942-b004-53d5337da918-kube-api-access-q6hd8\") pod \"nova-cell0-db-create-5kkvt\" (UID: \"7ee21b51-8d3a-4942-b004-53d5337da918\") " pod="openstack/nova-cell0-db-create-5kkvt"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.094769 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39885f1c-db92-4f28-a10c-36998e6fcda8-operator-scripts\") pod \"nova-cell1-db-create-sv2s2\" (UID: \"39885f1c-db92-4f28-a10c-36998e6fcda8\") " pod="openstack/nova-cell1-db-create-sv2s2"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.095602 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14b974a-62ea-4e9d-b0e2-9ab4031c35fa-operator-scripts\") pod \"nova-api-48b7-account-create-update-cf96q\" (UID: \"c14b974a-62ea-4e9d-b0e2-9ab4031c35fa\") " pod="openstack/nova-api-48b7-account-create-update-cf96q"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.095713 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ee21b51-8d3a-4942-b004-53d5337da918-operator-scripts\") pod \"nova-cell0-db-create-5kkvt\" (UID: \"7ee21b51-8d3a-4942-b004-53d5337da918\") " pod="openstack/nova-cell0-db-create-5kkvt"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.131170 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw7tc\" (UniqueName: \"kubernetes.io/projected/c14b974a-62ea-4e9d-b0e2-9ab4031c35fa-kube-api-access-dw7tc\") pod \"nova-api-48b7-account-create-update-cf96q\" (UID: \"c14b974a-62ea-4e9d-b0e2-9ab4031c35fa\") " pod="openstack/nova-api-48b7-account-create-update-cf96q"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.145404 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6hd8\" (UniqueName: \"kubernetes.io/projected/7ee21b51-8d3a-4942-b004-53d5337da918-kube-api-access-q6hd8\") pod \"nova-cell0-db-create-5kkvt\" (UID: \"7ee21b51-8d3a-4942-b004-53d5337da918\") " pod="openstack/nova-cell0-db-create-5kkvt"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.180218 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-33f5-account-create-update-dhmm4"]
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.180817 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5kkvt"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.181527 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-33f5-account-create-update-dhmm4"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.193818 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.196413 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgxwf\" (UniqueName: \"kubernetes.io/projected/39885f1c-db92-4f28-a10c-36998e6fcda8-kube-api-access-bgxwf\") pod \"nova-cell1-db-create-sv2s2\" (UID: \"39885f1c-db92-4f28-a10c-36998e6fcda8\") " pod="openstack/nova-cell1-db-create-sv2s2"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.196516 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39885f1c-db92-4f28-a10c-36998e6fcda8-operator-scripts\") pod \"nova-cell1-db-create-sv2s2\" (UID: \"39885f1c-db92-4f28-a10c-36998e6fcda8\") " pod="openstack/nova-cell1-db-create-sv2s2"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.197366 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39885f1c-db92-4f28-a10c-36998e6fcda8-operator-scripts\") pod \"nova-cell1-db-create-sv2s2\" (UID: \"39885f1c-db92-4f28-a10c-36998e6fcda8\") " pod="openstack/nova-cell1-db-create-sv2s2"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.202852 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-33f5-account-create-update-dhmm4"]
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.214500 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-48b7-account-create-update-cf96q"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.235741 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgxwf\" (UniqueName: \"kubernetes.io/projected/39885f1c-db92-4f28-a10c-36998e6fcda8-kube-api-access-bgxwf\") pod \"nova-cell1-db-create-sv2s2\" (UID: \"39885f1c-db92-4f28-a10c-36998e6fcda8\") " pod="openstack/nova-cell1-db-create-sv2s2"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.301404 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-sv2s2"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.302925 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhckd\" (UniqueName: \"kubernetes.io/projected/0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40-kube-api-access-nhckd\") pod \"nova-cell0-33f5-account-create-update-dhmm4\" (UID: \"0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40\") " pod="openstack/nova-cell0-33f5-account-create-update-dhmm4"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.303193 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40-operator-scripts\") pod \"nova-cell0-33f5-account-create-update-dhmm4\" (UID: \"0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40\") " pod="openstack/nova-cell0-33f5-account-create-update-dhmm4"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.375396 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-a003-account-create-update-8kxhx"]
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.377235 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a003-account-create-update-8kxhx"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.379533 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.399171 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a003-account-create-update-8kxhx"]
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.406954 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhckd\" (UniqueName: \"kubernetes.io/projected/0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40-kube-api-access-nhckd\") pod \"nova-cell0-33f5-account-create-update-dhmm4\" (UID: \"0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40\") " pod="openstack/nova-cell0-33f5-account-create-update-dhmm4"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.407115 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40-operator-scripts\") pod \"nova-cell0-33f5-account-create-update-dhmm4\" (UID: \"0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40\") " pod="openstack/nova-cell0-33f5-account-create-update-dhmm4"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.408911 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40-operator-scripts\") pod \"nova-cell0-33f5-account-create-update-dhmm4\" (UID: \"0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40\") " pod="openstack/nova-cell0-33f5-account-create-update-dhmm4"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.431393 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhckd\" (UniqueName: \"kubernetes.io/projected/0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40-kube-api-access-nhckd\") pod \"nova-cell0-33f5-account-create-update-dhmm4\" (UID: \"0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40\") " pod="openstack/nova-cell0-33f5-account-create-update-dhmm4"
Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.508726 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-57tw9\" (UniqueName: \"kubernetes.io/projected/8b42251a-f100-4233-91fa-be148b7c1665-kube-api-access-57tw9\") pod \"nova-cell1-a003-account-create-update-8kxhx\" (UID: \"8b42251a-f100-4233-91fa-be148b7c1665\") " pod="openstack/nova-cell1-a003-account-create-update-8kxhx" Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.509217 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b42251a-f100-4233-91fa-be148b7c1665-operator-scripts\") pod \"nova-cell1-a003-account-create-update-8kxhx\" (UID: \"8b42251a-f100-4233-91fa-be148b7c1665\") " pod="openstack/nova-cell1-a003-account-create-update-8kxhx" Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.610561 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b42251a-f100-4233-91fa-be148b7c1665-operator-scripts\") pod \"nova-cell1-a003-account-create-update-8kxhx\" (UID: \"8b42251a-f100-4233-91fa-be148b7c1665\") " pod="openstack/nova-cell1-a003-account-create-update-8kxhx" Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.610681 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57tw9\" (UniqueName: \"kubernetes.io/projected/8b42251a-f100-4233-91fa-be148b7c1665-kube-api-access-57tw9\") pod \"nova-cell1-a003-account-create-update-8kxhx\" (UID: \"8b42251a-f100-4233-91fa-be148b7c1665\") " pod="openstack/nova-cell1-a003-account-create-update-8kxhx" Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.612090 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b42251a-f100-4233-91fa-be148b7c1665-operator-scripts\") pod \"nova-cell1-a003-account-create-update-8kxhx\" (UID: \"8b42251a-f100-4233-91fa-be148b7c1665\") " pod="openstack/nova-cell1-a003-account-create-update-8kxhx" Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.628268 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-33f5-account-create-update-dhmm4" Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.643424 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57tw9\" (UniqueName: \"kubernetes.io/projected/8b42251a-f100-4233-91fa-be148b7c1665-kube-api-access-57tw9\") pod \"nova-cell1-a003-account-create-update-8kxhx\" (UID: \"8b42251a-f100-4233-91fa-be148b7c1665\") " pod="openstack/nova-cell1-a003-account-create-update-8kxhx" Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.747297 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-a003-account-create-update-8kxhx" Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.792830 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6gvhk"] Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.814528 4747 generic.go:334] "Generic (PLEG): container finished" podID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerID="8b7ddb6aef7dd916eded55b5858f33be9335403e3bf15a16314b08a7fd84918c" exitCode=1 Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.814632 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86843270-f9ed-4e4f-97a5-d6a433f7d5fb","Type":"ContainerDied","Data":"8b7ddb6aef7dd916eded55b5858f33be9335403e3bf15a16314b08a7fd84918c"} Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.814775 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerName="ceilometer-central-agent" containerID="cri-o://f0f23a162a0a12496d2428ad511ccad3929d0be06d0ab35d62d1db5addc99899" gracePeriod=30 Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.814935 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerName="sg-core" containerID="cri-o://005d6fd2b48fb212ab713eb292c2d2a209580c88b0dd8e57c1e79a0ccef00b76" gracePeriod=30 Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.815017 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerName="ceilometer-notification-agent" containerID="cri-o://c8a85e63aaac5a2e0b1cd561359ab8b026136d11e69a5e4ff83eb273fadd4159" gracePeriod=30 Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.900448 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5kkvt"] Dec 05 21:02:26 crc kubenswrapper[4747]: I1205 21:02:26.979090 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-48b7-account-create-update-cf96q"] Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.013754 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-sv2s2"] Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.179362 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-33f5-account-create-update-dhmm4"] Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.262859 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a003-account-create-update-8kxhx"] Dec 05 21:02:27 crc kubenswrapper[4747]: W1205 21:02:27.270068 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b42251a_f100_4233_91fa_be148b7c1665.slice/crio-ee5234dfd438a28cb535e70225f7b7cf117ba4e4b6db0519b00fba44440016f0 WatchSource:0}: Error finding container ee5234dfd438a28cb535e70225f7b7cf117ba4e4b6db0519b00fba44440016f0: Status 404 returned error can't find the container with id ee5234dfd438a28cb535e70225f7b7cf117ba4e4b6db0519b00fba44440016f0 Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.739276 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.739570 4747 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/glance-default-external-api-0" podUID="33aa1714-b696-47d8-99b9-60429eea3dec" containerName="glance-log" containerID="cri-o://dbd2de635701fa4c450abf7351bc8495845da2c8c64d8a5519675791e5da279a" gracePeriod=30 Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.739762 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="33aa1714-b696-47d8-99b9-60429eea3dec" containerName="glance-httpd" containerID="cri-o://6614e83c8dde1b5b152c507e4d21e58c0577b0a8e617f8e329992d641316a58b" gracePeriod=30 Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.761099 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-77d8776c8f-zkk2b" Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.781480 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-77d8776c8f-zkk2b" Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.837763 4747 generic.go:334] "Generic (PLEG): container finished" podID="7ee21b51-8d3a-4942-b004-53d5337da918" containerID="021b4129bc60dba22de412a95d497fa278c1cabcb4adaba919eabf389ed7bba3" exitCode=0 Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.837839 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5kkvt" event={"ID":"7ee21b51-8d3a-4942-b004-53d5337da918","Type":"ContainerDied","Data":"021b4129bc60dba22de412a95d497fa278c1cabcb4adaba919eabf389ed7bba3"} Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.837864 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5kkvt" event={"ID":"7ee21b51-8d3a-4942-b004-53d5337da918","Type":"ContainerStarted","Data":"cb042ebcf754209be8d4a012ab79f1ffa73ae282b24edab1d91f893e9940fa41"} Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.878838 4747 generic.go:334] "Generic (PLEG): container finished" podID="362c3d2b-5405-46e3-aa99-90f7b010dfd3" containerID="4a0a187e3515674049916fea84c37dc12953c3a256a7c6a965d519c2997ec515" exitCode=0 Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.882818 4747 generic.go:334] "Generic (PLEG): container finished" podID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerID="005d6fd2b48fb212ab713eb292c2d2a209580c88b0dd8e57c1e79a0ccef00b76" exitCode=2 Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.882886 4747 generic.go:334] "Generic (PLEG): container finished" podID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerID="c8a85e63aaac5a2e0b1cd561359ab8b026136d11e69a5e4ff83eb273fadd4159" exitCode=0 Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.882939 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sv2s2" event={"ID":"39885f1c-db92-4f28-a10c-36998e6fcda8","Type":"ContainerStarted","Data":"2ff2feb07b826f2c716adfd6ef299090c2fb91ca6c9b21907acafcff5307d232"} Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.882967 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sv2s2" event={"ID":"39885f1c-db92-4f28-a10c-36998e6fcda8","Type":"ContainerStarted","Data":"95042c55a9b1eb00f136f47a075dcaf2f43ff993cacaa793dd2459b5075d25fa"} Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.882993 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a003-account-create-update-8kxhx" 
event={"ID":"8b42251a-f100-4233-91fa-be148b7c1665","Type":"ContainerStarted","Data":"9ed4c66fc18065842866bdc3c5c5f0593f188b8cf62436b51215923e39dceb01"} Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.883005 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a003-account-create-update-8kxhx" event={"ID":"8b42251a-f100-4233-91fa-be148b7c1665","Type":"ContainerStarted","Data":"ee5234dfd438a28cb535e70225f7b7cf117ba4e4b6db0519b00fba44440016f0"} Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.883014 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6gvhk" event={"ID":"362c3d2b-5405-46e3-aa99-90f7b010dfd3","Type":"ContainerDied","Data":"4a0a187e3515674049916fea84c37dc12953c3a256a7c6a965d519c2997ec515"} Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.883025 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6gvhk" event={"ID":"362c3d2b-5405-46e3-aa99-90f7b010dfd3","Type":"ContainerStarted","Data":"66e5d02f9d4277fb8af0c7d1f8908b1ade6689be4c13acb3a3f77894d3dc48e6"} Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.883034 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86843270-f9ed-4e4f-97a5-d6a433f7d5fb","Type":"ContainerDied","Data":"005d6fd2b48fb212ab713eb292c2d2a209580c88b0dd8e57c1e79a0ccef00b76"} Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.883045 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86843270-f9ed-4e4f-97a5-d6a433f7d5fb","Type":"ContainerDied","Data":"c8a85e63aaac5a2e0b1cd561359ab8b026136d11e69a5e4ff83eb273fadd4159"} Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.887447 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-33f5-account-create-update-dhmm4" event={"ID":"0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40","Type":"ContainerStarted","Data":"90e5436f1599ff19ca17733065aa9fa10230ec3903a4c6e14e428cc0e5d2978a"} Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.887486 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-33f5-account-create-update-dhmm4" event={"ID":"0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40","Type":"ContainerStarted","Data":"ef9b8ea90a73a554fd889ea882c6a3234d80cfc8e6e442b5954a811d4202de72"} Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.895113 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-sv2s2" podStartSLOduration=2.895098544 podStartE2EDuration="2.895098544s" podCreationTimestamp="2025-12-05 21:02:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:02:27.878646681 +0000 UTC m=+1218.345954179" watchObservedRunningTime="2025-12-05 21:02:27.895098544 +0000 UTC m=+1218.362406032" Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.899741 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-48b7-account-create-update-cf96q" event={"ID":"c14b974a-62ea-4e9d-b0e2-9ab4031c35fa","Type":"ContainerStarted","Data":"1e4c9c1343b7df17852b0ace458ea0a391b18fabd8537189c0bbb26e329e1df8"} Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.899888 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-48b7-account-create-update-cf96q" 
event={"ID":"c14b974a-62ea-4e9d-b0e2-9ab4031c35fa","Type":"ContainerStarted","Data":"842ba65c6212350cf2cadac760592a26f7d510d8e7cb74c07feaef48cd41174b"} Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.933415 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-48b7-account-create-update-cf96q" podStartSLOduration=2.9333963450000002 podStartE2EDuration="2.933396345s" podCreationTimestamp="2025-12-05 21:02:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:02:27.929541568 +0000 UTC m=+1218.396849056" watchObservedRunningTime="2025-12-05 21:02:27.933396345 +0000 UTC m=+1218.400703833" Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.933502 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-a003-account-create-update-8kxhx" podStartSLOduration=1.933498017 podStartE2EDuration="1.933498017s" podCreationTimestamp="2025-12-05 21:02:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:02:27.908791108 +0000 UTC m=+1218.376098596" watchObservedRunningTime="2025-12-05 21:02:27.933498017 +0000 UTC m=+1218.400805505" Dec 05 21:02:27 crc kubenswrapper[4747]: I1205 21:02:27.976203 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-33f5-account-create-update-dhmm4" podStartSLOduration=1.976186208 podStartE2EDuration="1.976186208s" podCreationTimestamp="2025-12-05 21:02:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:02:27.96945988 +0000 UTC m=+1218.436767368" watchObservedRunningTime="2025-12-05 21:02:27.976186208 +0000 UTC m=+1218.443493696" Dec 05 21:02:27 crc kubenswrapper[4747]: E1205 21:02:27.979386 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33aa1714_b696_47d8_99b9_60429eea3dec.slice/crio-dbd2de635701fa4c450abf7351bc8495845da2c8c64d8a5519675791e5da279a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33aa1714_b696_47d8_99b9_60429eea3dec.slice/crio-conmon-dbd2de635701fa4c450abf7351bc8495845da2c8c64d8a5519675791e5da279a.scope\": RecentStats: unable to find data in memory cache]" Dec 05 21:02:28 crc kubenswrapper[4747]: I1205 21:02:28.692138 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:02:28 crc kubenswrapper[4747]: I1205 21:02:28.692594 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a8c1d2de-aaf5-4519-a4b3-481cdb81d657" containerName="glance-log" containerID="cri-o://e36ce53718babfbaadff85bd5ef1243205c808b63219a4bc8372e29dd1ab380b" gracePeriod=30 Dec 05 21:02:28 crc kubenswrapper[4747]: I1205 21:02:28.692723 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a8c1d2de-aaf5-4519-a4b3-481cdb81d657" containerName="glance-httpd" containerID="cri-o://7fb0031e94f15d7b0ab4d4136633b1face192cdaefa05d39733629c84dcc5f4c" gracePeriod=30 Dec 05 21:02:28 crc kubenswrapper[4747]: I1205 21:02:28.910312 4747 generic.go:334] "Generic 
(PLEG): container finished" podID="39885f1c-db92-4f28-a10c-36998e6fcda8" containerID="2ff2feb07b826f2c716adfd6ef299090c2fb91ca6c9b21907acafcff5307d232" exitCode=0 Dec 05 21:02:28 crc kubenswrapper[4747]: I1205 21:02:28.910470 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sv2s2" event={"ID":"39885f1c-db92-4f28-a10c-36998e6fcda8","Type":"ContainerDied","Data":"2ff2feb07b826f2c716adfd6ef299090c2fb91ca6c9b21907acafcff5307d232"} Dec 05 21:02:28 crc kubenswrapper[4747]: I1205 21:02:28.912771 4747 generic.go:334] "Generic (PLEG): container finished" podID="8b42251a-f100-4233-91fa-be148b7c1665" containerID="9ed4c66fc18065842866bdc3c5c5f0593f188b8cf62436b51215923e39dceb01" exitCode=0 Dec 05 21:02:28 crc kubenswrapper[4747]: I1205 21:02:28.912831 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a003-account-create-update-8kxhx" event={"ID":"8b42251a-f100-4233-91fa-be148b7c1665","Type":"ContainerDied","Data":"9ed4c66fc18065842866bdc3c5c5f0593f188b8cf62436b51215923e39dceb01"} Dec 05 21:02:28 crc kubenswrapper[4747]: I1205 21:02:28.915766 4747 generic.go:334] "Generic (PLEG): container finished" podID="a8c1d2de-aaf5-4519-a4b3-481cdb81d657" containerID="e36ce53718babfbaadff85bd5ef1243205c808b63219a4bc8372e29dd1ab380b" exitCode=143 Dec 05 21:02:28 crc kubenswrapper[4747]: I1205 21:02:28.915812 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8c1d2de-aaf5-4519-a4b3-481cdb81d657","Type":"ContainerDied","Data":"e36ce53718babfbaadff85bd5ef1243205c808b63219a4bc8372e29dd1ab380b"} Dec 05 21:02:28 crc kubenswrapper[4747]: I1205 21:02:28.917153 4747 generic.go:334] "Generic (PLEG): container finished" podID="0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40" containerID="90e5436f1599ff19ca17733065aa9fa10230ec3903a4c6e14e428cc0e5d2978a" exitCode=0 Dec 05 21:02:28 crc kubenswrapper[4747]: I1205 21:02:28.917217 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-33f5-account-create-update-dhmm4" event={"ID":"0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40","Type":"ContainerDied","Data":"90e5436f1599ff19ca17733065aa9fa10230ec3903a4c6e14e428cc0e5d2978a"} Dec 05 21:02:28 crc kubenswrapper[4747]: I1205 21:02:28.919269 4747 generic.go:334] "Generic (PLEG): container finished" podID="c14b974a-62ea-4e9d-b0e2-9ab4031c35fa" containerID="1e4c9c1343b7df17852b0ace458ea0a391b18fabd8537189c0bbb26e329e1df8" exitCode=0 Dec 05 21:02:28 crc kubenswrapper[4747]: I1205 21:02:28.919346 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-48b7-account-create-update-cf96q" event={"ID":"c14b974a-62ea-4e9d-b0e2-9ab4031c35fa","Type":"ContainerDied","Data":"1e4c9c1343b7df17852b0ace458ea0a391b18fabd8537189c0bbb26e329e1df8"} Dec 05 21:02:28 crc kubenswrapper[4747]: I1205 21:02:28.921346 4747 generic.go:334] "Generic (PLEG): container finished" podID="33aa1714-b696-47d8-99b9-60429eea3dec" containerID="dbd2de635701fa4c450abf7351bc8495845da2c8c64d8a5519675791e5da279a" exitCode=143 Dec 05 21:02:28 crc kubenswrapper[4747]: I1205 21:02:28.921416 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"33aa1714-b696-47d8-99b9-60429eea3dec","Type":"ContainerDied","Data":"dbd2de635701fa4c450abf7351bc8495845da2c8c64d8a5519675791e5da279a"} Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.393361 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6gvhk" Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.401066 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5kkvt" Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.469280 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ee21b51-8d3a-4942-b004-53d5337da918-operator-scripts\") pod \"7ee21b51-8d3a-4942-b004-53d5337da918\" (UID: \"7ee21b51-8d3a-4942-b004-53d5337da918\") " Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.469497 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nh2m\" (UniqueName: \"kubernetes.io/projected/362c3d2b-5405-46e3-aa99-90f7b010dfd3-kube-api-access-5nh2m\") pod \"362c3d2b-5405-46e3-aa99-90f7b010dfd3\" (UID: \"362c3d2b-5405-46e3-aa99-90f7b010dfd3\") " Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.469522 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6hd8\" (UniqueName: \"kubernetes.io/projected/7ee21b51-8d3a-4942-b004-53d5337da918-kube-api-access-q6hd8\") pod \"7ee21b51-8d3a-4942-b004-53d5337da918\" (UID: \"7ee21b51-8d3a-4942-b004-53d5337da918\") " Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.469609 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362c3d2b-5405-46e3-aa99-90f7b010dfd3-operator-scripts\") pod \"362c3d2b-5405-46e3-aa99-90f7b010dfd3\" (UID: \"362c3d2b-5405-46e3-aa99-90f7b010dfd3\") " Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.469981 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ee21b51-8d3a-4942-b004-53d5337da918-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ee21b51-8d3a-4942-b004-53d5337da918" (UID: "7ee21b51-8d3a-4942-b004-53d5337da918"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.470373 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ee21b51-8d3a-4942-b004-53d5337da918-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.470831 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/362c3d2b-5405-46e3-aa99-90f7b010dfd3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "362c3d2b-5405-46e3-aa99-90f7b010dfd3" (UID: "362c3d2b-5405-46e3-aa99-90f7b010dfd3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.477129 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/362c3d2b-5405-46e3-aa99-90f7b010dfd3-kube-api-access-5nh2m" (OuterVolumeSpecName: "kube-api-access-5nh2m") pod "362c3d2b-5405-46e3-aa99-90f7b010dfd3" (UID: "362c3d2b-5405-46e3-aa99-90f7b010dfd3"). InnerVolumeSpecName "kube-api-access-5nh2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.483852 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ee21b51-8d3a-4942-b004-53d5337da918-kube-api-access-q6hd8" (OuterVolumeSpecName: "kube-api-access-q6hd8") pod "7ee21b51-8d3a-4942-b004-53d5337da918" (UID: "7ee21b51-8d3a-4942-b004-53d5337da918"). InnerVolumeSpecName "kube-api-access-q6hd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.572453 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362c3d2b-5405-46e3-aa99-90f7b010dfd3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.572493 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nh2m\" (UniqueName: \"kubernetes.io/projected/362c3d2b-5405-46e3-aa99-90f7b010dfd3-kube-api-access-5nh2m\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.572511 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6hd8\" (UniqueName: \"kubernetes.io/projected/7ee21b51-8d3a-4942-b004-53d5337da918-kube-api-access-q6hd8\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.954252 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5kkvt" Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.954246 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5kkvt" event={"ID":"7ee21b51-8d3a-4942-b004-53d5337da918","Type":"ContainerDied","Data":"cb042ebcf754209be8d4a012ab79f1ffa73ae282b24edab1d91f893e9940fa41"} Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.954708 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb042ebcf754209be8d4a012ab79f1ffa73ae282b24edab1d91f893e9940fa41" Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.956738 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6gvhk" event={"ID":"362c3d2b-5405-46e3-aa99-90f7b010dfd3","Type":"ContainerDied","Data":"66e5d02f9d4277fb8af0c7d1f8908b1ade6689be4c13acb3a3f77894d3dc48e6"} Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.956770 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6gvhk" Dec 05 21:02:29 crc kubenswrapper[4747]: I1205 21:02:29.956781 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66e5d02f9d4277fb8af0c7d1f8908b1ade6689be4c13acb3a3f77894d3dc48e6" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.406240 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-33f5-account-create-update-dhmm4" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.482880 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.491139 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhckd\" (UniqueName: \"kubernetes.io/projected/0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40-kube-api-access-nhckd\") pod \"0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40\" (UID: \"0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40\") " Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.491222 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40-operator-scripts\") pod \"0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40\" (UID: \"0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40\") " Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.492276 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40" (UID: "0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.506789 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40-kube-api-access-nhckd" (OuterVolumeSpecName: "kube-api-access-nhckd") pod "0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40" (UID: "0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40"). InnerVolumeSpecName "kube-api-access-nhckd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.593280 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhckd\" (UniqueName: \"kubernetes.io/projected/0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40-kube-api-access-nhckd\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.593535 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.610017 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-sv2s2" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.615858 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a003-account-create-update-8kxhx" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.622596 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-48b7-account-create-update-cf96q" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.695237 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw7tc\" (UniqueName: \"kubernetes.io/projected/c14b974a-62ea-4e9d-b0e2-9ab4031c35fa-kube-api-access-dw7tc\") pod \"c14b974a-62ea-4e9d-b0e2-9ab4031c35fa\" (UID: \"c14b974a-62ea-4e9d-b0e2-9ab4031c35fa\") " Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.695409 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57tw9\" (UniqueName: \"kubernetes.io/projected/8b42251a-f100-4233-91fa-be148b7c1665-kube-api-access-57tw9\") pod \"8b42251a-f100-4233-91fa-be148b7c1665\" (UID: \"8b42251a-f100-4233-91fa-be148b7c1665\") " Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.695469 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14b974a-62ea-4e9d-b0e2-9ab4031c35fa-operator-scripts\") pod \"c14b974a-62ea-4e9d-b0e2-9ab4031c35fa\" (UID: \"c14b974a-62ea-4e9d-b0e2-9ab4031c35fa\") " Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.695512 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b42251a-f100-4233-91fa-be148b7c1665-operator-scripts\") pod \"8b42251a-f100-4233-91fa-be148b7c1665\" (UID: \"8b42251a-f100-4233-91fa-be148b7c1665\") " Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.695621 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgxwf\" (UniqueName: \"kubernetes.io/projected/39885f1c-db92-4f28-a10c-36998e6fcda8-kube-api-access-bgxwf\") pod \"39885f1c-db92-4f28-a10c-36998e6fcda8\" (UID: \"39885f1c-db92-4f28-a10c-36998e6fcda8\") " Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.695664 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39885f1c-db92-4f28-a10c-36998e6fcda8-operator-scripts\") pod \"39885f1c-db92-4f28-a10c-36998e6fcda8\" (UID: \"39885f1c-db92-4f28-a10c-36998e6fcda8\") " Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.695967 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c14b974a-62ea-4e9d-b0e2-9ab4031c35fa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c14b974a-62ea-4e9d-b0e2-9ab4031c35fa" (UID: "c14b974a-62ea-4e9d-b0e2-9ab4031c35fa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.696169 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14b974a-62ea-4e9d-b0e2-9ab4031c35fa-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.696440 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b42251a-f100-4233-91fa-be148b7c1665-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b42251a-f100-4233-91fa-be148b7c1665" (UID: "8b42251a-f100-4233-91fa-be148b7c1665"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.696596 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39885f1c-db92-4f28-a10c-36998e6fcda8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39885f1c-db92-4f28-a10c-36998e6fcda8" (UID: "39885f1c-db92-4f28-a10c-36998e6fcda8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.708613 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b42251a-f100-4233-91fa-be148b7c1665-kube-api-access-57tw9" (OuterVolumeSpecName: "kube-api-access-57tw9") pod "8b42251a-f100-4233-91fa-be148b7c1665" (UID: "8b42251a-f100-4233-91fa-be148b7c1665"). InnerVolumeSpecName "kube-api-access-57tw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.709880 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39885f1c-db92-4f28-a10c-36998e6fcda8-kube-api-access-bgxwf" (OuterVolumeSpecName: "kube-api-access-bgxwf") pod "39885f1c-db92-4f28-a10c-36998e6fcda8" (UID: "39885f1c-db92-4f28-a10c-36998e6fcda8"). InnerVolumeSpecName "kube-api-access-bgxwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.710029 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c14b974a-62ea-4e9d-b0e2-9ab4031c35fa-kube-api-access-dw7tc" (OuterVolumeSpecName: "kube-api-access-dw7tc") pod "c14b974a-62ea-4e9d-b0e2-9ab4031c35fa" (UID: "c14b974a-62ea-4e9d-b0e2-9ab4031c35fa"). InnerVolumeSpecName "kube-api-access-dw7tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.798165 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw7tc\" (UniqueName: \"kubernetes.io/projected/c14b974a-62ea-4e9d-b0e2-9ab4031c35fa-kube-api-access-dw7tc\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.798197 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57tw9\" (UniqueName: \"kubernetes.io/projected/8b42251a-f100-4233-91fa-be148b7c1665-kube-api-access-57tw9\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.798206 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b42251a-f100-4233-91fa-be148b7c1665-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.798217 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgxwf\" (UniqueName: \"kubernetes.io/projected/39885f1c-db92-4f28-a10c-36998e6fcda8-kube-api-access-bgxwf\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.798226 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39885f1c-db92-4f28-a10c-36998e6fcda8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.981366 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-sv2s2" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.981384 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sv2s2" event={"ID":"39885f1c-db92-4f28-a10c-36998e6fcda8","Type":"ContainerDied","Data":"95042c55a9b1eb00f136f47a075dcaf2f43ff993cacaa793dd2459b5075d25fa"} Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.981771 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95042c55a9b1eb00f136f47a075dcaf2f43ff993cacaa793dd2459b5075d25fa" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.983887 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a003-account-create-update-8kxhx" event={"ID":"8b42251a-f100-4233-91fa-be148b7c1665","Type":"ContainerDied","Data":"ee5234dfd438a28cb535e70225f7b7cf117ba4e4b6db0519b00fba44440016f0"} Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.984050 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee5234dfd438a28cb535e70225f7b7cf117ba4e4b6db0519b00fba44440016f0" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.983991 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a003-account-create-update-8kxhx" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.985632 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-33f5-account-create-update-dhmm4" event={"ID":"0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40","Type":"ContainerDied","Data":"ef9b8ea90a73a554fd889ea882c6a3234d80cfc8e6e442b5954a811d4202de72"} Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.985685 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef9b8ea90a73a554fd889ea882c6a3234d80cfc8e6e442b5954a811d4202de72" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.985899 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-33f5-account-create-update-dhmm4" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.992235 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-48b7-account-create-update-cf96q" event={"ID":"c14b974a-62ea-4e9d-b0e2-9ab4031c35fa","Type":"ContainerDied","Data":"842ba65c6212350cf2cadac760592a26f7d510d8e7cb74c07feaef48cd41174b"} Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.992442 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="842ba65c6212350cf2cadac760592a26f7d510d8e7cb74c07feaef48cd41174b" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.992610 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-48b7-account-create-update-cf96q" Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.994713 4747 generic.go:334] "Generic (PLEG): container finished" podID="33aa1714-b696-47d8-99b9-60429eea3dec" containerID="6614e83c8dde1b5b152c507e4d21e58c0577b0a8e617f8e329992d641316a58b" exitCode=0 Dec 05 21:02:30 crc kubenswrapper[4747]: I1205 21:02:30.994746 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"33aa1714-b696-47d8-99b9-60429eea3dec","Type":"ContainerDied","Data":"6614e83c8dde1b5b152c507e4d21e58c0577b0a8e617f8e329992d641316a58b"} Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.459955 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.525993 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33aa1714-b696-47d8-99b9-60429eea3dec-httpd-run\") pod \"33aa1714-b696-47d8-99b9-60429eea3dec\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.526119 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"33aa1714-b696-47d8-99b9-60429eea3dec\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.526155 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33aa1714-b696-47d8-99b9-60429eea3dec-logs\") pod \"33aa1714-b696-47d8-99b9-60429eea3dec\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.526193 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-config-data\") pod \"33aa1714-b696-47d8-99b9-60429eea3dec\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.526257 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-combined-ca-bundle\") pod \"33aa1714-b696-47d8-99b9-60429eea3dec\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.526330 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-public-tls-certs\") pod \"33aa1714-b696-47d8-99b9-60429eea3dec\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.526384 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-scripts\") pod \"33aa1714-b696-47d8-99b9-60429eea3dec\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.526409 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5btx\" (UniqueName: \"kubernetes.io/projected/33aa1714-b696-47d8-99b9-60429eea3dec-kube-api-access-f5btx\") pod \"33aa1714-b696-47d8-99b9-60429eea3dec\" (UID: \"33aa1714-b696-47d8-99b9-60429eea3dec\") " Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.526655 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33aa1714-b696-47d8-99b9-60429eea3dec-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "33aa1714-b696-47d8-99b9-60429eea3dec" (UID: "33aa1714-b696-47d8-99b9-60429eea3dec"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.526989 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33aa1714-b696-47d8-99b9-60429eea3dec-logs" (OuterVolumeSpecName: "logs") pod "33aa1714-b696-47d8-99b9-60429eea3dec" (UID: "33aa1714-b696-47d8-99b9-60429eea3dec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.527250 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33aa1714-b696-47d8-99b9-60429eea3dec-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.527267 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33aa1714-b696-47d8-99b9-60429eea3dec-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.536837 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "33aa1714-b696-47d8-99b9-60429eea3dec" (UID: "33aa1714-b696-47d8-99b9-60429eea3dec"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.539532 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-scripts" (OuterVolumeSpecName: "scripts") pod "33aa1714-b696-47d8-99b9-60429eea3dec" (UID: "33aa1714-b696-47d8-99b9-60429eea3dec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.547790 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33aa1714-b696-47d8-99b9-60429eea3dec-kube-api-access-f5btx" (OuterVolumeSpecName: "kube-api-access-f5btx") pod "33aa1714-b696-47d8-99b9-60429eea3dec" (UID: "33aa1714-b696-47d8-99b9-60429eea3dec"). InnerVolumeSpecName "kube-api-access-f5btx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.593953 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "33aa1714-b696-47d8-99b9-60429eea3dec" (UID: "33aa1714-b696-47d8-99b9-60429eea3dec"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.600163 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33aa1714-b696-47d8-99b9-60429eea3dec" (UID: "33aa1714-b696-47d8-99b9-60429eea3dec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.628727 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.628754 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.628764 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5btx\" (UniqueName: \"kubernetes.io/projected/33aa1714-b696-47d8-99b9-60429eea3dec-kube-api-access-f5btx\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.628789 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.628798 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.653073 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.672743 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-config-data" (OuterVolumeSpecName: "config-data") pod "33aa1714-b696-47d8-99b9-60429eea3dec" (UID: "33aa1714-b696-47d8-99b9-60429eea3dec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.730031 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33aa1714-b696-47d8-99b9-60429eea3dec-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:31 crc kubenswrapper[4747]: I1205 21:02:31.730265 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.004315 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"33aa1714-b696-47d8-99b9-60429eea3dec","Type":"ContainerDied","Data":"9a5a1188d5e43c272ab75bc5698b72f8920a3117869d7317ef6a360745a4d24f"} Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.004347 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.004369 4747 scope.go:117] "RemoveContainer" containerID="6614e83c8dde1b5b152c507e4d21e58c0577b0a8e617f8e329992d641316a58b" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.014258 4747 generic.go:334] "Generic (PLEG): container finished" podID="a8c1d2de-aaf5-4519-a4b3-481cdb81d657" containerID="7fb0031e94f15d7b0ab4d4136633b1face192cdaefa05d39733629c84dcc5f4c" exitCode=0 Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.014292 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8c1d2de-aaf5-4519-a4b3-481cdb81d657","Type":"ContainerDied","Data":"7fb0031e94f15d7b0ab4d4136633b1face192cdaefa05d39733629c84dcc5f4c"} Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.028684 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.036035 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.036800 4747 scope.go:117] "RemoveContainer" containerID="dbd2de635701fa4c450abf7351bc8495845da2c8c64d8a5519675791e5da279a" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060157 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:02:32 crc kubenswrapper[4747]: E1205 21:02:32.060515 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b42251a-f100-4233-91fa-be148b7c1665" containerName="mariadb-account-create-update" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060531 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b42251a-f100-4233-91fa-be148b7c1665" containerName="mariadb-account-create-update" Dec 05 21:02:32 crc kubenswrapper[4747]: E1205 21:02:32.060540 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40" containerName="mariadb-account-create-update" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060547 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40" containerName="mariadb-account-create-update" Dec 05 21:02:32 crc kubenswrapper[4747]: E1205 21:02:32.060560 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee21b51-8d3a-4942-b004-53d5337da918" containerName="mariadb-database-create" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060566 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee21b51-8d3a-4942-b004-53d5337da918" containerName="mariadb-database-create" Dec 05 21:02:32 crc kubenswrapper[4747]: E1205 21:02:32.060588 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39885f1c-db92-4f28-a10c-36998e6fcda8" containerName="mariadb-database-create" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060595 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="39885f1c-db92-4f28-a10c-36998e6fcda8" containerName="mariadb-database-create" Dec 05 21:02:32 crc kubenswrapper[4747]: E1205 21:02:32.060613 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14b974a-62ea-4e9d-b0e2-9ab4031c35fa" containerName="mariadb-account-create-update" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060619 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14b974a-62ea-4e9d-b0e2-9ab4031c35fa" 
containerName="mariadb-account-create-update" Dec 05 21:02:32 crc kubenswrapper[4747]: E1205 21:02:32.060631 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362c3d2b-5405-46e3-aa99-90f7b010dfd3" containerName="mariadb-database-create" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060636 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="362c3d2b-5405-46e3-aa99-90f7b010dfd3" containerName="mariadb-database-create" Dec 05 21:02:32 crc kubenswrapper[4747]: E1205 21:02:32.060651 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33aa1714-b696-47d8-99b9-60429eea3dec" containerName="glance-httpd" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060657 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="33aa1714-b696-47d8-99b9-60429eea3dec" containerName="glance-httpd" Dec 05 21:02:32 crc kubenswrapper[4747]: E1205 21:02:32.060665 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33aa1714-b696-47d8-99b9-60429eea3dec" containerName="glance-log" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060671 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="33aa1714-b696-47d8-99b9-60429eea3dec" containerName="glance-log" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060845 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ee21b51-8d3a-4942-b004-53d5337da918" containerName="mariadb-database-create" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060856 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b42251a-f100-4233-91fa-be148b7c1665" containerName="mariadb-account-create-update" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060866 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40" containerName="mariadb-account-create-update" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060875 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="39885f1c-db92-4f28-a10c-36998e6fcda8" containerName="mariadb-database-create" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060887 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c14b974a-62ea-4e9d-b0e2-9ab4031c35fa" containerName="mariadb-account-create-update" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060898 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="362c3d2b-5405-46e3-aa99-90f7b010dfd3" containerName="mariadb-database-create" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060915 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="33aa1714-b696-47d8-99b9-60429eea3dec" containerName="glance-log" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.060924 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="33aa1714-b696-47d8-99b9-60429eea3dec" containerName="glance-httpd" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.061826 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.064111 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.064111 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.124402 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.135795 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.135881 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac080b76-32d3-498c-9832-d31494c1d21f-logs\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.135920 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6c8w\" (UniqueName: \"kubernetes.io/projected/ac080b76-32d3-498c-9832-d31494c1d21f-kube-api-access-f6c8w\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.135945 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac080b76-32d3-498c-9832-d31494c1d21f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.135988 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.136008 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.136030 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.136058 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.238334 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac080b76-32d3-498c-9832-d31494c1d21f-logs\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.238445 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6c8w\" (UniqueName: \"kubernetes.io/projected/ac080b76-32d3-498c-9832-d31494c1d21f-kube-api-access-f6c8w\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.238477 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac080b76-32d3-498c-9832-d31494c1d21f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.238526 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.238543 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.238565 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.238614 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.238637 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.239379 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.243349 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.244059 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac080b76-32d3-498c-9832-d31494c1d21f-logs\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.244258 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac080b76-32d3-498c-9832-d31494c1d21f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.258200 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.264305 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.266568 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6c8w\" (UniqueName: \"kubernetes.io/projected/ac080b76-32d3-498c-9832-d31494c1d21f-kube-api-access-f6c8w\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.293716 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.297073 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.385143 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.548837 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.686143 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-combined-ca-bundle\") pod \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.686249 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.686356 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-logs\") pod \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.686383 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-internal-tls-certs\") pod \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.686435 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-config-data\") pod \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.686506 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-httpd-run\") pod \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.686534 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x8mr\" (UniqueName: \"kubernetes.io/projected/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-kube-api-access-4x8mr\") pod \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.686605 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-scripts\") pod \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\" (UID: \"a8c1d2de-aaf5-4519-a4b3-481cdb81d657\") " Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.690531 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-logs" (OuterVolumeSpecName: "logs") pod "a8c1d2de-aaf5-4519-a4b3-481cdb81d657" (UID: "a8c1d2de-aaf5-4519-a4b3-481cdb81d657"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.690692 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a8c1d2de-aaf5-4519-a4b3-481cdb81d657" (UID: "a8c1d2de-aaf5-4519-a4b3-481cdb81d657"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.697778 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "a8c1d2de-aaf5-4519-a4b3-481cdb81d657" (UID: "a8c1d2de-aaf5-4519-a4b3-481cdb81d657"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.701308 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-scripts" (OuterVolumeSpecName: "scripts") pod "a8c1d2de-aaf5-4519-a4b3-481cdb81d657" (UID: "a8c1d2de-aaf5-4519-a4b3-481cdb81d657"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.721465 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-kube-api-access-4x8mr" (OuterVolumeSpecName: "kube-api-access-4x8mr") pod "a8c1d2de-aaf5-4519-a4b3-481cdb81d657" (UID: "a8c1d2de-aaf5-4519-a4b3-481cdb81d657"). InnerVolumeSpecName "kube-api-access-4x8mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.729801 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8c1d2de-aaf5-4519-a4b3-481cdb81d657" (UID: "a8c1d2de-aaf5-4519-a4b3-481cdb81d657"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.788215 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.788514 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.788523 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.788534 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x8mr\" (UniqueName: \"kubernetes.io/projected/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-kube-api-access-4x8mr\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.788545 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.788554 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.789810 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-config-data" (OuterVolumeSpecName: "config-data") pod "a8c1d2de-aaf5-4519-a4b3-481cdb81d657" (UID: "a8c1d2de-aaf5-4519-a4b3-481cdb81d657"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.804732 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a8c1d2de-aaf5-4519-a4b3-481cdb81d657" (UID: "a8c1d2de-aaf5-4519-a4b3-481cdb81d657"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.810756 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.889619 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.889648 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.889659 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c1d2de-aaf5-4519-a4b3-481cdb81d657-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:32 crc kubenswrapper[4747]: I1205 21:02:32.949446 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:02:32 crc kubenswrapper[4747]: W1205 21:02:32.953280 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac080b76_32d3_498c_9832_d31494c1d21f.slice/crio-a854187f0d7db09bf0941f51b4103aa49e8faef8fa46e11ee0262bf1e1319006 WatchSource:0}: Error finding container a854187f0d7db09bf0941f51b4103aa49e8faef8fa46e11ee0262bf1e1319006: Status 404 returned error can't find the container with id a854187f0d7db09bf0941f51b4103aa49e8faef8fa46e11ee0262bf1e1319006 Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.029252 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8c1d2de-aaf5-4519-a4b3-481cdb81d657","Type":"ContainerDied","Data":"cf85c72eb41757b6fcde0d1faf5b7440eb7b79c0ecfdd77d182427a7f1cafd3c"} Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.029296 4747 scope.go:117] "RemoveContainer" containerID="7fb0031e94f15d7b0ab4d4136633b1face192cdaefa05d39733629c84dcc5f4c" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.029311 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.031297 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac080b76-32d3-498c-9832-d31494c1d21f","Type":"ContainerStarted","Data":"a854187f0d7db09bf0941f51b4103aa49e8faef8fa46e11ee0262bf1e1319006"} Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.073246 4747 scope.go:117] "RemoveContainer" containerID="e36ce53718babfbaadff85bd5ef1243205c808b63219a4bc8372e29dd1ab380b" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.081714 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.097978 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.111164 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:02:33 crc kubenswrapper[4747]: E1205 21:02:33.114674 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c1d2de-aaf5-4519-a4b3-481cdb81d657" containerName="glance-log" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.114710 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c1d2de-aaf5-4519-a4b3-481cdb81d657" containerName="glance-log" Dec 05 21:02:33 crc kubenswrapper[4747]: E1205 21:02:33.114754 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c1d2de-aaf5-4519-a4b3-481cdb81d657" containerName="glance-httpd" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.114763 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c1d2de-aaf5-4519-a4b3-481cdb81d657" containerName="glance-httpd" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.115281 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c1d2de-aaf5-4519-a4b3-481cdb81d657" containerName="glance-httpd" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.115318 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c1d2de-aaf5-4519-a4b3-481cdb81d657" containerName="glance-log" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.116299 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.118417 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.119472 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.119714 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.198100 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.198148 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.198167 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.198238 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.198263 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-scripts\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.198282 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-config-data\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.198296 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-logs\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.198356 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-xqz5t\" (UniqueName: \"kubernetes.io/projected/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-kube-api-access-xqz5t\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.299530 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.299611 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-scripts\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.299642 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-config-data\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.299662 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-logs\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.299761 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqz5t\" (UniqueName: \"kubernetes.io/projected/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-kube-api-access-xqz5t\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.299814 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.299839 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.299860 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.300145 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.300298 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.300411 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-logs\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.305945 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-config-data\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.306196 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.306471 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-scripts\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.309880 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.327697 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqz5t\" (UniqueName: \"kubernetes.io/projected/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-kube-api-access-xqz5t\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.333721 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.443708 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.851345 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33aa1714-b696-47d8-99b9-60429eea3dec" path="/var/lib/kubelet/pods/33aa1714-b696-47d8-99b9-60429eea3dec/volumes" Dec 05 21:02:33 crc kubenswrapper[4747]: I1205 21:02:33.852870 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c1d2de-aaf5-4519-a4b3-481cdb81d657" path="/var/lib/kubelet/pods/a8c1d2de-aaf5-4519-a4b3-481cdb81d657/volumes" Dec 05 21:02:34 crc kubenswrapper[4747]: I1205 21:02:34.043117 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:02:34 crc kubenswrapper[4747]: W1205 21:02:34.053409 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33e4dab2_e2a7_4fe3_949a_6a31460e11ba.slice/crio-1d8195deffeb63525fad9efc5602c4d8f4f7af6ef7d284a70972270a3ff3071c WatchSource:0}: Error finding container 1d8195deffeb63525fad9efc5602c4d8f4f7af6ef7d284a70972270a3ff3071c: Status 404 returned error can't find the container with id 1d8195deffeb63525fad9efc5602c4d8f4f7af6ef7d284a70972270a3ff3071c Dec 05 21:02:34 crc kubenswrapper[4747]: I1205 21:02:34.061501 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac080b76-32d3-498c-9832-d31494c1d21f","Type":"ContainerStarted","Data":"541e9e17ac501f698eab7f861a294ddf49e04de6ec4df244e43f29b858e30226"} Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.087417 4747 generic.go:334] "Generic (PLEG): container finished" podID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerID="f0f23a162a0a12496d2428ad511ccad3929d0be06d0ab35d62d1db5addc99899" exitCode=0 Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.087951 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86843270-f9ed-4e4f-97a5-d6a433f7d5fb","Type":"ContainerDied","Data":"f0f23a162a0a12496d2428ad511ccad3929d0be06d0ab35d62d1db5addc99899"} Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.090310 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"33e4dab2-e2a7-4fe3-949a-6a31460e11ba","Type":"ContainerStarted","Data":"d7464171649f3f66dfa83f9f9dffbb9397e5a9c9592b2d221f844436a374209d"} Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.090362 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"33e4dab2-e2a7-4fe3-949a-6a31460e11ba","Type":"ContainerStarted","Data":"1d8195deffeb63525fad9efc5602c4d8f4f7af6ef7d284a70972270a3ff3071c"} Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.092168 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac080b76-32d3-498c-9832-d31494c1d21f","Type":"ContainerStarted","Data":"1bc4fa29702116af6fcd59bcbbca74418678e340a4d1a0b7cd112308e8bc0b4e"} Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.121613 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.121571008 podStartE2EDuration="3.121571008s" podCreationTimestamp="2025-12-05 21:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:02:35.115775182 
+0000 UTC m=+1225.583082670" watchObservedRunningTime="2025-12-05 21:02:35.121571008 +0000 UTC m=+1225.588878496" Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.281085 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.343246 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-log-httpd\") pod \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.343465 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-combined-ca-bundle\") pod \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.343508 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bncq7\" (UniqueName: \"kubernetes.io/projected/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-kube-api-access-bncq7\") pod \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.343534 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-config-data\") pod \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.343557 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-run-httpd\") pod \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.343622 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-sg-core-conf-yaml\") pod \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.343657 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-scripts\") pod \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\" (UID: \"86843270-f9ed-4e4f-97a5-d6a433f7d5fb\") " Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.344900 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "86843270-f9ed-4e4f-97a5-d6a433f7d5fb" (UID: "86843270-f9ed-4e4f-97a5-d6a433f7d5fb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.345115 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "86843270-f9ed-4e4f-97a5-d6a433f7d5fb" (UID: "86843270-f9ed-4e4f-97a5-d6a433f7d5fb"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.349828 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-scripts" (OuterVolumeSpecName: "scripts") pod "86843270-f9ed-4e4f-97a5-d6a433f7d5fb" (UID: "86843270-f9ed-4e4f-97a5-d6a433f7d5fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.349943 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-kube-api-access-bncq7" (OuterVolumeSpecName: "kube-api-access-bncq7") pod "86843270-f9ed-4e4f-97a5-d6a433f7d5fb" (UID: "86843270-f9ed-4e4f-97a5-d6a433f7d5fb"). InnerVolumeSpecName "kube-api-access-bncq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.394568 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "86843270-f9ed-4e4f-97a5-d6a433f7d5fb" (UID: "86843270-f9ed-4e4f-97a5-d6a433f7d5fb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.447906 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bncq7\" (UniqueName: \"kubernetes.io/projected/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-kube-api-access-bncq7\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.447946 4747 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.447962 4747 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.447973 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.447984 4747 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.486743 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86843270-f9ed-4e4f-97a5-d6a433f7d5fb" (UID: "86843270-f9ed-4e4f-97a5-d6a433f7d5fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.497190 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-config-data" (OuterVolumeSpecName: "config-data") pod "86843270-f9ed-4e4f-97a5-d6a433f7d5fb" (UID: "86843270-f9ed-4e4f-97a5-d6a433f7d5fb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.550129 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:35 crc kubenswrapper[4747]: I1205 21:02:35.550285 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86843270-f9ed-4e4f-97a5-d6a433f7d5fb-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.105164 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.105115 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86843270-f9ed-4e4f-97a5-d6a433f7d5fb","Type":"ContainerDied","Data":"b455c133b9429830c4ed31266e765e10ac3a258608e3bd78968436c09dba1d18"} Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.105370 4747 scope.go:117] "RemoveContainer" containerID="8b7ddb6aef7dd916eded55b5858f33be9335403e3bf15a16314b08a7fd84918c" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.111141 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"33e4dab2-e2a7-4fe3-949a-6a31460e11ba","Type":"ContainerStarted","Data":"90486f36ff75a740556644fedc7ad1ada74d9386b6373ae4001a5f28f3f94e44"} Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.183828 4747 scope.go:117] "RemoveContainer" containerID="005d6fd2b48fb212ab713eb292c2d2a209580c88b0dd8e57c1e79a0ccef00b76" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.229752 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.241968 4747 scope.go:117] "RemoveContainer" containerID="c8a85e63aaac5a2e0b1cd561359ab8b026136d11e69a5e4ff83eb273fadd4159" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.242067 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.250833 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:02:36 crc kubenswrapper[4747]: E1205 21:02:36.251156 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerName="ceilometer-notification-agent" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.251171 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerName="ceilometer-notification-agent" Dec 05 21:02:36 crc kubenswrapper[4747]: E1205 21:02:36.251205 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerName="sg-core" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.251211 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerName="sg-core" Dec 05 21:02:36 crc kubenswrapper[4747]: E1205 21:02:36.251231 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerName="ceilometer-central-agent" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.251237 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" 
containerName="ceilometer-central-agent" Dec 05 21:02:36 crc kubenswrapper[4747]: E1205 21:02:36.251287 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerName="proxy-httpd" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.251294 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerName="proxy-httpd" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.251464 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerName="proxy-httpd" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.251488 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerName="ceilometer-notification-agent" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.251508 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerName="ceilometer-central-agent" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.251522 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" containerName="sg-core" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.254306 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.256079 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.256239 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.256346 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.261094 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.261077455 podStartE2EDuration="3.261077455s" podCreationTimestamp="2025-12-05 21:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:02:36.254963762 +0000 UTC m=+1226.722271260" watchObservedRunningTime="2025-12-05 21:02:36.261077455 +0000 UTC m=+1226.728384943" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.271831 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-scripts\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.271888 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.271967 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-969s5\" (UniqueName: \"kubernetes.io/projected/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-kube-api-access-969s5\") pod \"ceilometer-0\" (UID: 
\"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.271990 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-config-data\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.272027 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-log-httpd\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.272080 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.272125 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-run-httpd\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.272166 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.274552 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.302835 4747 scope.go:117] "RemoveContainer" containerID="f0f23a162a0a12496d2428ad511ccad3929d0be06d0ab35d62d1db5addc99899" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.373997 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.374382 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-scripts\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.374418 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.374502 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-969s5\" (UniqueName: 
\"kubernetes.io/projected/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-kube-api-access-969s5\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.374526 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-config-data\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.374562 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-log-httpd\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.374633 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.374673 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-run-httpd\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.375181 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-run-httpd\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.377002 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-log-httpd\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.380010 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.380124 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.380696 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.380824 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-config-data\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.382359 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-scripts\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.389238 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-969s5\" (UniqueName: \"kubernetes.io/projected/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-kube-api-access-969s5\") pod \"ceilometer-0\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.561746 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qlx4b"] Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.563219 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.566818 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.567571 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.568889 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-fh4pd" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.571654 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.576492 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qlx4b"] Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.678716 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbmfz\" (UniqueName: \"kubernetes.io/projected/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-kube-api-access-xbmfz\") pod \"nova-cell0-conductor-db-sync-qlx4b\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.678769 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-config-data\") pod \"nova-cell0-conductor-db-sync-qlx4b\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.678795 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-scripts\") pod \"nova-cell0-conductor-db-sync-qlx4b\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.678844 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qlx4b\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.780248 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qlx4b\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.782101 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbmfz\" (UniqueName: \"kubernetes.io/projected/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-kube-api-access-xbmfz\") pod \"nova-cell0-conductor-db-sync-qlx4b\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.782161 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-config-data\") pod \"nova-cell0-conductor-db-sync-qlx4b\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.782193 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-scripts\") pod \"nova-cell0-conductor-db-sync-qlx4b\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 
21:02:36.786607 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-scripts\") pod \"nova-cell0-conductor-db-sync-qlx4b\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.787243 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-config-data\") pod \"nova-cell0-conductor-db-sync-qlx4b\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.789173 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qlx4b\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.797864 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbmfz\" (UniqueName: \"kubernetes.io/projected/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-kube-api-access-xbmfz\") pod \"nova-cell0-conductor-db-sync-qlx4b\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:02:36 crc kubenswrapper[4747]: I1205 21:02:36.898229 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:02:37 crc kubenswrapper[4747]: I1205 21:02:37.082866 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:02:37 crc kubenswrapper[4747]: I1205 21:02:37.131572 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e","Type":"ContainerStarted","Data":"958462bf8dff2a4211870a3bfe85c4dc122030ea186835647c0f1865263a34c6"} Dec 05 21:02:37 crc kubenswrapper[4747]: I1205 21:02:37.358756 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qlx4b"] Dec 05 21:02:37 crc kubenswrapper[4747]: W1205 21:02:37.366688 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3a789d1_bfc6_4d28_ac8a_ece9a5d700b7.slice/crio-3021f013c9e302165c6a312160651eba0a7f6663a88bd68524913cf19aa63601 WatchSource:0}: Error finding container 3021f013c9e302165c6a312160651eba0a7f6663a88bd68524913cf19aa63601: Status 404 returned error can't find the container with id 3021f013c9e302165c6a312160651eba0a7f6663a88bd68524913cf19aa63601 Dec 05 21:02:37 crc kubenswrapper[4747]: I1205 21:02:37.851471 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86843270-f9ed-4e4f-97a5-d6a433f7d5fb" path="/var/lib/kubelet/pods/86843270-f9ed-4e4f-97a5-d6a433f7d5fb/volumes" Dec 05 21:02:38 crc kubenswrapper[4747]: I1205 21:02:38.167775 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e","Type":"ContainerStarted","Data":"17da04f9de066d59143135ef7527dbbd2f41bba0ff31a800a053851a1e5cf083"} Dec 05 21:02:38 crc kubenswrapper[4747]: I1205 21:02:38.183727 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-qlx4b" event={"ID":"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7","Type":"ContainerStarted","Data":"3021f013c9e302165c6a312160651eba0a7f6663a88bd68524913cf19aa63601"} Dec 05 21:02:38 crc kubenswrapper[4747]: I1205 21:02:38.270978 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:02:39 crc kubenswrapper[4747]: I1205 21:02:39.213940 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e","Type":"ContainerStarted","Data":"09563b78ea837afbd700e2ea7b1f8a3116a182e083835da23e527f66908d0088"} Dec 05 21:02:40 crc kubenswrapper[4747]: I1205 21:02:40.224196 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e","Type":"ContainerStarted","Data":"8fd64d1e86ae4724ea23bf37094b9926d265ede7bbcc5ee2559432a27aad1187"} Dec 05 21:02:41 crc kubenswrapper[4747]: I1205 21:02:41.240473 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e","Type":"ContainerStarted","Data":"66f2ceb63d702f3990f837593716ed1acefdf30464afb90cc079bec38dfb1ed1"} Dec 05 21:02:41 crc kubenswrapper[4747]: I1205 21:02:41.243168 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 21:02:41 crc kubenswrapper[4747]: I1205 21:02:41.241130 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="proxy-httpd" containerID="cri-o://66f2ceb63d702f3990f837593716ed1acefdf30464afb90cc079bec38dfb1ed1" gracePeriod=30 Dec 05 21:02:41 crc kubenswrapper[4747]: I1205 21:02:41.240840 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="ceilometer-central-agent" containerID="cri-o://17da04f9de066d59143135ef7527dbbd2f41bba0ff31a800a053851a1e5cf083" gracePeriod=30 Dec 05 21:02:41 crc kubenswrapper[4747]: I1205 21:02:41.241176 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="ceilometer-notification-agent" containerID="cri-o://09563b78ea837afbd700e2ea7b1f8a3116a182e083835da23e527f66908d0088" gracePeriod=30 Dec 05 21:02:41 crc kubenswrapper[4747]: I1205 21:02:41.241145 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="sg-core" containerID="cri-o://8fd64d1e86ae4724ea23bf37094b9926d265ede7bbcc5ee2559432a27aad1187" gracePeriod=30 Dec 05 21:02:41 crc kubenswrapper[4747]: I1205 21:02:41.266931 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.86796454 podStartE2EDuration="5.26690695s" podCreationTimestamp="2025-12-05 21:02:36 +0000 UTC" firstStartedPulling="2025-12-05 21:02:37.079132159 +0000 UTC m=+1227.546439647" lastFinishedPulling="2025-12-05 21:02:40.478074569 +0000 UTC m=+1230.945382057" observedRunningTime="2025-12-05 21:02:41.263111784 +0000 UTC m=+1231.730419282" watchObservedRunningTime="2025-12-05 21:02:41.26690695 +0000 UTC m=+1231.734214448" Dec 05 21:02:42 crc kubenswrapper[4747]: I1205 21:02:42.259191 4747 generic.go:334] "Generic (PLEG): container finished" 
podID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerID="66f2ceb63d702f3990f837593716ed1acefdf30464afb90cc079bec38dfb1ed1" exitCode=0 Dec 05 21:02:42 crc kubenswrapper[4747]: I1205 21:02:42.259565 4747 generic.go:334] "Generic (PLEG): container finished" podID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerID="8fd64d1e86ae4724ea23bf37094b9926d265ede7bbcc5ee2559432a27aad1187" exitCode=2 Dec 05 21:02:42 crc kubenswrapper[4747]: I1205 21:02:42.259618 4747 generic.go:334] "Generic (PLEG): container finished" podID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerID="09563b78ea837afbd700e2ea7b1f8a3116a182e083835da23e527f66908d0088" exitCode=0 Dec 05 21:02:42 crc kubenswrapper[4747]: I1205 21:02:42.259270 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e","Type":"ContainerDied","Data":"66f2ceb63d702f3990f837593716ed1acefdf30464afb90cc079bec38dfb1ed1"} Dec 05 21:02:42 crc kubenswrapper[4747]: I1205 21:02:42.259692 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e","Type":"ContainerDied","Data":"8fd64d1e86ae4724ea23bf37094b9926d265ede7bbcc5ee2559432a27aad1187"} Dec 05 21:02:42 crc kubenswrapper[4747]: I1205 21:02:42.259717 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e","Type":"ContainerDied","Data":"09563b78ea837afbd700e2ea7b1f8a3116a182e083835da23e527f66908d0088"} Dec 05 21:02:42 crc kubenswrapper[4747]: I1205 21:02:42.386408 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 21:02:42 crc kubenswrapper[4747]: I1205 21:02:42.386450 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 21:02:42 crc kubenswrapper[4747]: I1205 21:02:42.425299 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 21:02:42 crc kubenswrapper[4747]: I1205 21:02:42.436618 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 21:02:43 crc kubenswrapper[4747]: I1205 21:02:43.268321 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 21:02:43 crc kubenswrapper[4747]: I1205 21:02:43.268757 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 21:02:43 crc kubenswrapper[4747]: I1205 21:02:43.445018 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 21:02:43 crc kubenswrapper[4747]: I1205 21:02:43.446076 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 21:02:43 crc kubenswrapper[4747]: I1205 21:02:43.477828 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 21:02:43 crc kubenswrapper[4747]: I1205 21:02:43.484116 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 21:02:44 crc kubenswrapper[4747]: I1205 21:02:44.276771 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 
Dec 05 21:02:44 crc kubenswrapper[4747]: I1205 21:02:44.277377 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 21:02:45 crc kubenswrapper[4747]: I1205 21:02:45.217061 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 21:02:45 crc kubenswrapper[4747]: I1205 21:02:45.257293 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 21:02:46 crc kubenswrapper[4747]: I1205 21:02:46.204209 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 21:02:46 crc kubenswrapper[4747]: I1205 21:02:46.266355 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 21:02:46 crc kubenswrapper[4747]: I1205 21:02:46.296552 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qlx4b" event={"ID":"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7","Type":"ContainerStarted","Data":"2e6c7e067481881eb3c243ecc0d7be3e0fbce9cebb2acae99db2d255388c3986"} Dec 05 21:02:46 crc kubenswrapper[4747]: I1205 21:02:46.300093 4747 generic.go:334] "Generic (PLEG): container finished" podID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerID="17da04f9de066d59143135ef7527dbbd2f41bba0ff31a800a053851a1e5cf083" exitCode=0 Dec 05 21:02:46 crc kubenswrapper[4747]: I1205 21:02:46.300200 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e","Type":"ContainerDied","Data":"17da04f9de066d59143135ef7527dbbd2f41bba0ff31a800a053851a1e5cf083"} Dec 05 21:02:46 crc kubenswrapper[4747]: I1205 21:02:46.325033 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qlx4b" podStartSLOduration=1.9792058300000002 podStartE2EDuration="10.325013124s" podCreationTimestamp="2025-12-05 21:02:36 +0000 UTC" firstStartedPulling="2025-12-05 21:02:37.368088958 +0000 UTC m=+1227.835396446" lastFinishedPulling="2025-12-05 21:02:45.713896252 +0000 UTC m=+1236.181203740" observedRunningTime="2025-12-05 21:02:46.320504681 +0000 UTC m=+1236.787812169" watchObservedRunningTime="2025-12-05 21:02:46.325013124 +0000 UTC m=+1236.792320632" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.062322 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
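
The glance probe entries above show the normal progression for a slow-starting service: the startup probe reports unhealthy, flips to started, and only then does the readiness probe begin reporting (status="" is the empty initial result, followed by "ready" at 21:02:45-46). A compact Go sketch of that two-phase polling against an HTTP endpoint; the URL, period, and the choice of an HTTP check are assumptions for illustration, not read from this log:

```go
// probe.go - startup-then-readiness HTTP probing, loosely modeled on the
// probe transitions above. URL and period are hypothetical.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func check(url string) bool {
	c := http.Client{Timeout: 2 * time.Second}
	resp, err := c.Get(url)
	if err != nil {
		return false
	}
	resp.Body.Close()
	return resp.StatusCode >= 200 && resp.StatusCode < 400
}

func main() {
	const url = "http://localhost:9292/healthcheck" // hypothetical endpoint
	// Startup phase: poll until healthy ("unhealthy" -> "started").
	for !check(url) {
		fmt.Println(`probe="startup" status="unhealthy"`)
		time.Sleep(5 * time.Second)
	}
	fmt.Println(`probe="startup" status="started"`)
	// Readiness phase: status stays "" until the first result arrives.
	for {
		if check(url) {
			fmt.Println(`probe="readiness" status="ready"`)
			return
		}
		time.Sleep(5 * time.Second)
	}
}
```
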
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.196336 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-combined-ca-bundle\") pod \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.196410 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-run-httpd\") pod \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.196461 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-969s5\" (UniqueName: \"kubernetes.io/projected/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-kube-api-access-969s5\") pod \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.196485 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-log-httpd\") pod \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.196532 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-config-data\") pod \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.196560 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-scripts\") pod \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.196632 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-ceilometer-tls-certs\") pod \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.196659 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-sg-core-conf-yaml\") pod \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\" (UID: \"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e\") " Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.197424 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" (UID: "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.197766 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" (UID: "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.218816 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-kube-api-access-969s5" (OuterVolumeSpecName: "kube-api-access-969s5") pod "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" (UID: "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e"). InnerVolumeSpecName "kube-api-access-969s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.226698 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-scripts" (OuterVolumeSpecName: "scripts") pod "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" (UID: "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.232415 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" (UID: "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.260295 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" (UID: "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.280504 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" (UID: "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.300033 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.300070 4747 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.300079 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-969s5\" (UniqueName: \"kubernetes.io/projected/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-kube-api-access-969s5\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.300090 4747 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.300097 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.300106 4747 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.300138 4747 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.312416 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.312408 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"00adbf0f-ec8b-45f8-9c0e-486b5b5a309e","Type":"ContainerDied","Data":"958462bf8dff2a4211870a3bfe85c4dc122030ea186835647c0f1865263a34c6"} Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.312612 4747 scope.go:117] "RemoveContainer" containerID="66f2ceb63d702f3990f837593716ed1acefdf30464afb90cc079bec38dfb1ed1" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.320553 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-config-data" (OuterVolumeSpecName: "config-data") pod "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" (UID: "00adbf0f-ec8b-45f8-9c0e-486b5b5a309e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.399615 4747 scope.go:117] "RemoveContainer" containerID="8fd64d1e86ae4724ea23bf37094b9926d265ede7bbcc5ee2559432a27aad1187" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.401484 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.425788 4747 scope.go:117] "RemoveContainer" containerID="09563b78ea837afbd700e2ea7b1f8a3116a182e083835da23e527f66908d0088" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.445242 4747 scope.go:117] "RemoveContainer" containerID="17da04f9de066d59143135ef7527dbbd2f41bba0ff31a800a053851a1e5cf083" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.659352 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.689861 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.697181 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:02:47 crc kubenswrapper[4747]: E1205 21:02:47.697661 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="proxy-httpd" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.697676 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="proxy-httpd" Dec 05 21:02:47 crc kubenswrapper[4747]: E1205 21:02:47.697687 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="ceilometer-notification-agent" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.697696 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="ceilometer-notification-agent" Dec 05 21:02:47 crc kubenswrapper[4747]: E1205 21:02:47.697723 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="ceilometer-central-agent" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.697731 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="ceilometer-central-agent" Dec 05 21:02:47 crc kubenswrapper[4747]: E1205 21:02:47.697745 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="sg-core" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.697752 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="sg-core" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.697950 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="ceilometer-notification-agent" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.697981 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="ceilometer-central-agent" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.697994 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="sg-core" Dec 05 21:02:47 crc 
kubenswrapper[4747]: I1205 21:02:47.698013 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" containerName="proxy-httpd" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.700025 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.706860 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.707173 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.707345 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.717545 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.808729 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-468jg\" (UniqueName: \"kubernetes.io/projected/4b422a50-036b-4eb4-80ff-51abb27f06a1-kube-api-access-468jg\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.808808 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.808864 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b422a50-036b-4eb4-80ff-51abb27f06a1-log-httpd\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.808917 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.808988 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-config-data\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.809018 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b422a50-036b-4eb4-80ff-51abb27f06a1-run-httpd\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.809039 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-scripts\") pod \"ceilometer-0\" (UID: 
\"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.809113 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.862410 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00adbf0f-ec8b-45f8-9c0e-486b5b5a309e" path="/var/lib/kubelet/pods/00adbf0f-ec8b-45f8-9c0e-486b5b5a309e/volumes" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.910701 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-config-data\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.910739 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b422a50-036b-4eb4-80ff-51abb27f06a1-run-httpd\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.910757 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-scripts\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.910805 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.910835 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-468jg\" (UniqueName: \"kubernetes.io/projected/4b422a50-036b-4eb4-80ff-51abb27f06a1-kube-api-access-468jg\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.910861 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.910907 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b422a50-036b-4eb4-80ff-51abb27f06a1-log-httpd\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.910944 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 
05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.914615 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.917381 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.918399 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b422a50-036b-4eb4-80ff-51abb27f06a1-log-httpd\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.918873 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b422a50-036b-4eb4-80ff-51abb27f06a1-run-httpd\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.929055 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-scripts\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.930225 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.930506 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-config-data\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:47 crc kubenswrapper[4747]: I1205 21:02:47.933432 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-468jg\" (UniqueName: \"kubernetes.io/projected/4b422a50-036b-4eb4-80ff-51abb27f06a1-kube-api-access-468jg\") pod \"ceilometer-0\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " pod="openstack/ceilometer-0" Dec 05 21:02:48 crc kubenswrapper[4747]: I1205 21:02:48.023148 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:02:48 crc kubenswrapper[4747]: I1205 21:02:48.463882 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:02:48 crc kubenswrapper[4747]: W1205 21:02:48.472571 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b422a50_036b_4eb4_80ff_51abb27f06a1.slice/crio-6a2139dae433917b57bc67d84eb6c878304e1e8f7f1b9f236a83d01cacd1b7a0 WatchSource:0}: Error finding container 6a2139dae433917b57bc67d84eb6c878304e1e8f7f1b9f236a83d01cacd1b7a0: Status 404 returned error can't find the container with id 6a2139dae433917b57bc67d84eb6c878304e1e8f7f1b9f236a83d01cacd1b7a0 Dec 05 21:02:49 crc kubenswrapper[4747]: I1205 21:02:49.334189 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b422a50-036b-4eb4-80ff-51abb27f06a1","Type":"ContainerStarted","Data":"5ee0d27883cfbcca5a7c8a734f3caecd08245261d8b1edc14cb7b4e7d8370ced"} Dec 05 21:02:49 crc kubenswrapper[4747]: I1205 21:02:49.334661 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b422a50-036b-4eb4-80ff-51abb27f06a1","Type":"ContainerStarted","Data":"6a2139dae433917b57bc67d84eb6c878304e1e8f7f1b9f236a83d01cacd1b7a0"} Dec 05 21:02:50 crc kubenswrapper[4747]: I1205 21:02:50.349976 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b422a50-036b-4eb4-80ff-51abb27f06a1","Type":"ContainerStarted","Data":"84940231c8572658c1c2581286a966a00ff886abac689f581ac998df34d5cb05"} Dec 05 21:02:51 crc kubenswrapper[4747]: I1205 21:02:51.365080 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b422a50-036b-4eb4-80ff-51abb27f06a1","Type":"ContainerStarted","Data":"c47550d896276695d00dc5c8500b283c78da471e0b43b0604139cded2dc7d413"} Dec 05 21:02:52 crc kubenswrapper[4747]: I1205 21:02:52.377354 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b422a50-036b-4eb4-80ff-51abb27f06a1","Type":"ContainerStarted","Data":"dc7cef177ff8bdfdbb377d9cfdf75ad442accbc4d74c20dc93afc3ca3d5ed861"} Dec 05 21:02:52 crc kubenswrapper[4747]: I1205 21:02:52.377806 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 21:02:52 crc kubenswrapper[4747]: I1205 21:02:52.429662 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.03136622 podStartE2EDuration="5.429559002s" podCreationTimestamp="2025-12-05 21:02:47 +0000 UTC" firstStartedPulling="2025-12-05 21:02:48.475124245 +0000 UTC m=+1238.942431733" lastFinishedPulling="2025-12-05 21:02:51.873317037 +0000 UTC m=+1242.340624515" observedRunningTime="2025-12-05 21:02:52.408278378 +0000 UTC m=+1242.875585896" watchObservedRunningTime="2025-12-05 21:02:52.429559002 +0000 UTC m=+1242.896866530" Dec 05 21:03:10 crc kubenswrapper[4747]: I1205 21:03:10.603279 4747 generic.go:334] "Generic (PLEG): container finished" podID="e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7" containerID="2e6c7e067481881eb3c243ecc0d7be3e0fbce9cebb2acae99db2d255388c3986" exitCode=0 Dec 05 21:03:10 crc kubenswrapper[4747]: I1205 21:03:10.603381 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qlx4b" 
event={"ID":"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7","Type":"ContainerDied","Data":"2e6c7e067481881eb3c243ecc0d7be3e0fbce9cebb2acae99db2d255388c3986"} Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.053740 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.116301 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-combined-ca-bundle\") pod \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.116375 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-scripts\") pod \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.116426 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbmfz\" (UniqueName: \"kubernetes.io/projected/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-kube-api-access-xbmfz\") pod \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.116518 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-config-data\") pod \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\" (UID: \"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7\") " Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.122276 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-kube-api-access-xbmfz" (OuterVolumeSpecName: "kube-api-access-xbmfz") pod "e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7" (UID: "e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7"). InnerVolumeSpecName "kube-api-access-xbmfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.122560 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-scripts" (OuterVolumeSpecName: "scripts") pod "e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7" (UID: "e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.164056 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7" (UID: "e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.164880 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-config-data" (OuterVolumeSpecName: "config-data") pod "e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7" (UID: "e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.218981 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.219059 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.219080 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbmfz\" (UniqueName: \"kubernetes.io/projected/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-kube-api-access-xbmfz\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.219102 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.628628 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qlx4b" event={"ID":"e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7","Type":"ContainerDied","Data":"3021f013c9e302165c6a312160651eba0a7f6663a88bd68524913cf19aa63601"} Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.628703 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3021f013c9e302165c6a312160651eba0a7f6663a88bd68524913cf19aa63601" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.628646 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qlx4b" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.742062 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 21:03:12 crc kubenswrapper[4747]: E1205 21:03:12.742703 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7" containerName="nova-cell0-conductor-db-sync" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.742732 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7" containerName="nova-cell0-conductor-db-sync" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.743083 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7" containerName="nova-cell0-conductor-db-sync" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.743978 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.746315 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-fh4pd" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.746639 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.759553 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.828886 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlwsl\" (UniqueName: \"kubernetes.io/projected/0ae4d823-7941-4e9e-ae9d-cce0297e278d-kube-api-access-dlwsl\") pod \"nova-cell0-conductor-0\" (UID: \"0ae4d823-7941-4e9e-ae9d-cce0297e278d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.829242 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae4d823-7941-4e9e-ae9d-cce0297e278d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0ae4d823-7941-4e9e-ae9d-cce0297e278d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.829456 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae4d823-7941-4e9e-ae9d-cce0297e278d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0ae4d823-7941-4e9e-ae9d-cce0297e278d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.931748 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae4d823-7941-4e9e-ae9d-cce0297e278d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0ae4d823-7941-4e9e-ae9d-cce0297e278d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.931961 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae4d823-7941-4e9e-ae9d-cce0297e278d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0ae4d823-7941-4e9e-ae9d-cce0297e278d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.932887 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlwsl\" (UniqueName: \"kubernetes.io/projected/0ae4d823-7941-4e9e-ae9d-cce0297e278d-kube-api-access-dlwsl\") pod \"nova-cell0-conductor-0\" (UID: \"0ae4d823-7941-4e9e-ae9d-cce0297e278d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.937987 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae4d823-7941-4e9e-ae9d-cce0297e278d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0ae4d823-7941-4e9e-ae9d-cce0297e278d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.944236 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae4d823-7941-4e9e-ae9d-cce0297e278d-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"0ae4d823-7941-4e9e-ae9d-cce0297e278d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 21:03:12 crc kubenswrapper[4747]: I1205 21:03:12.954641 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlwsl\" (UniqueName: \"kubernetes.io/projected/0ae4d823-7941-4e9e-ae9d-cce0297e278d-kube-api-access-dlwsl\") pod \"nova-cell0-conductor-0\" (UID: \"0ae4d823-7941-4e9e-ae9d-cce0297e278d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 21:03:13 crc kubenswrapper[4747]: I1205 21:03:13.082900 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 21:03:13 crc kubenswrapper[4747]: I1205 21:03:13.539675 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 21:03:13 crc kubenswrapper[4747]: I1205 21:03:13.642701 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0ae4d823-7941-4e9e-ae9d-cce0297e278d","Type":"ContainerStarted","Data":"0febb20caca0ece73074d303d607f4bfb22e88ee8aac27d88d6f27b67f55a0c9"} Dec 05 21:03:14 crc kubenswrapper[4747]: I1205 21:03:14.656258 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0ae4d823-7941-4e9e-ae9d-cce0297e278d","Type":"ContainerStarted","Data":"558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2"} Dec 05 21:03:14 crc kubenswrapper[4747]: I1205 21:03:14.656607 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 05 21:03:14 crc kubenswrapper[4747]: I1205 21:03:14.684923 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.684903941 podStartE2EDuration="2.684903941s" podCreationTimestamp="2025-12-05 21:03:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:03:14.674939441 +0000 UTC m=+1265.142246959" watchObservedRunningTime="2025-12-05 21:03:14.684903941 +0000 UTC m=+1265.152211429" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.031469 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.119234 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.544891 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-6pxgq"] Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.546276 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.549555 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.558445 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.585221 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6pxgq"] Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.655790 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6pxgq\" (UID: \"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.655841 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-config-data\") pod \"nova-cell0-cell-mapping-6pxgq\" (UID: \"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.655900 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v8tx\" (UniqueName: \"kubernetes.io/projected/5f4e99ae-15b6-4c55-924b-84ed464c130c-kube-api-access-4v8tx\") pod \"nova-cell0-cell-mapping-6pxgq\" (UID: \"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.655921 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-scripts\") pod \"nova-cell0-cell-mapping-6pxgq\" (UID: \"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.740068 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.746285 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.748598 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.751309 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.752871 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.757963 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.758890 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v8tx\" (UniqueName: \"kubernetes.io/projected/5f4e99ae-15b6-4c55-924b-84ed464c130c-kube-api-access-4v8tx\") pod \"nova-cell0-cell-mapping-6pxgq\" (UID: \"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.758938 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-scripts\") pod \"nova-cell0-cell-mapping-6pxgq\" (UID: \"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.759044 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6pxgq\" (UID: \"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.759064 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-config-data\") pod \"nova-cell0-cell-mapping-6pxgq\" (UID: \"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.764446 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-config-data\") pod \"nova-cell0-cell-mapping-6pxgq\" (UID: \"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.765368 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6pxgq\" (UID: \"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.767670 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-scripts\") pod \"nova-cell0-cell-mapping-6pxgq\" (UID: \"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.773282 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.805665 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.818708 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v8tx\" (UniqueName: \"kubernetes.io/projected/5f4e99ae-15b6-4c55-924b-84ed464c130c-kube-api-access-4v8tx\") pod \"nova-cell0-cell-mapping-6pxgq\" (UID: 
\"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.860336 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfb6r\" (UniqueName: \"kubernetes.io/projected/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-kube-api-access-zfb6r\") pod \"nova-metadata-0\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " pod="openstack/nova-metadata-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.860384 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9f058a-5697-4b8d-b1fa-43891dae8db9-config-data\") pod \"nova-api-0\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " pod="openstack/nova-api-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.860418 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9f058a-5697-4b8d-b1fa-43891dae8db9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " pod="openstack/nova-api-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.860530 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e9f058a-5697-4b8d-b1fa-43891dae8db9-logs\") pod \"nova-api-0\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " pod="openstack/nova-api-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.866125 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " pod="openstack/nova-metadata-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.866258 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jsjd\" (UniqueName: \"kubernetes.io/projected/3e9f058a-5697-4b8d-b1fa-43891dae8db9-kube-api-access-9jsjd\") pod \"nova-api-0\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " pod="openstack/nova-api-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.866372 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-config-data\") pod \"nova-metadata-0\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " pod="openstack/nova-metadata-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.866407 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-logs\") pod \"nova-metadata-0\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " pod="openstack/nova-metadata-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.882376 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.898133 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.899572 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.903901 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.930246 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.951219 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-f4f8q"] Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.953732 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.979687 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e9f058a-5697-4b8d-b1fa-43891dae8db9-logs\") pod \"nova-api-0\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " pod="openstack/nova-api-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.979739 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " pod="openstack/nova-metadata-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.979782 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jsjd\" (UniqueName: \"kubernetes.io/projected/3e9f058a-5697-4b8d-b1fa-43891dae8db9-kube-api-access-9jsjd\") pod \"nova-api-0\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " pod="openstack/nova-api-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.979853 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-config-data\") pod \"nova-metadata-0\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " pod="openstack/nova-metadata-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.979878 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-logs\") pod \"nova-metadata-0\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " pod="openstack/nova-metadata-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.979935 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfb6r\" (UniqueName: \"kubernetes.io/projected/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-kube-api-access-zfb6r\") pod \"nova-metadata-0\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " pod="openstack/nova-metadata-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.979955 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9f058a-5697-4b8d-b1fa-43891dae8db9-config-data\") pod \"nova-api-0\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " pod="openstack/nova-api-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.979999 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9f058a-5697-4b8d-b1fa-43891dae8db9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " 
pod="openstack/nova-api-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.982114 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-logs\") pod \"nova-metadata-0\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " pod="openstack/nova-metadata-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.982742 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e9f058a-5697-4b8d-b1fa-43891dae8db9-logs\") pod \"nova-api-0\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " pod="openstack/nova-api-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.994365 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-config-data\") pod \"nova-metadata-0\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " pod="openstack/nova-metadata-0" Dec 05 21:03:18 crc kubenswrapper[4747]: I1205 21:03:18.997413 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " pod="openstack/nova-metadata-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.006572 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jsjd\" (UniqueName: \"kubernetes.io/projected/3e9f058a-5697-4b8d-b1fa-43891dae8db9-kube-api-access-9jsjd\") pod \"nova-api-0\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " pod="openstack/nova-api-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.011363 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9f058a-5697-4b8d-b1fa-43891dae8db9-config-data\") pod \"nova-api-0\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " pod="openstack/nova-api-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.015047 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9f058a-5697-4b8d-b1fa-43891dae8db9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " pod="openstack/nova-api-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.021985 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-f4f8q"] Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.043396 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfb6r\" (UniqueName: \"kubernetes.io/projected/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-kube-api-access-zfb6r\") pod \"nova-metadata-0\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " pod="openstack/nova-metadata-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.056786 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.063638 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.065092 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.068312 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.083702 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-config\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.083745 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-dns-swift-storage-0\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.083778 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/119c3a74-9260-40e6-930e-bff0b0e11929-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"119c3a74-9260-40e6-930e-bff0b0e11929\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.083800 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9d1684b-fc1a-4706-96ff-e6efec943747-config-data\") pod \"nova-scheduler-0\" (UID: \"b9d1684b-fc1a-4706-96ff-e6efec943747\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.083821 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkz6g\" (UniqueName: \"kubernetes.io/projected/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-kube-api-access-qkz6g\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.083858 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/119c3a74-9260-40e6-930e-bff0b0e11929-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"119c3a74-9260-40e6-930e-bff0b0e11929\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.083882 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtfcx\" (UniqueName: \"kubernetes.io/projected/b9d1684b-fc1a-4706-96ff-e6efec943747-kube-api-access-qtfcx\") pod \"nova-scheduler-0\" (UID: \"b9d1684b-fc1a-4706-96ff-e6efec943747\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.083901 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9d1684b-fc1a-4706-96ff-e6efec943747-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b9d1684b-fc1a-4706-96ff-e6efec943747\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.083921 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-dns-svc\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.083937 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.083958 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9jsn\" (UniqueName: \"kubernetes.io/projected/119c3a74-9260-40e6-930e-bff0b0e11929-kube-api-access-l9jsn\") pod \"nova-cell1-novncproxy-0\" (UID: \"119c3a74-9260-40e6-930e-bff0b0e11929\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.083990 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.082671 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.182376 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.185350 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/119c3a74-9260-40e6-930e-bff0b0e11929-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"119c3a74-9260-40e6-930e-bff0b0e11929\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.185398 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9d1684b-fc1a-4706-96ff-e6efec943747-config-data\") pod \"nova-scheduler-0\" (UID: \"b9d1684b-fc1a-4706-96ff-e6efec943747\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.185430 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkz6g\" (UniqueName: \"kubernetes.io/projected/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-kube-api-access-qkz6g\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.185467 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/119c3a74-9260-40e6-930e-bff0b0e11929-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"119c3a74-9260-40e6-930e-bff0b0e11929\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.185494 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtfcx\" (UniqueName: 
\"kubernetes.io/projected/b9d1684b-fc1a-4706-96ff-e6efec943747-kube-api-access-qtfcx\") pod \"nova-scheduler-0\" (UID: \"b9d1684b-fc1a-4706-96ff-e6efec943747\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.185512 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9d1684b-fc1a-4706-96ff-e6efec943747-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b9d1684b-fc1a-4706-96ff-e6efec943747\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.185532 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-dns-svc\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.185549 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.185571 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9jsn\" (UniqueName: \"kubernetes.io/projected/119c3a74-9260-40e6-930e-bff0b0e11929-kube-api-access-l9jsn\") pod \"nova-cell1-novncproxy-0\" (UID: \"119c3a74-9260-40e6-930e-bff0b0e11929\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.185618 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.185662 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-config\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.185687 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-dns-swift-storage-0\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.186945 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-dns-swift-storage-0\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.188143 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-dns-svc\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: 
\"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.188950 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.190008 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/119c3a74-9260-40e6-930e-bff0b0e11929-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"119c3a74-9260-40e6-930e-bff0b0e11929\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.190498 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.192165 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-config\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.194950 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/119c3a74-9260-40e6-930e-bff0b0e11929-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"119c3a74-9260-40e6-930e-bff0b0e11929\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.197990 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9d1684b-fc1a-4706-96ff-e6efec943747-config-data\") pod \"nova-scheduler-0\" (UID: \"b9d1684b-fc1a-4706-96ff-e6efec943747\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.202758 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9d1684b-fc1a-4706-96ff-e6efec943747-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b9d1684b-fc1a-4706-96ff-e6efec943747\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.212484 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9jsn\" (UniqueName: \"kubernetes.io/projected/119c3a74-9260-40e6-930e-bff0b0e11929-kube-api-access-l9jsn\") pod \"nova-cell1-novncproxy-0\" (UID: \"119c3a74-9260-40e6-930e-bff0b0e11929\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.213494 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtfcx\" (UniqueName: \"kubernetes.io/projected/b9d1684b-fc1a-4706-96ff-e6efec943747-kube-api-access-qtfcx\") pod \"nova-scheduler-0\" (UID: \"b9d1684b-fc1a-4706-96ff-e6efec943747\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.226439 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.233051 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkz6g\" (UniqueName: \"kubernetes.io/projected/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-kube-api-access-qkz6g\") pod \"dnsmasq-dns-5c4475fdfc-f4f8q\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.350998 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.408836 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.643226 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.656708 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6pxgq"] Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.718461 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e9f058a-5697-4b8d-b1fa-43891dae8db9","Type":"ContainerStarted","Data":"a509e4c6f45ce3ded62d3a2b7c17c813fdbf932b4032bb1522d92ac0ebe8a563"} Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.724773 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6pxgq" event={"ID":"5f4e99ae-15b6-4c55-924b-84ed464c130c","Type":"ContainerStarted","Data":"18373d4114da4be143c242fe13a65769e6de83c74092675d5bb2b9fd48a40929"} Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.735025 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccdpj"] Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.736237 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.739053 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.739298 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.754489 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccdpj"] Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.801374 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-config-data\") pod \"nova-cell1-conductor-db-sync-ccdpj\" (UID: \"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.801473 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-scripts\") pod \"nova-cell1-conductor-db-sync-ccdpj\" (UID: \"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.801542 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrv5p\" (UniqueName: \"kubernetes.io/projected/0a29110d-8922-4d83-97eb-7c12b0133b8d-kube-api-access-rrv5p\") pod \"nova-cell1-conductor-db-sync-ccdpj\" (UID: \"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.801572 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ccdpj\" (UID: \"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.809125 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.903005 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-scripts\") pod \"nova-cell1-conductor-db-sync-ccdpj\" (UID: \"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.903124 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrv5p\" (UniqueName: \"kubernetes.io/projected/0a29110d-8922-4d83-97eb-7c12b0133b8d-kube-api-access-rrv5p\") pod \"nova-cell1-conductor-db-sync-ccdpj\" (UID: \"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.903187 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ccdpj\" (UID: 
\"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.903231 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-config-data\") pod \"nova-cell1-conductor-db-sync-ccdpj\" (UID: \"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.908716 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-config-data\") pod \"nova-cell1-conductor-db-sync-ccdpj\" (UID: \"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.910148 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ccdpj\" (UID: \"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.912823 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-scripts\") pod \"nova-cell1-conductor-db-sync-ccdpj\" (UID: \"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.922802 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrv5p\" (UniqueName: \"kubernetes.io/projected/0a29110d-8922-4d83-97eb-7c12b0133b8d-kube-api-access-rrv5p\") pod \"nova-cell1-conductor-db-sync-ccdpj\" (UID: \"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.931406 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:03:19 crc kubenswrapper[4747]: W1205 21:03:19.996134 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod119c3a74_9260_40e6_930e_bff0b0e11929.slice/crio-9de16d1c3ccc1a8816f3e40a4d413b6d721547712439fb8cac4a136b1d68cbde WatchSource:0}: Error finding container 9de16d1c3ccc1a8816f3e40a4d413b6d721547712439fb8cac4a136b1d68cbde: Status 404 returned error can't find the container with id 9de16d1c3ccc1a8816f3e40a4d413b6d721547712439fb8cac4a136b1d68cbde Dec 05 21:03:19 crc kubenswrapper[4747]: I1205 21:03:19.997066 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:20 crc kubenswrapper[4747]: I1205 21:03:20.006969 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 21:03:20 crc kubenswrapper[4747]: I1205 21:03:20.023863 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-f4f8q"] Dec 05 21:03:20 crc kubenswrapper[4747]: I1205 21:03:20.485832 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccdpj"] Dec 05 21:03:20 crc kubenswrapper[4747]: W1205 21:03:20.488057 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a29110d_8922_4d83_97eb_7c12b0133b8d.slice/crio-b3f107863f1fdd6ed678c4cf393158a0dcc88133ce4afaf2f8092dc61788f7e7 WatchSource:0}: Error finding container b3f107863f1fdd6ed678c4cf393158a0dcc88133ce4afaf2f8092dc61788f7e7: Status 404 returned error can't find the container with id b3f107863f1fdd6ed678c4cf393158a0dcc88133ce4afaf2f8092dc61788f7e7 Dec 05 21:03:20 crc kubenswrapper[4747]: I1205 21:03:20.738831 4747 generic.go:334] "Generic (PLEG): container finished" podID="dd1da9c6-85af-4559-a2e8-f52210b7d5c4" containerID="00e7ce80f37c69a3249eb133f51c3d9d0ef9248981ff9d69faeeefb5e13ca207" exitCode=0 Dec 05 21:03:20 crc kubenswrapper[4747]: I1205 21:03:20.738920 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" event={"ID":"dd1da9c6-85af-4559-a2e8-f52210b7d5c4","Type":"ContainerDied","Data":"00e7ce80f37c69a3249eb133f51c3d9d0ef9248981ff9d69faeeefb5e13ca207"} Dec 05 21:03:20 crc kubenswrapper[4747]: I1205 21:03:20.738959 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" event={"ID":"dd1da9c6-85af-4559-a2e8-f52210b7d5c4","Type":"ContainerStarted","Data":"8f6077c39476fce72553ab3537821dd3a5d22fe329fa0f87673ff61e8e51a2de"} Dec 05 21:03:20 crc kubenswrapper[4747]: I1205 21:03:20.746155 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd","Type":"ContainerStarted","Data":"afc2127684f1c3e593f528ecce29ba509af59cacad8e44a7b5c7814510766c20"} Dec 05 21:03:20 crc kubenswrapper[4747]: I1205 21:03:20.758513 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6pxgq" event={"ID":"5f4e99ae-15b6-4c55-924b-84ed464c130c","Type":"ContainerStarted","Data":"da248c5f3b792f07ddf65be7ca9f5b394ff47e04718a63c6e5f4984caebb8672"} Dec 05 21:03:20 crc kubenswrapper[4747]: I1205 21:03:20.767423 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ccdpj" event={"ID":"0a29110d-8922-4d83-97eb-7c12b0133b8d","Type":"ContainerStarted","Data":"b3f107863f1fdd6ed678c4cf393158a0dcc88133ce4afaf2f8092dc61788f7e7"} Dec 05 21:03:20 crc kubenswrapper[4747]: I1205 21:03:20.768978 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b9d1684b-fc1a-4706-96ff-e6efec943747","Type":"ContainerStarted","Data":"19fb2a3c0e9d8344f8f51b98b8cbce8840516c03efcf57a91e9129947d2ff60a"} Dec 05 21:03:20 crc kubenswrapper[4747]: I1205 21:03:20.771565 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"119c3a74-9260-40e6-930e-bff0b0e11929","Type":"ContainerStarted","Data":"9de16d1c3ccc1a8816f3e40a4d413b6d721547712439fb8cac4a136b1d68cbde"} Dec 05 
21:03:20 crc kubenswrapper[4747]: I1205 21:03:20.788056 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-6pxgq" podStartSLOduration=2.78795284 podStartE2EDuration="2.78795284s" podCreationTimestamp="2025-12-05 21:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:03:20.785337614 +0000 UTC m=+1271.252645102" watchObservedRunningTime="2025-12-05 21:03:20.78795284 +0000 UTC m=+1271.255260358" Dec 05 21:03:20 crc kubenswrapper[4747]: I1205 21:03:20.819270 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-ccdpj" podStartSLOduration=1.819249665 podStartE2EDuration="1.819249665s" podCreationTimestamp="2025-12-05 21:03:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:03:20.805439719 +0000 UTC m=+1271.272747197" watchObservedRunningTime="2025-12-05 21:03:20.819249665 +0000 UTC m=+1271.286557143" Dec 05 21:03:21 crc kubenswrapper[4747]: I1205 21:03:21.784469 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ccdpj" event={"ID":"0a29110d-8922-4d83-97eb-7c12b0133b8d","Type":"ContainerStarted","Data":"5cd4b38c507e83ae7d637b40662cb44e4fd8285a5ed0dc4193f6d009e999b136"} Dec 05 21:03:21 crc kubenswrapper[4747]: I1205 21:03:21.793074 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" event={"ID":"dd1da9c6-85af-4559-a2e8-f52210b7d5c4","Type":"ContainerStarted","Data":"d52e1c1c17b0c70a8eadb64326c0b596768d88b892f57e969f55647e6f07bb4d"} Dec 05 21:03:21 crc kubenswrapper[4747]: I1205 21:03:21.793148 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:21 crc kubenswrapper[4747]: I1205 21:03:21.828167 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" podStartSLOduration=3.828147755 podStartE2EDuration="3.828147755s" podCreationTimestamp="2025-12-05 21:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:03:21.823174121 +0000 UTC m=+1272.290481609" watchObservedRunningTime="2025-12-05 21:03:21.828147755 +0000 UTC m=+1272.295455243" Dec 05 21:03:22 crc kubenswrapper[4747]: I1205 21:03:22.906897 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:03:22 crc kubenswrapper[4747]: I1205 21:03:22.921235 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 21:03:23 crc kubenswrapper[4747]: I1205 21:03:23.813191 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"119c3a74-9260-40e6-930e-bff0b0e11929","Type":"ContainerStarted","Data":"75a46842458298c8bf94988e0c021d5d5ec42c0d6db67a9616072888144302eb"} Dec 05 21:03:23 crc kubenswrapper[4747]: I1205 21:03:23.813349 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="119c3a74-9260-40e6-930e-bff0b0e11929" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://75a46842458298c8bf94988e0c021d5d5ec42c0d6db67a9616072888144302eb" gracePeriod=30 Dec 05 21:03:23 crc 
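The generic.go:334 and "SyncLoop (PLEG)" entries above come from the pod lifecycle event generator: it periodically relists container states from CRI-O, diffs them against the previous relist, and emits ContainerStarted/ContainerDied events, such as the dnsmasq init container finishing with exitCode=0 just before the main container starts. A minimal relist-and-diff sketch (hypothetical types, not kubelet's actual PLEG):

    // Illustrative relist-based event generation; event order across
    // containers is not guaranteed, just as in a real relist.
    package main

    import "fmt"

    type state string

    const (
    	running state = "running"
    	exited  state = "exited"
    )

    // diff compares two relists for one pod and emits lifecycle events.
    func diff(old, cur map[string]state) []string {
    	var events []string
    	for id, s := range cur {
    		switch {
    		case s == running && old[id] != running:
    			events = append(events, "ContainerStarted "+id)
    		case s == exited && old[id] == running:
    			events = append(events, "ContainerDied "+id)
    		}
    	}
    	return events
    }

    func main() {
    	old := map[string]state{"00e7ce80": running}
    	cur := map[string]state{"00e7ce80": exited, "d52e1c1c": running}
    	for _, e := range diff(old, cur) {
    		fmt.Println(e)
    	}
    }

The two manager.go:1169 warnings fit the same picture: the cgroup watch fired before the new CRI-O containers were registered with cAdvisor, so the lookup returned a 404 that resolves itself on the next relist.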
Dec 05 21:03:23 crc kubenswrapper[4747]: I1205 21:03:23.816804 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd","Type":"ContainerStarted","Data":"f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913"}
Dec 05 21:03:23 crc kubenswrapper[4747]: I1205 21:03:23.816838 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd","Type":"ContainerStarted","Data":"c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b"}
Dec 05 21:03:23 crc kubenswrapper[4747]: I1205 21:03:23.816958 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1ee4e7d2-2011-4ddd-b93a-d456986ff1cd" containerName="nova-metadata-log" containerID="cri-o://c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b" gracePeriod=30
Dec 05 21:03:23 crc kubenswrapper[4747]: I1205 21:03:23.817039 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1ee4e7d2-2011-4ddd-b93a-d456986ff1cd" containerName="nova-metadata-metadata" containerID="cri-o://f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913" gracePeriod=30
Dec 05 21:03:23 crc kubenswrapper[4747]: I1205 21:03:23.821074 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e9f058a-5697-4b8d-b1fa-43891dae8db9","Type":"ContainerStarted","Data":"833985c6d01baf79d57df06c936ee1677a721c5fee0ad59f103673a6c12943f0"}
Dec 05 21:03:23 crc kubenswrapper[4747]: I1205 21:03:23.821128 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e9f058a-5697-4b8d-b1fa-43891dae8db9","Type":"ContainerStarted","Data":"52fd2c566737d13acb4f89e34979f8b7605320b2dd6aafdd301c677ba24e34d0"}
Dec 05 21:03:23 crc kubenswrapper[4747]: I1205 21:03:23.828324 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b9d1684b-fc1a-4706-96ff-e6efec943747","Type":"ContainerStarted","Data":"4c1d5fe8c1b11df6b6aa3ce3695865857e2c986a0baadfd8120d77e27f5e70f6"}
Dec 05 21:03:23 crc kubenswrapper[4747]: I1205 21:03:23.830994 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.8791442480000002 podStartE2EDuration="5.830976602s" podCreationTimestamp="2025-12-05 21:03:18 +0000 UTC" firstStartedPulling="2025-12-05 21:03:19.999693204 +0000 UTC m=+1270.467000702" lastFinishedPulling="2025-12-05 21:03:22.951525568 +0000 UTC m=+1273.418833056" observedRunningTime="2025-12-05 21:03:23.828027558 +0000 UTC m=+1274.295335046" watchObservedRunningTime="2025-12-05 21:03:23.830976602 +0000 UTC m=+1274.298284090"
Dec 05 21:03:23 crc kubenswrapper[4747]: I1205 21:03:23.858091 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.724394846 podStartE2EDuration="5.858069772s" podCreationTimestamp="2025-12-05 21:03:18 +0000 UTC" firstStartedPulling="2025-12-05 21:03:19.817309119 +0000 UTC m=+1270.284616607" lastFinishedPulling="2025-12-05 21:03:22.950984055 +0000 UTC m=+1273.418291533" observedRunningTime="2025-12-05 21:03:23.846961573 +0000 UTC m=+1274.314269071" watchObservedRunningTime="2025-12-05 21:03:23.858069772 +0000 UTC m=+1274.325377260"
Dec 05 21:03:23 crc kubenswrapper[4747]: I1205 21:03:23.876494 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.597189654 podStartE2EDuration="5.876472873s" podCreationTimestamp="2025-12-05 21:03:18 +0000 UTC" firstStartedPulling="2025-12-05 21:03:19.652978396 +0000 UTC m=+1270.120285884" lastFinishedPulling="2025-12-05 21:03:22.932261625 +0000 UTC m=+1273.399569103" observedRunningTime="2025-12-05 21:03:23.865800656 +0000 UTC m=+1274.333108144" watchObservedRunningTime="2025-12-05 21:03:23.876472873 +0000 UTC m=+1274.343780361"
Dec 05 21:03:23 crc kubenswrapper[4747]: I1205 21:03:23.886306 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.867783204 podStartE2EDuration="5.88628742s" podCreationTimestamp="2025-12-05 21:03:18 +0000 UTC" firstStartedPulling="2025-12-05 21:03:19.913728128 +0000 UTC m=+1270.381035616" lastFinishedPulling="2025-12-05 21:03:22.932232344 +0000 UTC m=+1273.399539832" observedRunningTime="2025-12-05 21:03:23.879512219 +0000 UTC m=+1274.346819707" watchObservedRunningTime="2025-12-05 21:03:23.88628742 +0000 UTC m=+1274.353594898"
Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.183336 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.183393 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.226921 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.409938 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.513865 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-config-data\") pod \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.513945 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfb6r\" (UniqueName: \"kubernetes.io/projected/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-kube-api-access-zfb6r\") pod \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.514032 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-combined-ca-bundle\") pod \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.514088 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-logs\") pod \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\" (UID: \"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd\") " Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.515193 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-logs" (OuterVolumeSpecName: "logs") pod "1ee4e7d2-2011-4ddd-b93a-d456986ff1cd" (UID: "1ee4e7d2-2011-4ddd-b93a-d456986ff1cd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.610982 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-kube-api-access-zfb6r" (OuterVolumeSpecName: "kube-api-access-zfb6r") pod "1ee4e7d2-2011-4ddd-b93a-d456986ff1cd" (UID: "1ee4e7d2-2011-4ddd-b93a-d456986ff1cd"). InnerVolumeSpecName "kube-api-access-zfb6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.616441 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ee4e7d2-2011-4ddd-b93a-d456986ff1cd" (UID: "1ee4e7d2-2011-4ddd-b93a-d456986ff1cd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.616631 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.616652 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.616663 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfb6r\" (UniqueName: \"kubernetes.io/projected/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-kube-api-access-zfb6r\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.617230 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-config-data" (OuterVolumeSpecName: "config-data") pod "1ee4e7d2-2011-4ddd-b93a-d456986ff1cd" (UID: "1ee4e7d2-2011-4ddd-b93a-d456986ff1cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.720054 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.840171 4747 generic.go:334] "Generic (PLEG): container finished" podID="1ee4e7d2-2011-4ddd-b93a-d456986ff1cd" containerID="f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913" exitCode=0 Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.840203 4747 generic.go:334] "Generic (PLEG): container finished" podID="1ee4e7d2-2011-4ddd-b93a-d456986ff1cd" containerID="c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b" exitCode=143 Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.840319 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd","Type":"ContainerDied","Data":"f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913"} Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.840371 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd","Type":"ContainerDied","Data":"c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b"} Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.840385 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ee4e7d2-2011-4ddd-b93a-d456986ff1cd","Type":"ContainerDied","Data":"afc2127684f1c3e593f528ecce29ba509af59cacad8e44a7b5c7814510766c20"} Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.840404 4747 scope.go:117] "RemoveContainer" containerID="f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.840722 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.869761 4747 scope.go:117] "RemoveContainer" containerID="c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.881999 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.897971 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.913962 4747 scope.go:117] "RemoveContainer" containerID="f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913" Dec 05 21:03:24 crc kubenswrapper[4747]: E1205 21:03:24.914507 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913\": container with ID starting with f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913 not found: ID does not exist" containerID="f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.914537 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913"} err="failed to get container status \"f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913\": rpc error: code = NotFound desc = could not find container \"f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913\": container with ID starting with f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913 not found: ID does not exist" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.914556 4747 scope.go:117] "RemoveContainer" containerID="c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b" Dec 05 21:03:24 crc kubenswrapper[4747]: E1205 21:03:24.915690 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b\": container with ID starting with c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b not found: ID does not exist" containerID="c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.915762 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b"} err="failed to get container status \"c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b\": rpc error: code = NotFound desc = could not find container \"c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b\": container with ID starting with c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b not found: ID does not exist" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.915797 4747 scope.go:117] "RemoveContainer" containerID="f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.919905 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913"} err="failed to get container status \"f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913\": rpc error: code = NotFound 
desc = could not find container \"f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913\": container with ID starting with f5f86b9c35f1dc5773dde29292675f256e0d0e2ea93f6f30cfe4c98d67cb1913 not found: ID does not exist" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.919953 4747 scope.go:117] "RemoveContainer" containerID="c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.921817 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b"} err="failed to get container status \"c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b\": rpc error: code = NotFound desc = could not find container \"c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b\": container with ID starting with c5db5c9dbed1f001e0bedc7a3cf0cf6972e8ec3372a764bd36357e25c29bef6b not found: ID does not exist" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.936429 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:03:24 crc kubenswrapper[4747]: E1205 21:03:24.936866 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee4e7d2-2011-4ddd-b93a-d456986ff1cd" containerName="nova-metadata-log" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.936894 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee4e7d2-2011-4ddd-b93a-d456986ff1cd" containerName="nova-metadata-log" Dec 05 21:03:24 crc kubenswrapper[4747]: E1205 21:03:24.936920 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee4e7d2-2011-4ddd-b93a-d456986ff1cd" containerName="nova-metadata-metadata" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.936928 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee4e7d2-2011-4ddd-b93a-d456986ff1cd" containerName="nova-metadata-metadata" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.937176 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee4e7d2-2011-4ddd-b93a-d456986ff1cd" containerName="nova-metadata-metadata" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.937210 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee4e7d2-2011-4ddd-b93a-d456986ff1cd" containerName="nova-metadata-log" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.938234 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.940435 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.940878 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 21:03:24 crc kubenswrapper[4747]: I1205 21:03:24.986563 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.028168 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.028449 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-logs\") pod \"nova-metadata-0\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.028644 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-config-data\") pod \"nova-metadata-0\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.028737 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.028872 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4lgh\" (UniqueName: \"kubernetes.io/projected/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-kube-api-access-x4lgh\") pod \"nova-metadata-0\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.131014 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-config-data\") pod \"nova-metadata-0\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.131078 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.131133 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4lgh\" (UniqueName: \"kubernetes.io/projected/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-kube-api-access-x4lgh\") pod \"nova-metadata-0\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " 
pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.131262 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.131305 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-logs\") pod \"nova-metadata-0\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.131840 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-logs\") pod \"nova-metadata-0\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.135877 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.144764 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-config-data\") pod \"nova-metadata-0\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.153323 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4lgh\" (UniqueName: \"kubernetes.io/projected/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-kube-api-access-x4lgh\") pod \"nova-metadata-0\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.164603 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.326302 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.790773 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:03:25 crc kubenswrapper[4747]: W1205 21:03:25.793262 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ded2cf2_c0f6_47b7_b1bb_7efd9f88438f.slice/crio-e85751698d3ff86f4fc379320e4495255f929275196eb13b5d4cc520f4ec072a WatchSource:0}: Error finding container e85751698d3ff86f4fc379320e4495255f929275196eb13b5d4cc520f4ec072a: Status 404 returned error can't find the container with id e85751698d3ff86f4fc379320e4495255f929275196eb13b5d4cc520f4ec072a Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.858669 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee4e7d2-2011-4ddd-b93a-d456986ff1cd" path="/var/lib/kubelet/pods/1ee4e7d2-2011-4ddd-b93a-d456986ff1cd/volumes" Dec 05 21:03:25 crc kubenswrapper[4747]: I1205 21:03:25.863748 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f","Type":"ContainerStarted","Data":"e85751698d3ff86f4fc379320e4495255f929275196eb13b5d4cc520f4ec072a"} Dec 05 21:03:26 crc kubenswrapper[4747]: I1205 21:03:26.892493 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f","Type":"ContainerStarted","Data":"7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51"} Dec 05 21:03:26 crc kubenswrapper[4747]: I1205 21:03:26.893355 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f","Type":"ContainerStarted","Data":"a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526"} Dec 05 21:03:26 crc kubenswrapper[4747]: I1205 21:03:26.936717 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.9366892460000003 podStartE2EDuration="2.936689246s" podCreationTimestamp="2025-12-05 21:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:03:26.925730051 +0000 UTC m=+1277.393037579" watchObservedRunningTime="2025-12-05 21:03:26.936689246 +0000 UTC m=+1277.403996774" Dec 05 21:03:27 crc kubenswrapper[4747]: I1205 21:03:27.904318 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f4e99ae-15b6-4c55-924b-84ed464c130c" containerID="da248c5f3b792f07ddf65be7ca9f5b394ff47e04718a63c6e5f4984caebb8672" exitCode=0 Dec 05 21:03:27 crc kubenswrapper[4747]: I1205 21:03:27.904405 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6pxgq" event={"ID":"5f4e99ae-15b6-4c55-924b-84ed464c130c","Type":"ContainerDied","Data":"da248c5f3b792f07ddf65be7ca9f5b394ff47e04718a63c6e5f4984caebb8672"} Dec 05 21:03:28 crc kubenswrapper[4747]: I1205 21:03:28.918600 4747 generic.go:334] "Generic (PLEG): container finished" podID="0a29110d-8922-4d83-97eb-7c12b0133b8d" containerID="5cd4b38c507e83ae7d637b40662cb44e4fd8285a5ed0dc4193f6d009e999b136" exitCode=0 Dec 05 21:03:28 crc kubenswrapper[4747]: I1205 21:03:28.918687 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ccdpj" 
event={"ID":"0a29110d-8922-4d83-97eb-7c12b0133b8d","Type":"ContainerDied","Data":"5cd4b38c507e83ae7d637b40662cb44e4fd8285a5ed0dc4193f6d009e999b136"} Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.066137 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.066179 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.227085 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.260821 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.352785 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.355773 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.443674 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-rtlsm"] Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.443899 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" podUID="b96dbb40-2f15-49dc-afc0-82c16301d001" containerName="dnsmasq-dns" containerID="cri-o://3d5965befac25c9837670863a3890d1db8b15184f520c9da7c335a08876ca1dd" gracePeriod=10 Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.520537 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v8tx\" (UniqueName: \"kubernetes.io/projected/5f4e99ae-15b6-4c55-924b-84ed464c130c-kube-api-access-4v8tx\") pod \"5f4e99ae-15b6-4c55-924b-84ed464c130c\" (UID: \"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.520639 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-scripts\") pod \"5f4e99ae-15b6-4c55-924b-84ed464c130c\" (UID: \"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.520696 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-config-data\") pod \"5f4e99ae-15b6-4c55-924b-84ed464c130c\" (UID: \"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.520744 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-combined-ca-bundle\") pod \"5f4e99ae-15b6-4c55-924b-84ed464c130c\" (UID: \"5f4e99ae-15b6-4c55-924b-84ed464c130c\") " Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.531794 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f4e99ae-15b6-4c55-924b-84ed464c130c-kube-api-access-4v8tx" (OuterVolumeSpecName: "kube-api-access-4v8tx") pod "5f4e99ae-15b6-4c55-924b-84ed464c130c" (UID: "5f4e99ae-15b6-4c55-924b-84ed464c130c"). InnerVolumeSpecName "kube-api-access-4v8tx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.568311 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-scripts" (OuterVolumeSpecName: "scripts") pod "5f4e99ae-15b6-4c55-924b-84ed464c130c" (UID: "5f4e99ae-15b6-4c55-924b-84ed464c130c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.572766 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-config-data" (OuterVolumeSpecName: "config-data") pod "5f4e99ae-15b6-4c55-924b-84ed464c130c" (UID: "5f4e99ae-15b6-4c55-924b-84ed464c130c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.584994 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f4e99ae-15b6-4c55-924b-84ed464c130c" (UID: "5f4e99ae-15b6-4c55-924b-84ed464c130c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.624318 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v8tx\" (UniqueName: \"kubernetes.io/projected/5f4e99ae-15b6-4c55-924b-84ed464c130c-kube-api-access-4v8tx\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.624358 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.624367 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.624375 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4e99ae-15b6-4c55-924b-84ed464c130c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:29 crc kubenswrapper[4747]: E1205 21:03:29.678729 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb96dbb40_2f15_49dc_afc0_82c16301d001.slice/crio-conmon-3d5965befac25c9837670863a3890d1db8b15184f520c9da7c335a08876ca1dd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb96dbb40_2f15_49dc_afc0_82c16301d001.slice/crio-3d5965befac25c9837670863a3890d1db8b15184f520c9da7c335a08876ca1dd.scope\": RecentStats: unable to find data in memory cache]" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.827275 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.928562 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-ovsdbserver-nb\") pod \"b96dbb40-2f15-49dc-afc0-82c16301d001\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.928662 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-dns-swift-storage-0\") pod \"b96dbb40-2f15-49dc-afc0-82c16301d001\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.928725 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-dns-svc\") pod \"b96dbb40-2f15-49dc-afc0-82c16301d001\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.931379 4747 generic.go:334] "Generic (PLEG): container finished" podID="b96dbb40-2f15-49dc-afc0-82c16301d001" containerID="3d5965befac25c9837670863a3890d1db8b15184f520c9da7c335a08876ca1dd" exitCode=0 Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.931433 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" event={"ID":"b96dbb40-2f15-49dc-afc0-82c16301d001","Type":"ContainerDied","Data":"3d5965befac25c9837670863a3890d1db8b15184f520c9da7c335a08876ca1dd"} Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.931460 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" event={"ID":"b96dbb40-2f15-49dc-afc0-82c16301d001","Type":"ContainerDied","Data":"9a5f11613bb63c2b64be019d8de1a3cf9fe38ac202de3dd2cd4ec75cc8d5763c"} Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.931476 4747 scope.go:117] "RemoveContainer" containerID="3d5965befac25c9837670863a3890d1db8b15184f520c9da7c335a08876ca1dd" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.931615 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c77d8b67c-rtlsm" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.932133 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6dzj\" (UniqueName: \"kubernetes.io/projected/b96dbb40-2f15-49dc-afc0-82c16301d001-kube-api-access-q6dzj\") pod \"b96dbb40-2f15-49dc-afc0-82c16301d001\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.932227 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-ovsdbserver-sb\") pod \"b96dbb40-2f15-49dc-afc0-82c16301d001\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.932266 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-config\") pod \"b96dbb40-2f15-49dc-afc0-82c16301d001\" (UID: \"b96dbb40-2f15-49dc-afc0-82c16301d001\") " Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.941326 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6pxgq" event={"ID":"5f4e99ae-15b6-4c55-924b-84ed464c130c","Type":"ContainerDied","Data":"18373d4114da4be143c242fe13a65769e6de83c74092675d5bb2b9fd48a40929"} Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.941376 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18373d4114da4be143c242fe13a65769e6de83c74092675d5bb2b9fd48a40929" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.941419 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6pxgq" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.942507 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b96dbb40-2f15-49dc-afc0-82c16301d001-kube-api-access-q6dzj" (OuterVolumeSpecName: "kube-api-access-q6dzj") pod "b96dbb40-2f15-49dc-afc0-82c16301d001" (UID: "b96dbb40-2f15-49dc-afc0-82c16301d001"). InnerVolumeSpecName "kube-api-access-q6dzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.947942 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6dzj\" (UniqueName: \"kubernetes.io/projected/b96dbb40-2f15-49dc-afc0-82c16301d001-kube-api-access-q6dzj\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.980359 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.993225 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-config" (OuterVolumeSpecName: "config") pod "b96dbb40-2f15-49dc-afc0-82c16301d001" (UID: "b96dbb40-2f15-49dc-afc0-82c16301d001"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:03:29 crc kubenswrapper[4747]: I1205 21:03:29.999177 4747 scope.go:117] "RemoveContainer" containerID="549bbca35b68b669a2a63c7af2ef32e95b5355f7d9072b9c515f186a60923d82" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.003132 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b96dbb40-2f15-49dc-afc0-82c16301d001" (UID: "b96dbb40-2f15-49dc-afc0-82c16301d001"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.010795 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b96dbb40-2f15-49dc-afc0-82c16301d001" (UID: "b96dbb40-2f15-49dc-afc0-82c16301d001"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.018567 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b96dbb40-2f15-49dc-afc0-82c16301d001" (UID: "b96dbb40-2f15-49dc-afc0-82c16301d001"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.021245 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b96dbb40-2f15-49dc-afc0-82c16301d001" (UID: "b96dbb40-2f15-49dc-afc0-82c16301d001"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.027749 4747 scope.go:117] "RemoveContainer" containerID="3d5965befac25c9837670863a3890d1db8b15184f520c9da7c335a08876ca1dd" Dec 05 21:03:30 crc kubenswrapper[4747]: E1205 21:03:30.029394 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d5965befac25c9837670863a3890d1db8b15184f520c9da7c335a08876ca1dd\": container with ID starting with 3d5965befac25c9837670863a3890d1db8b15184f520c9da7c335a08876ca1dd not found: ID does not exist" containerID="3d5965befac25c9837670863a3890d1db8b15184f520c9da7c335a08876ca1dd" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.029421 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d5965befac25c9837670863a3890d1db8b15184f520c9da7c335a08876ca1dd"} err="failed to get container status \"3d5965befac25c9837670863a3890d1db8b15184f520c9da7c335a08876ca1dd\": rpc error: code = NotFound desc = could not find container \"3d5965befac25c9837670863a3890d1db8b15184f520c9da7c335a08876ca1dd\": container with ID starting with 3d5965befac25c9837670863a3890d1db8b15184f520c9da7c335a08876ca1dd not found: ID does not exist" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.029442 4747 scope.go:117] "RemoveContainer" containerID="549bbca35b68b669a2a63c7af2ef32e95b5355f7d9072b9c515f186a60923d82" Dec 05 21:03:30 crc kubenswrapper[4747]: E1205 21:03:30.029999 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"549bbca35b68b669a2a63c7af2ef32e95b5355f7d9072b9c515f186a60923d82\": container with ID starting with 549bbca35b68b669a2a63c7af2ef32e95b5355f7d9072b9c515f186a60923d82 not found: ID does not exist" containerID="549bbca35b68b669a2a63c7af2ef32e95b5355f7d9072b9c515f186a60923d82" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.030020 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"549bbca35b68b669a2a63c7af2ef32e95b5355f7d9072b9c515f186a60923d82"} err="failed to get container status \"549bbca35b68b669a2a63c7af2ef32e95b5355f7d9072b9c515f186a60923d82\": rpc error: code = NotFound desc = could not find container \"549bbca35b68b669a2a63c7af2ef32e95b5355f7d9072b9c515f186a60923d82\": container with ID starting with 549bbca35b68b669a2a63c7af2ef32e95b5355f7d9072b9c515f186a60923d82 not found: ID does not exist" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.049561 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.049597 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.049606 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.049616 4747 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.049625 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b96dbb40-2f15-49dc-afc0-82c16301d001-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.107789 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.107987 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3e9f058a-5697-4b8d-b1fa-43891dae8db9" containerName="nova-api-log" containerID="cri-o://52fd2c566737d13acb4f89e34979f8b7605320b2dd6aafdd301c677ba24e34d0" gracePeriod=30 Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.108516 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3e9f058a-5697-4b8d-b1fa-43891dae8db9" containerName="nova-api-api" containerID="cri-o://833985c6d01baf79d57df06c936ee1677a721c5fee0ad59f103673a6c12943f0" gracePeriod=30 Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.116804 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3e9f058a-5697-4b8d-b1fa-43891dae8db9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.182:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.116904 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3e9f058a-5697-4b8d-b1fa-43891dae8db9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.182:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.130130 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.130327 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" containerName="nova-metadata-log" containerID="cri-o://a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526" gracePeriod=30 Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.130824 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" containerName="nova-metadata-metadata" containerID="cri-o://7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51" gracePeriod=30 Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.327244 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.327635 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.410558 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.456408 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-config-data\") pod \"0a29110d-8922-4d83-97eb-7c12b0133b8d\" (UID: \"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.456483 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-combined-ca-bundle\") pod \"0a29110d-8922-4d83-97eb-7c12b0133b8d\" (UID: \"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.456559 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-scripts\") pod \"0a29110d-8922-4d83-97eb-7c12b0133b8d\" (UID: \"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.456662 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrv5p\" (UniqueName: \"kubernetes.io/projected/0a29110d-8922-4d83-97eb-7c12b0133b8d-kube-api-access-rrv5p\") pod \"0a29110d-8922-4d83-97eb-7c12b0133b8d\" (UID: \"0a29110d-8922-4d83-97eb-7c12b0133b8d\") " Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.459375 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-rtlsm"] Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.461945 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-scripts" (OuterVolumeSpecName: "scripts") pod "0a29110d-8922-4d83-97eb-7c12b0133b8d" (UID: "0a29110d-8922-4d83-97eb-7c12b0133b8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.468964 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-rtlsm"] Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.479213 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a29110d-8922-4d83-97eb-7c12b0133b8d-kube-api-access-rrv5p" (OuterVolumeSpecName: "kube-api-access-rrv5p") pod "0a29110d-8922-4d83-97eb-7c12b0133b8d" (UID: "0a29110d-8922-4d83-97eb-7c12b0133b8d"). InnerVolumeSpecName "kube-api-access-rrv5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.493869 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-config-data" (OuterVolumeSpecName: "config-data") pod "0a29110d-8922-4d83-97eb-7c12b0133b8d" (UID: "0a29110d-8922-4d83-97eb-7c12b0133b8d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.570184 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.570220 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.570232 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrv5p\" (UniqueName: \"kubernetes.io/projected/0a29110d-8922-4d83-97eb-7c12b0133b8d-kube-api-access-rrv5p\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.571715 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.574826 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a29110d-8922-4d83-97eb-7c12b0133b8d" (UID: "0a29110d-8922-4d83-97eb-7c12b0133b8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.666840 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.671671 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a29110d-8922-4d83-97eb-7c12b0133b8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.773205 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-combined-ca-bundle\") pod \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.773267 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-config-data\") pod \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.773387 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-logs\") pod \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.773410 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4lgh\" (UniqueName: \"kubernetes.io/projected/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-kube-api-access-x4lgh\") pod \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.773465 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-nova-metadata-tls-certs\") pod \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\" (UID: \"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f\") " Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.773760 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-logs" (OuterVolumeSpecName: "logs") pod "3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" (UID: "3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.774013 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.777433 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-kube-api-access-x4lgh" (OuterVolumeSpecName: "kube-api-access-x4lgh") pod "3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" (UID: "3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f"). InnerVolumeSpecName "kube-api-access-x4lgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.798849 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" (UID: "3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.816534 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-config-data" (OuterVolumeSpecName: "config-data") pod "3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" (UID: "3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.831726 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" (UID: "3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.875292 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4lgh\" (UniqueName: \"kubernetes.io/projected/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-kube-api-access-x4lgh\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.875575 4747 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.875669 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.875745 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.955503 4747 generic.go:334] "Generic (PLEG): container finished" podID="3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" containerID="7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51" exitCode=0 Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.956955 4747 generic.go:334] "Generic (PLEG): container finished" podID="3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" containerID="a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526" exitCode=143 Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.955760 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.955665 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f","Type":"ContainerDied","Data":"7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51"} Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.957385 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f","Type":"ContainerDied","Data":"a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526"} Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.957408 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f","Type":"ContainerDied","Data":"e85751698d3ff86f4fc379320e4495255f929275196eb13b5d4cc520f4ec072a"} Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.957430 4747 scope.go:117] "RemoveContainer" containerID="7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.959769 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ccdpj" event={"ID":"0a29110d-8922-4d83-97eb-7c12b0133b8d","Type":"ContainerDied","Data":"b3f107863f1fdd6ed678c4cf393158a0dcc88133ce4afaf2f8092dc61788f7e7"} Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.959809 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3f107863f1fdd6ed678c4cf393158a0dcc88133ce4afaf2f8092dc61788f7e7" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.959916 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccdpj" Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.974403 4747 generic.go:334] "Generic (PLEG): container finished" podID="3e9f058a-5697-4b8d-b1fa-43891dae8db9" containerID="52fd2c566737d13acb4f89e34979f8b7605320b2dd6aafdd301c677ba24e34d0" exitCode=143 Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.975352 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e9f058a-5697-4b8d-b1fa-43891dae8db9","Type":"ContainerDied","Data":"52fd2c566737d13acb4f89e34979f8b7605320b2dd6aafdd301c677ba24e34d0"} Dec 05 21:03:30 crc kubenswrapper[4747]: I1205 21:03:30.986399 4747 scope.go:117] "RemoveContainer" containerID="a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.020272 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.021025 4747 scope.go:117] "RemoveContainer" containerID="7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51" Dec 05 21:03:31 crc kubenswrapper[4747]: E1205 21:03:31.023662 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51\": container with ID starting with 7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51 not found: ID does not exist" containerID="7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.023710 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51"} err="failed to get container status \"7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51\": rpc error: code = NotFound desc = could not find container \"7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51\": container with ID starting with 7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51 not found: ID does not exist" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.023737 4747 scope.go:117] "RemoveContainer" containerID="a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526" Dec 05 21:03:31 crc kubenswrapper[4747]: E1205 21:03:31.024163 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526\": container with ID starting with a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526 not found: ID does not exist" containerID="a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.024262 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526"} err="failed to get container status \"a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526\": rpc error: code = NotFound desc = could not find container \"a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526\": container with ID starting with a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526 not found: ID does not exist" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.024359 4747 scope.go:117] "RemoveContainer" 
containerID="7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.024778 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51"} err="failed to get container status \"7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51\": rpc error: code = NotFound desc = could not find container \"7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51\": container with ID starting with 7ca5a0fbf7e101d1b750ae768c32c8bc925798d7c33d222aef315656dbfe8c51 not found: ID does not exist" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.024868 4747 scope.go:117] "RemoveContainer" containerID="a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.025327 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526"} err="failed to get container status \"a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526\": rpc error: code = NotFound desc = could not find container \"a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526\": container with ID starting with a7f6ef95ca472f8e93dcf9ed5ec578b6926e12945132db6a33d079c9cf15d526 not found: ID does not exist" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.039849 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.049751 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:03:31 crc kubenswrapper[4747]: E1205 21:03:31.050368 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" containerName="nova-metadata-metadata" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.050444 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" containerName="nova-metadata-metadata" Dec 05 21:03:31 crc kubenswrapper[4747]: E1205 21:03:31.050535 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f4e99ae-15b6-4c55-924b-84ed464c130c" containerName="nova-manage" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.050604 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f4e99ae-15b6-4c55-924b-84ed464c130c" containerName="nova-manage" Dec 05 21:03:31 crc kubenswrapper[4747]: E1205 21:03:31.050671 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" containerName="nova-metadata-log" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.050732 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" containerName="nova-metadata-log" Dec 05 21:03:31 crc kubenswrapper[4747]: E1205 21:03:31.050803 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a29110d-8922-4d83-97eb-7c12b0133b8d" containerName="nova-cell1-conductor-db-sync" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.050855 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a29110d-8922-4d83-97eb-7c12b0133b8d" containerName="nova-cell1-conductor-db-sync" Dec 05 21:03:31 crc kubenswrapper[4747]: E1205 21:03:31.050940 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b96dbb40-2f15-49dc-afc0-82c16301d001" 
containerName="dnsmasq-dns" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.050997 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b96dbb40-2f15-49dc-afc0-82c16301d001" containerName="dnsmasq-dns" Dec 05 21:03:31 crc kubenswrapper[4747]: E1205 21:03:31.051059 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b96dbb40-2f15-49dc-afc0-82c16301d001" containerName="init" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.051121 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b96dbb40-2f15-49dc-afc0-82c16301d001" containerName="init" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.051359 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b96dbb40-2f15-49dc-afc0-82c16301d001" containerName="dnsmasq-dns" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.051423 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" containerName="nova-metadata-log" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.051483 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" containerName="nova-metadata-metadata" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.051558 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f4e99ae-15b6-4c55-924b-84ed464c130c" containerName="nova-manage" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.051646 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a29110d-8922-4d83-97eb-7c12b0133b8d" containerName="nova-cell1-conductor-db-sync" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.052710 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.057872 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.057874 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.060449 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.085349 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.086508 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.090158 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.116073 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.186190 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.186586 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-logs\") pod \"nova-metadata-0\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.186617 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.186666 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-config-data\") pod \"nova-metadata-0\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.186714 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgff4\" (UniqueName: \"kubernetes.io/projected/ee758f70-0c00-471e-85bf-2d4a96646d15-kube-api-access-dgff4\") pod \"nova-cell1-conductor-0\" (UID: \"ee758f70-0c00-471e-85bf-2d4a96646d15\") " pod="openstack/nova-cell1-conductor-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.186785 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee758f70-0c00-471e-85bf-2d4a96646d15-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ee758f70-0c00-471e-85bf-2d4a96646d15\") " pod="openstack/nova-cell1-conductor-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.186867 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgff4\" (UniqueName: \"kubernetes.io/projected/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-kube-api-access-hgff4\") pod \"nova-metadata-0\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.186920 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee758f70-0c00-471e-85bf-2d4a96646d15-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ee758f70-0c00-471e-85bf-2d4a96646d15\") " pod="openstack/nova-cell1-conductor-0" Dec 05 21:03:31 crc kubenswrapper[4747]: 
I1205 21:03:31.288338 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-config-data\") pod \"nova-metadata-0\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.288417 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgff4\" (UniqueName: \"kubernetes.io/projected/ee758f70-0c00-471e-85bf-2d4a96646d15-kube-api-access-dgff4\") pod \"nova-cell1-conductor-0\" (UID: \"ee758f70-0c00-471e-85bf-2d4a96646d15\") " pod="openstack/nova-cell1-conductor-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.288441 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee758f70-0c00-471e-85bf-2d4a96646d15-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ee758f70-0c00-471e-85bf-2d4a96646d15\") " pod="openstack/nova-cell1-conductor-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.288498 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgff4\" (UniqueName: \"kubernetes.io/projected/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-kube-api-access-hgff4\") pod \"nova-metadata-0\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.288521 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee758f70-0c00-471e-85bf-2d4a96646d15-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ee758f70-0c00-471e-85bf-2d4a96646d15\") " pod="openstack/nova-cell1-conductor-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.288572 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.288643 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-logs\") pod \"nova-metadata-0\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.288664 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.290573 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-logs\") pod \"nova-metadata-0\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.293315 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.293966 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-config-data\") pod \"nova-metadata-0\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.294088 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee758f70-0c00-471e-85bf-2d4a96646d15-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ee758f70-0c00-471e-85bf-2d4a96646d15\") " pod="openstack/nova-cell1-conductor-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.294163 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee758f70-0c00-471e-85bf-2d4a96646d15-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ee758f70-0c00-471e-85bf-2d4a96646d15\") " pod="openstack/nova-cell1-conductor-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.298132 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.307048 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgff4\" (UniqueName: \"kubernetes.io/projected/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-kube-api-access-hgff4\") pod \"nova-metadata-0\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.307052 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgff4\" (UniqueName: \"kubernetes.io/projected/ee758f70-0c00-471e-85bf-2d4a96646d15-kube-api-access-dgff4\") pod \"nova-cell1-conductor-0\" (UID: \"ee758f70-0c00-471e-85bf-2d4a96646d15\") " pod="openstack/nova-cell1-conductor-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.371668 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.421720 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.849902 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f" path="/var/lib/kubelet/pods/3ded2cf2-c0f6-47b7-b1bb-7efd9f88438f/volumes" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.850481 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b96dbb40-2f15-49dc-afc0-82c16301d001" path="/var/lib/kubelet/pods/b96dbb40-2f15-49dc-afc0-82c16301d001/volumes" Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.919415 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.928377 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 21:03:31 crc kubenswrapper[4747]: W1205 21:03:31.931946 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee758f70_0c00_471e_85bf_2d4a96646d15.slice/crio-1ab9cd12be84961b02a69bd0db8327f1366119f08032f65fa294552f7ceb9d94 WatchSource:0}: Error finding container 1ab9cd12be84961b02a69bd0db8327f1366119f08032f65fa294552f7ceb9d94: Status 404 returned error can't find the container with id 1ab9cd12be84961b02a69bd0db8327f1366119f08032f65fa294552f7ceb9d94 Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.991682 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da","Type":"ContainerStarted","Data":"adb12b3f142882492ef7d6254b98a09d631553794082f9009e504b1076967ac2"} Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.997229 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ee758f70-0c00-471e-85bf-2d4a96646d15","Type":"ContainerStarted","Data":"1ab9cd12be84961b02a69bd0db8327f1366119f08032f65fa294552f7ceb9d94"} Dec 05 21:03:31 crc kubenswrapper[4747]: I1205 21:03:31.997343 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b9d1684b-fc1a-4706-96ff-e6efec943747" containerName="nova-scheduler-scheduler" containerID="cri-o://4c1d5fe8c1b11df6b6aa3ce3695865857e2c986a0baadfd8120d77e27f5e70f6" gracePeriod=30 Dec 05 21:03:33 crc kubenswrapper[4747]: I1205 21:03:33.009688 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da","Type":"ContainerStarted","Data":"102a058923a0d74f8a7d122621d1bf3d609e270bfaddd20e89feb6f1674c5fad"} Dec 05 21:03:33 crc kubenswrapper[4747]: I1205 21:03:33.011273 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da","Type":"ContainerStarted","Data":"46730cdd52d8afe7d6574648638970a2e8a396b15f1ecd335f0f402b9616d01a"} Dec 05 21:03:33 crc kubenswrapper[4747]: I1205 21:03:33.013323 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ee758f70-0c00-471e-85bf-2d4a96646d15","Type":"ContainerStarted","Data":"bf9fa6b0cf24ff77f4a354a1cad76271ec89c26ec39baabf45982f119b448d4e"} Dec 05 21:03:33 crc kubenswrapper[4747]: I1205 21:03:33.013512 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 05 21:03:33 crc kubenswrapper[4747]: I1205 21:03:33.054475 4747 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.054455305 podStartE2EDuration="2.054455305s" podCreationTimestamp="2025-12-05 21:03:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:03:33.053736097 +0000 UTC m=+1283.521043585" watchObservedRunningTime="2025-12-05 21:03:33.054455305 +0000 UTC m=+1283.521762793" Dec 05 21:03:33 crc kubenswrapper[4747]: I1205 21:03:33.055263 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.055257365 podStartE2EDuration="2.055257365s" podCreationTimestamp="2025-12-05 21:03:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:03:33.037281264 +0000 UTC m=+1283.504588752" watchObservedRunningTime="2025-12-05 21:03:33.055257365 +0000 UTC m=+1283.522564843" Dec 05 21:03:34 crc kubenswrapper[4747]: E1205 21:03:34.230144 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c1d5fe8c1b11df6b6aa3ce3695865857e2c986a0baadfd8120d77e27f5e70f6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 21:03:34 crc kubenswrapper[4747]: E1205 21:03:34.234449 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c1d5fe8c1b11df6b6aa3ce3695865857e2c986a0baadfd8120d77e27f5e70f6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 21:03:34 crc kubenswrapper[4747]: E1205 21:03:34.236760 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c1d5fe8c1b11df6b6aa3ce3695865857e2c986a0baadfd8120d77e27f5e70f6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 21:03:34 crc kubenswrapper[4747]: E1205 21:03:34.236949 4747 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b9d1684b-fc1a-4706-96ff-e6efec943747" containerName="nova-scheduler-scheduler" Dec 05 21:03:35 crc kubenswrapper[4747]: I1205 21:03:35.031430 4747 generic.go:334] "Generic (PLEG): container finished" podID="b9d1684b-fc1a-4706-96ff-e6efec943747" containerID="4c1d5fe8c1b11df6b6aa3ce3695865857e2c986a0baadfd8120d77e27f5e70f6" exitCode=0 Dec 05 21:03:35 crc kubenswrapper[4747]: I1205 21:03:35.031521 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b9d1684b-fc1a-4706-96ff-e6efec943747","Type":"ContainerDied","Data":"4c1d5fe8c1b11df6b6aa3ce3695865857e2c986a0baadfd8120d77e27f5e70f6"} Dec 05 21:03:35 crc kubenswrapper[4747]: I1205 21:03:35.031929 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b9d1684b-fc1a-4706-96ff-e6efec943747","Type":"ContainerDied","Data":"19fb2a3c0e9d8344f8f51b98b8cbce8840516c03efcf57a91e9129947d2ff60a"} Dec 05 21:03:35 crc kubenswrapper[4747]: I1205 21:03:35.031943 4747 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19fb2a3c0e9d8344f8f51b98b8cbce8840516c03efcf57a91e9129947d2ff60a" Dec 05 21:03:35 crc kubenswrapper[4747]: I1205 21:03:35.075169 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 21:03:35 crc kubenswrapper[4747]: I1205 21:03:35.164347 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9d1684b-fc1a-4706-96ff-e6efec943747-config-data\") pod \"b9d1684b-fc1a-4706-96ff-e6efec943747\" (UID: \"b9d1684b-fc1a-4706-96ff-e6efec943747\") " Dec 05 21:03:35 crc kubenswrapper[4747]: I1205 21:03:35.164421 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9d1684b-fc1a-4706-96ff-e6efec943747-combined-ca-bundle\") pod \"b9d1684b-fc1a-4706-96ff-e6efec943747\" (UID: \"b9d1684b-fc1a-4706-96ff-e6efec943747\") " Dec 05 21:03:35 crc kubenswrapper[4747]: I1205 21:03:35.164571 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtfcx\" (UniqueName: \"kubernetes.io/projected/b9d1684b-fc1a-4706-96ff-e6efec943747-kube-api-access-qtfcx\") pod \"b9d1684b-fc1a-4706-96ff-e6efec943747\" (UID: \"b9d1684b-fc1a-4706-96ff-e6efec943747\") " Dec 05 21:03:35 crc kubenswrapper[4747]: I1205 21:03:35.169101 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9d1684b-fc1a-4706-96ff-e6efec943747-kube-api-access-qtfcx" (OuterVolumeSpecName: "kube-api-access-qtfcx") pod "b9d1684b-fc1a-4706-96ff-e6efec943747" (UID: "b9d1684b-fc1a-4706-96ff-e6efec943747"). InnerVolumeSpecName "kube-api-access-qtfcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:03:35 crc kubenswrapper[4747]: I1205 21:03:35.193059 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9d1684b-fc1a-4706-96ff-e6efec943747-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9d1684b-fc1a-4706-96ff-e6efec943747" (UID: "b9d1684b-fc1a-4706-96ff-e6efec943747"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:35 crc kubenswrapper[4747]: I1205 21:03:35.199418 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9d1684b-fc1a-4706-96ff-e6efec943747-config-data" (OuterVolumeSpecName: "config-data") pod "b9d1684b-fc1a-4706-96ff-e6efec943747" (UID: "b9d1684b-fc1a-4706-96ff-e6efec943747"). InnerVolumeSpecName "config-data". 
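The exitCode values in the PLEG "container finished" lines follow the usual 128+signal convention: the 143 seen for nova-api-0's container at 21:03:30 is 128 + SIGTERM (a graceful stop, like the gracePeriod=30 kill issued to the scheduler), the scheduler's 0 is a clean shutdown within its grace period, and the 137 that appears later for the novncproxy container is 128 + SIGKILL. A small sketch decoding the three codes seen in this log:

```python
# Decode container exit codes per the shell convention: 128 + signal number.
import signal

def describe_exit(code: int) -> str:
    if code == 0:
        return "clean exit"
    if code > 128:
        return f"terminated by {signal.Signals(code - 128).name}"
    return f"application error ({code})"

for code in (143, 0, 137):              # the three exitCode values in this log
    print(code, "->", describe_exit(code))
# 143 -> terminated by SIGTERM, 0 -> clean exit, 137 -> terminated by SIGKILL
```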
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:35 crc kubenswrapper[4747]: I1205 21:03:35.266696 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtfcx\" (UniqueName: \"kubernetes.io/projected/b9d1684b-fc1a-4706-96ff-e6efec943747-kube-api-access-qtfcx\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:35 crc kubenswrapper[4747]: I1205 21:03:35.266727 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9d1684b-fc1a-4706-96ff-e6efec943747-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:35 crc kubenswrapper[4747]: I1205 21:03:35.266736 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9d1684b-fc1a-4706-96ff-e6efec943747-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:35 crc kubenswrapper[4747]: I1205 21:03:35.963861 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.049344 4747 generic.go:334] "Generic (PLEG): container finished" podID="3e9f058a-5697-4b8d-b1fa-43891dae8db9" containerID="833985c6d01baf79d57df06c936ee1677a721c5fee0ad59f103673a6c12943f0" exitCode=0 Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.049386 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e9f058a-5697-4b8d-b1fa-43891dae8db9","Type":"ContainerDied","Data":"833985c6d01baf79d57df06c936ee1677a721c5fee0ad59f103673a6c12943f0"} Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.049424 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.049432 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e9f058a-5697-4b8d-b1fa-43891dae8db9","Type":"ContainerDied","Data":"a509e4c6f45ce3ded62d3a2b7c17c813fdbf932b4032bb1522d92ac0ebe8a563"} Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.049454 4747 scope.go:117] "RemoveContainer" containerID="833985c6d01baf79d57df06c936ee1677a721c5fee0ad59f103673a6c12943f0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.049456 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.082578 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e9f058a-5697-4b8d-b1fa-43891dae8db9-logs\") pod \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.082972 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9f058a-5697-4b8d-b1fa-43891dae8db9-config-data\") pod \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.083151 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jsjd\" (UniqueName: \"kubernetes.io/projected/3e9f058a-5697-4b8d-b1fa-43891dae8db9-kube-api-access-9jsjd\") pod \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.083339 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9f058a-5697-4b8d-b1fa-43891dae8db9-combined-ca-bundle\") pod \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\" (UID: \"3e9f058a-5697-4b8d-b1fa-43891dae8db9\") " Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.083399 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e9f058a-5697-4b8d-b1fa-43891dae8db9-logs" (OuterVolumeSpecName: "logs") pod "3e9f058a-5697-4b8d-b1fa-43891dae8db9" (UID: "3e9f058a-5697-4b8d-b1fa-43891dae8db9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.089145 4747 scope.go:117] "RemoveContainer" containerID="52fd2c566737d13acb4f89e34979f8b7605320b2dd6aafdd301c677ba24e34d0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.095837 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e9f058a-5697-4b8d-b1fa-43891dae8db9-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.097247 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e9f058a-5697-4b8d-b1fa-43891dae8db9-kube-api-access-9jsjd" (OuterVolumeSpecName: "kube-api-access-9jsjd") pod "3e9f058a-5697-4b8d-b1fa-43891dae8db9" (UID: "3e9f058a-5697-4b8d-b1fa-43891dae8db9"). InnerVolumeSpecName "kube-api-access-9jsjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.098107 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.114635 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.127332 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e9f058a-5697-4b8d-b1fa-43891dae8db9-config-data" (OuterVolumeSpecName: "config-data") pod "3e9f058a-5697-4b8d-b1fa-43891dae8db9" (UID: "3e9f058a-5697-4b8d-b1fa-43891dae8db9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.133634 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:03:36 crc kubenswrapper[4747]: E1205 21:03:36.134253 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e9f058a-5697-4b8d-b1fa-43891dae8db9" containerName="nova-api-api" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.134283 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e9f058a-5697-4b8d-b1fa-43891dae8db9" containerName="nova-api-api" Dec 05 21:03:36 crc kubenswrapper[4747]: E1205 21:03:36.134320 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9d1684b-fc1a-4706-96ff-e6efec943747" containerName="nova-scheduler-scheduler" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.134331 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9d1684b-fc1a-4706-96ff-e6efec943747" containerName="nova-scheduler-scheduler" Dec 05 21:03:36 crc kubenswrapper[4747]: E1205 21:03:36.134351 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e9f058a-5697-4b8d-b1fa-43891dae8db9" containerName="nova-api-log" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.134362 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e9f058a-5697-4b8d-b1fa-43891dae8db9" containerName="nova-api-log" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.134710 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e9f058a-5697-4b8d-b1fa-43891dae8db9" containerName="nova-api-log" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.134741 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9d1684b-fc1a-4706-96ff-e6efec943747" containerName="nova-scheduler-scheduler" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.134770 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e9f058a-5697-4b8d-b1fa-43891dae8db9" containerName="nova-api-api" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.135765 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.138423 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.141047 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e9f058a-5697-4b8d-b1fa-43891dae8db9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e9f058a-5697-4b8d-b1fa-43891dae8db9" (UID: "3e9f058a-5697-4b8d-b1fa-43891dae8db9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.142160 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.198239 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9f058a-5697-4b8d-b1fa-43891dae8db9-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.198270 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jsjd\" (UniqueName: \"kubernetes.io/projected/3e9f058a-5697-4b8d-b1fa-43891dae8db9-kube-api-access-9jsjd\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.198285 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9f058a-5697-4b8d-b1fa-43891dae8db9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.200671 4747 scope.go:117] "RemoveContainer" containerID="833985c6d01baf79d57df06c936ee1677a721c5fee0ad59f103673a6c12943f0" Dec 05 21:03:36 crc kubenswrapper[4747]: E1205 21:03:36.201085 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"833985c6d01baf79d57df06c936ee1677a721c5fee0ad59f103673a6c12943f0\": container with ID starting with 833985c6d01baf79d57df06c936ee1677a721c5fee0ad59f103673a6c12943f0 not found: ID does not exist" containerID="833985c6d01baf79d57df06c936ee1677a721c5fee0ad59f103673a6c12943f0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.201122 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"833985c6d01baf79d57df06c936ee1677a721c5fee0ad59f103673a6c12943f0"} err="failed to get container status \"833985c6d01baf79d57df06c936ee1677a721c5fee0ad59f103673a6c12943f0\": rpc error: code = NotFound desc = could not find container \"833985c6d01baf79d57df06c936ee1677a721c5fee0ad59f103673a6c12943f0\": container with ID starting with 833985c6d01baf79d57df06c936ee1677a721c5fee0ad59f103673a6c12943f0 not found: ID does not exist" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.201146 4747 scope.go:117] "RemoveContainer" containerID="52fd2c566737d13acb4f89e34979f8b7605320b2dd6aafdd301c677ba24e34d0" Dec 05 21:03:36 crc kubenswrapper[4747]: E1205 21:03:36.201536 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52fd2c566737d13acb4f89e34979f8b7605320b2dd6aafdd301c677ba24e34d0\": container with ID starting with 52fd2c566737d13acb4f89e34979f8b7605320b2dd6aafdd301c677ba24e34d0 not found: ID does not exist" containerID="52fd2c566737d13acb4f89e34979f8b7605320b2dd6aafdd301c677ba24e34d0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.201563 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52fd2c566737d13acb4f89e34979f8b7605320b2dd6aafdd301c677ba24e34d0"} err="failed to get container status \"52fd2c566737d13acb4f89e34979f8b7605320b2dd6aafdd301c677ba24e34d0\": rpc error: code = NotFound desc = could not find container \"52fd2c566737d13acb4f89e34979f8b7605320b2dd6aafdd301c677ba24e34d0\": container with ID starting with 52fd2c566737d13acb4f89e34979f8b7605320b2dd6aafdd301c677ba24e34d0 not found: ID does not exist" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 
21:03:36.300930 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-622s8\" (UniqueName: \"kubernetes.io/projected/7f2e4720-4c72-4ed6-a648-2d59f09d3137-kube-api-access-622s8\") pod \"nova-scheduler-0\" (UID: \"7f2e4720-4c72-4ed6-a648-2d59f09d3137\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.301198 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f2e4720-4c72-4ed6-a648-2d59f09d3137-config-data\") pod \"nova-scheduler-0\" (UID: \"7f2e4720-4c72-4ed6-a648-2d59f09d3137\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.301340 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f2e4720-4c72-4ed6-a648-2d59f09d3137-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7f2e4720-4c72-4ed6-a648-2d59f09d3137\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.372521 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.372659 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.397864 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.404061 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f2e4720-4c72-4ed6-a648-2d59f09d3137-config-data\") pod \"nova-scheduler-0\" (UID: \"7f2e4720-4c72-4ed6-a648-2d59f09d3137\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.404222 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f2e4720-4c72-4ed6-a648-2d59f09d3137-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7f2e4720-4c72-4ed6-a648-2d59f09d3137\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.404353 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-622s8\" (UniqueName: \"kubernetes.io/projected/7f2e4720-4c72-4ed6-a648-2d59f09d3137-kube-api-access-622s8\") pod \"nova-scheduler-0\" (UID: \"7f2e4720-4c72-4ed6-a648-2d59f09d3137\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.410721 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f2e4720-4c72-4ed6-a648-2d59f09d3137-config-data\") pod \"nova-scheduler-0\" (UID: \"7f2e4720-4c72-4ed6-a648-2d59f09d3137\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.414255 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f2e4720-4c72-4ed6-a648-2d59f09d3137-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7f2e4720-4c72-4ed6-a648-2d59f09d3137\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.422057 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 
05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.433332 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.456688 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.457044 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.484862 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.488519 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-622s8\" (UniqueName: \"kubernetes.io/projected/7f2e4720-4c72-4ed6-a648-2d59f09d3137-kube-api-access-622s8\") pod \"nova-scheduler-0\" (UID: \"7f2e4720-4c72-4ed6-a648-2d59f09d3137\") " pod="openstack/nova-scheduler-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.499093 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.611898 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dea339e9-bd2c-405e-8d52-49238be6a72b-logs\") pod \"nova-api-0\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.611944 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdn9z\" (UniqueName: \"kubernetes.io/projected/dea339e9-bd2c-405e-8d52-49238be6a72b-kube-api-access-pdn9z\") pod \"nova-api-0\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.612022 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea339e9-bd2c-405e-8d52-49238be6a72b-config-data\") pod \"nova-api-0\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.612077 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea339e9-bd2c-405e-8d52-49238be6a72b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.715183 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dea339e9-bd2c-405e-8d52-49238be6a72b-logs\") pod \"nova-api-0\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.715927 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dea339e9-bd2c-405e-8d52-49238be6a72b-logs\") pod \"nova-api-0\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.717020 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdn9z\" (UniqueName: 
\"kubernetes.io/projected/dea339e9-bd2c-405e-8d52-49238be6a72b-kube-api-access-pdn9z\") pod \"nova-api-0\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.717100 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea339e9-bd2c-405e-8d52-49238be6a72b-config-data\") pod \"nova-api-0\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.719466 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea339e9-bd2c-405e-8d52-49238be6a72b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.722700 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea339e9-bd2c-405e-8d52-49238be6a72b-config-data\") pod \"nova-api-0\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.727992 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea339e9-bd2c-405e-8d52-49238be6a72b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.735986 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdn9z\" (UniqueName: \"kubernetes.io/projected/dea339e9-bd2c-405e-8d52-49238be6a72b-kube-api-access-pdn9z\") pod \"nova-api-0\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.812115 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:03:36 crc kubenswrapper[4747]: I1205 21:03:36.950347 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:03:36 crc kubenswrapper[4747]: W1205 21:03:36.959708 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f2e4720_4c72_4ed6_a648_2d59f09d3137.slice/crio-e385e14d19b70518d6fc8f31c2c66da16d4457342929167e57042c948cac47aa WatchSource:0}: Error finding container e385e14d19b70518d6fc8f31c2c66da16d4457342929167e57042c948cac47aa: Status 404 returned error can't find the container with id e385e14d19b70518d6fc8f31c2c66da16d4457342929167e57042c948cac47aa Dec 05 21:03:37 crc kubenswrapper[4747]: I1205 21:03:37.062684 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7f2e4720-4c72-4ed6-a648-2d59f09d3137","Type":"ContainerStarted","Data":"e385e14d19b70518d6fc8f31c2c66da16d4457342929167e57042c948cac47aa"} Dec 05 21:03:37 crc kubenswrapper[4747]: I1205 21:03:37.272068 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:03:37 crc kubenswrapper[4747]: W1205 21:03:37.274734 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddea339e9_bd2c_405e_8d52_49238be6a72b.slice/crio-2793580de01cacd9543fb213d66f1a90b0be3b4909d33bb2a486302dedc605b1 WatchSource:0}: Error finding container 2793580de01cacd9543fb213d66f1a90b0be3b4909d33bb2a486302dedc605b1: Status 404 returned error can't find the container with id 2793580de01cacd9543fb213d66f1a90b0be3b4909d33bb2a486302dedc605b1 Dec 05 21:03:37 crc kubenswrapper[4747]: I1205 21:03:37.851672 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e9f058a-5697-4b8d-b1fa-43891dae8db9" path="/var/lib/kubelet/pods/3e9f058a-5697-4b8d-b1fa-43891dae8db9/volumes" Dec 05 21:03:37 crc kubenswrapper[4747]: I1205 21:03:37.852672 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9d1684b-fc1a-4706-96ff-e6efec943747" path="/var/lib/kubelet/pods/b9d1684b-fc1a-4706-96ff-e6efec943747/volumes" Dec 05 21:03:38 crc kubenswrapper[4747]: I1205 21:03:38.078833 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7f2e4720-4c72-4ed6-a648-2d59f09d3137","Type":"ContainerStarted","Data":"8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a"} Dec 05 21:03:38 crc kubenswrapper[4747]: I1205 21:03:38.082274 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dea339e9-bd2c-405e-8d52-49238be6a72b","Type":"ContainerStarted","Data":"36e24cba429af81f4d604dcf109efd617f2beab641c8de30d112cfd3e1243075"} Dec 05 21:03:38 crc kubenswrapper[4747]: I1205 21:03:38.082429 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dea339e9-bd2c-405e-8d52-49238be6a72b","Type":"ContainerStarted","Data":"4e4cff9b3bacdbf7f3acc1916d2ff5625f677d7e4fb5faff3cb25171f9744d40"} Dec 05 21:03:38 crc kubenswrapper[4747]: I1205 21:03:38.082492 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dea339e9-bd2c-405e-8d52-49238be6a72b","Type":"ContainerStarted","Data":"2793580de01cacd9543fb213d66f1a90b0be3b4909d33bb2a486302dedc605b1"} Dec 05 21:03:38 crc kubenswrapper[4747]: I1205 21:03:38.102843 4747 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.102820006 podStartE2EDuration="2.102820006s" podCreationTimestamp="2025-12-05 21:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:03:38.097941114 +0000 UTC m=+1288.565248632" watchObservedRunningTime="2025-12-05 21:03:38.102820006 +0000 UTC m=+1288.570127494" Dec 05 21:03:38 crc kubenswrapper[4747]: I1205 21:03:38.125261 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.125230548 podStartE2EDuration="2.125230548s" podCreationTimestamp="2025-12-05 21:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:03:38.123257829 +0000 UTC m=+1288.590565357" watchObservedRunningTime="2025-12-05 21:03:38.125230548 +0000 UTC m=+1288.592538076" Dec 05 21:03:41 crc kubenswrapper[4747]: I1205 21:03:41.372318 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 21:03:41 crc kubenswrapper[4747]: I1205 21:03:41.372965 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 21:03:41 crc kubenswrapper[4747]: I1205 21:03:41.467802 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 05 21:03:41 crc kubenswrapper[4747]: I1205 21:03:41.499909 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 21:03:42 crc kubenswrapper[4747]: I1205 21:03:42.393809 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 21:03:42 crc kubenswrapper[4747]: I1205 21:03:42.393814 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 21:03:46 crc kubenswrapper[4747]: I1205 21:03:46.499862 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 21:03:46 crc kubenswrapper[4747]: I1205 21:03:46.578245 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 21:03:46 crc kubenswrapper[4747]: I1205 21:03:46.813344 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 21:03:46 crc kubenswrapper[4747]: I1205 21:03:46.813412 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 21:03:47 crc kubenswrapper[4747]: I1205 21:03:47.224303 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 21:03:47 crc kubenswrapper[4747]: I1205 21:03:47.895809 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dea339e9-bd2c-405e-8d52-49238be6a72b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 21:03:47 crc kubenswrapper[4747]: I1205 21:03:47.895824 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dea339e9-bd2c-405e-8d52-49238be6a72b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 21:03:51 crc kubenswrapper[4747]: I1205 21:03:51.379627 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 21:03:51 crc kubenswrapper[4747]: I1205 21:03:51.379816 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 21:03:51 crc kubenswrapper[4747]: I1205 21:03:51.386473 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 21:03:51 crc kubenswrapper[4747]: I1205 21:03:51.388215 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 21:03:54 crc kubenswrapper[4747]: I1205 21:03:54.252937 4747 generic.go:334] "Generic (PLEG): container finished" podID="119c3a74-9260-40e6-930e-bff0b0e11929" containerID="75a46842458298c8bf94988e0c021d5d5ec42c0d6db67a9616072888144302eb" exitCode=137 Dec 05 21:03:54 crc kubenswrapper[4747]: I1205 21:03:54.253052 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"119c3a74-9260-40e6-930e-bff0b0e11929","Type":"ContainerDied","Data":"75a46842458298c8bf94988e0c021d5d5ec42c0d6db67a9616072888144302eb"} Dec 05 21:03:54 crc kubenswrapper[4747]: I1205 21:03:54.812192 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.006981 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9jsn\" (UniqueName: \"kubernetes.io/projected/119c3a74-9260-40e6-930e-bff0b0e11929-kube-api-access-l9jsn\") pod \"119c3a74-9260-40e6-930e-bff0b0e11929\" (UID: \"119c3a74-9260-40e6-930e-bff0b0e11929\") " Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.007146 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/119c3a74-9260-40e6-930e-bff0b0e11929-combined-ca-bundle\") pod \"119c3a74-9260-40e6-930e-bff0b0e11929\" (UID: \"119c3a74-9260-40e6-930e-bff0b0e11929\") " Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.007271 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/119c3a74-9260-40e6-930e-bff0b0e11929-config-data\") pod \"119c3a74-9260-40e6-930e-bff0b0e11929\" (UID: \"119c3a74-9260-40e6-930e-bff0b0e11929\") " Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.013248 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/119c3a74-9260-40e6-930e-bff0b0e11929-kube-api-access-l9jsn" (OuterVolumeSpecName: "kube-api-access-l9jsn") pod "119c3a74-9260-40e6-930e-bff0b0e11929" (UID: "119c3a74-9260-40e6-930e-bff0b0e11929"). InnerVolumeSpecName "kube-api-access-l9jsn". 
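The prober.go "Probe failed" lines in this stretch are startup probes timing out against the freshly started nova-metadata-0 and nova-api-0 endpoints; both pods flip to status="started" and "ready" a few seconds later. A minimal sketch (input file assumed) that summarizes failures by pod, container, and probe type:

```python
# Summarize "Probe failed" lines by (pod, container, probe type).
import re
from collections import Counter

PROBE = re.compile(r'"Probe failed" probeType="(?P<type>\w+)" '
                   r'pod="(?P<pod>[^"]+)" podUID="[^"]+" '
                   r'containerName="(?P<ctr>[^"]+)"')

def failed_probes(text):
    return Counter((m.group("pod"), m.group("ctr"), m.group("type"))
                   for m in PROBE.finditer(text))

with open("kubelet.log") as f:          # assumed journal excerpt
    for (pod, ctr, ptype), n in sorted(failed_probes(f.read()).items()):
        print(f"{ptype} probe on {pod}/{ctr}: {n} failure(s)")
```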
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.039987 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/119c3a74-9260-40e6-930e-bff0b0e11929-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "119c3a74-9260-40e6-930e-bff0b0e11929" (UID: "119c3a74-9260-40e6-930e-bff0b0e11929"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.044842 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/119c3a74-9260-40e6-930e-bff0b0e11929-config-data" (OuterVolumeSpecName: "config-data") pod "119c3a74-9260-40e6-930e-bff0b0e11929" (UID: "119c3a74-9260-40e6-930e-bff0b0e11929"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.110974 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9jsn\" (UniqueName: \"kubernetes.io/projected/119c3a74-9260-40e6-930e-bff0b0e11929-kube-api-access-l9jsn\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.111057 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/119c3a74-9260-40e6-930e-bff0b0e11929-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.111089 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/119c3a74-9260-40e6-930e-bff0b0e11929-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.266945 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"119c3a74-9260-40e6-930e-bff0b0e11929","Type":"ContainerDied","Data":"9de16d1c3ccc1a8816f3e40a4d413b6d721547712439fb8cac4a136b1d68cbde"} Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.267009 4747 scope.go:117] "RemoveContainer" containerID="75a46842458298c8bf94988e0c021d5d5ec42c0d6db67a9616072888144302eb" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.267027 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.345554 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.358079 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.368890 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 21:03:55 crc kubenswrapper[4747]: E1205 21:03:55.369468 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="119c3a74-9260-40e6-930e-bff0b0e11929" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.369494 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="119c3a74-9260-40e6-930e-bff0b0e11929" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.369828 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="119c3a74-9260-40e6-930e-bff0b0e11929" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.371015 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.374337 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.374575 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.374747 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.382378 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.519646 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qszzt\" (UniqueName: \"kubernetes.io/projected/82c51dab-4948-4ca3-94ba-c25cb3a4e280-kube-api-access-qszzt\") pod \"nova-cell1-novncproxy-0\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.519763 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.519783 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.519895 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-combined-ca-bundle\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.519931 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.622040 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.622123 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.622166 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qszzt\" (UniqueName: \"kubernetes.io/projected/82c51dab-4948-4ca3-94ba-c25cb3a4e280-kube-api-access-qszzt\") pod \"nova-cell1-novncproxy-0\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.622315 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.622348 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.627375 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.627954 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.628436 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.629822 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.657126 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qszzt\" (UniqueName: \"kubernetes.io/projected/82c51dab-4948-4ca3-94ba-c25cb3a4e280-kube-api-access-qszzt\") pod \"nova-cell1-novncproxy-0\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.698801 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:03:55 crc kubenswrapper[4747]: I1205 21:03:55.853947 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="119c3a74-9260-40e6-930e-bff0b0e11929" path="/var/lib/kubelet/pods/119c3a74-9260-40e6-930e-bff0b0e11929/volumes" Dec 05 21:03:56 crc kubenswrapper[4747]: I1205 21:03:56.231567 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 21:03:56 crc kubenswrapper[4747]: I1205 21:03:56.277460 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"82c51dab-4948-4ca3-94ba-c25cb3a4e280","Type":"ContainerStarted","Data":"5aec99b6d9d29c377e6e7a9c1dc6f8b9354c00cfae8bf46f28d427c0337cae99"} Dec 05 21:03:56 crc kubenswrapper[4747]: I1205 21:03:56.818041 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 21:03:56 crc kubenswrapper[4747]: I1205 21:03:56.818689 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 21:03:56 crc kubenswrapper[4747]: I1205 21:03:56.822730 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 21:03:56 crc kubenswrapper[4747]: I1205 21:03:56.823891 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.292531 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"82c51dab-4948-4ca3-94ba-c25cb3a4e280","Type":"ContainerStarted","Data":"f2f08789b47d9395ef4e7b2716ddbb8bc89f26728c733a0c9efe56c15ad7b799"} Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.292609 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.297451 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.335643 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.33556018 podStartE2EDuration="2.33556018s" podCreationTimestamp="2025-12-05 21:03:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:03:57.326130667 +0000 UTC m=+1307.793438165" watchObservedRunningTime="2025-12-05 21:03:57.33556018 +0000 UTC 
m=+1307.802867678" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.527574 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-8wqh8"] Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.529168 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.542314 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-8wqh8"] Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.661048 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.661108 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfmrp\" (UniqueName: \"kubernetes.io/projected/bf40d0cc-0d14-4c55-986d-2809df27c4fd-kube-api-access-lfmrp\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.661145 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-dns-svc\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.661193 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-config\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.661381 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.661533 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.763213 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfmrp\" (UniqueName: \"kubernetes.io/projected/bf40d0cc-0d14-4c55-986d-2809df27c4fd-kube-api-access-lfmrp\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.763273 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-dns-svc\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.763333 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-config\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.763406 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.763453 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.763537 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.764425 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.764435 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-config\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.764534 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-dns-svc\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.764733 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.764815 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: 
\"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.793532 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfmrp\" (UniqueName: \"kubernetes.io/projected/bf40d0cc-0d14-4c55-986d-2809df27c4fd-kube-api-access-lfmrp\") pod \"dnsmasq-dns-5c9cbcb645-8wqh8\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:57 crc kubenswrapper[4747]: I1205 21:03:57.852387 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:03:58 crc kubenswrapper[4747]: I1205 21:03:58.318033 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-8wqh8"] Dec 05 21:03:58 crc kubenswrapper[4747]: W1205 21:03:58.318751 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf40d0cc_0d14_4c55_986d_2809df27c4fd.slice/crio-4d28b6de58dd774c339108f10fcb306f62a9ef3ba472c445b80fa39ae8796875 WatchSource:0}: Error finding container 4d28b6de58dd774c339108f10fcb306f62a9ef3ba472c445b80fa39ae8796875: Status 404 returned error can't find the container with id 4d28b6de58dd774c339108f10fcb306f62a9ef3ba472c445b80fa39ae8796875 Dec 05 21:03:59 crc kubenswrapper[4747]: I1205 21:03:59.309467 4747 generic.go:334] "Generic (PLEG): container finished" podID="bf40d0cc-0d14-4c55-986d-2809df27c4fd" containerID="9b9b6204257884d5f197a391b88fab7960fa28bbc7447aca2b287e4eff66de65" exitCode=0 Dec 05 21:03:59 crc kubenswrapper[4747]: I1205 21:03:59.309557 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" event={"ID":"bf40d0cc-0d14-4c55-986d-2809df27c4fd","Type":"ContainerDied","Data":"9b9b6204257884d5f197a391b88fab7960fa28bbc7447aca2b287e4eff66de65"} Dec 05 21:03:59 crc kubenswrapper[4747]: I1205 21:03:59.309748 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" event={"ID":"bf40d0cc-0d14-4c55-986d-2809df27c4fd","Type":"ContainerStarted","Data":"4d28b6de58dd774c339108f10fcb306f62a9ef3ba472c445b80fa39ae8796875"} Dec 05 21:03:59 crc kubenswrapper[4747]: I1205 21:03:59.667509 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:03:59 crc kubenswrapper[4747]: I1205 21:03:59.670422 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="ceilometer-central-agent" containerID="cri-o://5ee0d27883cfbcca5a7c8a734f3caecd08245261d8b1edc14cb7b4e7d8370ced" gracePeriod=30 Dec 05 21:03:59 crc kubenswrapper[4747]: I1205 21:03:59.670480 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="proxy-httpd" containerID="cri-o://dc7cef177ff8bdfdbb377d9cfdf75ad442accbc4d74c20dc93afc3ca3d5ed861" gracePeriod=30 Dec 05 21:03:59 crc kubenswrapper[4747]: I1205 21:03:59.670509 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="sg-core" containerID="cri-o://c47550d896276695d00dc5c8500b283c78da471e0b43b0604139cded2dc7d413" gracePeriod=30 Dec 05 21:03:59 crc kubenswrapper[4747]: I1205 21:03:59.670539 4747 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="ceilometer-notification-agent" containerID="cri-o://84940231c8572658c1c2581286a966a00ff886abac689f581ac998df34d5cb05" gracePeriod=30 Dec 05 21:04:00 crc kubenswrapper[4747]: I1205 21:04:00.336664 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:04:00 crc kubenswrapper[4747]: I1205 21:04:00.346379 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" event={"ID":"bf40d0cc-0d14-4c55-986d-2809df27c4fd","Type":"ContainerStarted","Data":"ebc61242a667918cfc3cf1987e66b55a1c850c340da8529513e354533f3f843e"} Dec 05 21:04:00 crc kubenswrapper[4747]: I1205 21:04:00.346872 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:04:00 crc kubenswrapper[4747]: I1205 21:04:00.362085 4747 generic.go:334] "Generic (PLEG): container finished" podID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerID="dc7cef177ff8bdfdbb377d9cfdf75ad442accbc4d74c20dc93afc3ca3d5ed861" exitCode=0 Dec 05 21:04:00 crc kubenswrapper[4747]: I1205 21:04:00.362119 4747 generic.go:334] "Generic (PLEG): container finished" podID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerID="c47550d896276695d00dc5c8500b283c78da471e0b43b0604139cded2dc7d413" exitCode=2 Dec 05 21:04:00 crc kubenswrapper[4747]: I1205 21:04:00.362127 4747 generic.go:334] "Generic (PLEG): container finished" podID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerID="5ee0d27883cfbcca5a7c8a734f3caecd08245261d8b1edc14cb7b4e7d8370ced" exitCode=0 Dec 05 21:04:00 crc kubenswrapper[4747]: I1205 21:04:00.362297 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dea339e9-bd2c-405e-8d52-49238be6a72b" containerName="nova-api-log" containerID="cri-o://4e4cff9b3bacdbf7f3acc1916d2ff5625f677d7e4fb5faff3cb25171f9744d40" gracePeriod=30 Dec 05 21:04:00 crc kubenswrapper[4747]: I1205 21:04:00.362830 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b422a50-036b-4eb4-80ff-51abb27f06a1","Type":"ContainerDied","Data":"dc7cef177ff8bdfdbb377d9cfdf75ad442accbc4d74c20dc93afc3ca3d5ed861"} Dec 05 21:04:00 crc kubenswrapper[4747]: I1205 21:04:00.362860 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b422a50-036b-4eb4-80ff-51abb27f06a1","Type":"ContainerDied","Data":"c47550d896276695d00dc5c8500b283c78da471e0b43b0604139cded2dc7d413"} Dec 05 21:04:00 crc kubenswrapper[4747]: I1205 21:04:00.362871 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b422a50-036b-4eb4-80ff-51abb27f06a1","Type":"ContainerDied","Data":"5ee0d27883cfbcca5a7c8a734f3caecd08245261d8b1edc14cb7b4e7d8370ced"} Dec 05 21:04:00 crc kubenswrapper[4747]: I1205 21:04:00.362928 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dea339e9-bd2c-405e-8d52-49238be6a72b" containerName="nova-api-api" containerID="cri-o://36e24cba429af81f4d604dcf109efd617f2beab641c8de30d112cfd3e1243075" gracePeriod=30 Dec 05 21:04:00 crc kubenswrapper[4747]: I1205 21:04:00.387857 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" podStartSLOduration=3.387839433 podStartE2EDuration="3.387839433s" podCreationTimestamp="2025-12-05 21:03:57 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:04:00.375329034 +0000 UTC m=+1310.842636562" watchObservedRunningTime="2025-12-05 21:04:00.387839433 +0000 UTC m=+1310.855146921" Dec 05 21:04:00 crc kubenswrapper[4747]: I1205 21:04:00.699428 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:04:01 crc kubenswrapper[4747]: I1205 21:04:01.374146 4747 generic.go:334] "Generic (PLEG): container finished" podID="dea339e9-bd2c-405e-8d52-49238be6a72b" containerID="4e4cff9b3bacdbf7f3acc1916d2ff5625f677d7e4fb5faff3cb25171f9744d40" exitCode=143 Dec 05 21:04:01 crc kubenswrapper[4747]: I1205 21:04:01.374239 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dea339e9-bd2c-405e-8d52-49238be6a72b","Type":"ContainerDied","Data":"4e4cff9b3bacdbf7f3acc1916d2ff5625f677d7e4fb5faff3cb25171f9744d40"} Dec 05 21:04:02 crc kubenswrapper[4747]: I1205 21:04:02.881039 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:04:02 crc kubenswrapper[4747]: I1205 21:04:02.965353 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-sg-core-conf-yaml\") pod \"4b422a50-036b-4eb4-80ff-51abb27f06a1\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " Dec 05 21:04:02 crc kubenswrapper[4747]: I1205 21:04:02.965533 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-ceilometer-tls-certs\") pod \"4b422a50-036b-4eb4-80ff-51abb27f06a1\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " Dec 05 21:04:02 crc kubenswrapper[4747]: I1205 21:04:02.965640 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-468jg\" (UniqueName: \"kubernetes.io/projected/4b422a50-036b-4eb4-80ff-51abb27f06a1-kube-api-access-468jg\") pod \"4b422a50-036b-4eb4-80ff-51abb27f06a1\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " Dec 05 21:04:02 crc kubenswrapper[4747]: I1205 21:04:02.965666 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b422a50-036b-4eb4-80ff-51abb27f06a1-log-httpd\") pod \"4b422a50-036b-4eb4-80ff-51abb27f06a1\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " Dec 05 21:04:02 crc kubenswrapper[4747]: I1205 21:04:02.965705 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-config-data\") pod \"4b422a50-036b-4eb4-80ff-51abb27f06a1\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " Dec 05 21:04:02 crc kubenswrapper[4747]: I1205 21:04:02.965767 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b422a50-036b-4eb4-80ff-51abb27f06a1-run-httpd\") pod \"4b422a50-036b-4eb4-80ff-51abb27f06a1\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " Dec 05 21:04:02 crc kubenswrapper[4747]: I1205 21:04:02.965793 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-combined-ca-bundle\") 
pod \"4b422a50-036b-4eb4-80ff-51abb27f06a1\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " Dec 05 21:04:02 crc kubenswrapper[4747]: I1205 21:04:02.965831 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-scripts\") pod \"4b422a50-036b-4eb4-80ff-51abb27f06a1\" (UID: \"4b422a50-036b-4eb4-80ff-51abb27f06a1\") " Dec 05 21:04:02 crc kubenswrapper[4747]: I1205 21:04:02.966407 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b422a50-036b-4eb4-80ff-51abb27f06a1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4b422a50-036b-4eb4-80ff-51abb27f06a1" (UID: "4b422a50-036b-4eb4-80ff-51abb27f06a1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:04:02 crc kubenswrapper[4747]: I1205 21:04:02.966497 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b422a50-036b-4eb4-80ff-51abb27f06a1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4b422a50-036b-4eb4-80ff-51abb27f06a1" (UID: "4b422a50-036b-4eb4-80ff-51abb27f06a1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:04:02 crc kubenswrapper[4747]: I1205 21:04:02.966796 4747 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b422a50-036b-4eb4-80ff-51abb27f06a1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:02 crc kubenswrapper[4747]: I1205 21:04:02.966822 4747 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b422a50-036b-4eb4-80ff-51abb27f06a1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:02 crc kubenswrapper[4747]: I1205 21:04:02.973922 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-scripts" (OuterVolumeSpecName: "scripts") pod "4b422a50-036b-4eb4-80ff-51abb27f06a1" (UID: "4b422a50-036b-4eb4-80ff-51abb27f06a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:02 crc kubenswrapper[4747]: I1205 21:04:02.973991 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b422a50-036b-4eb4-80ff-51abb27f06a1-kube-api-access-468jg" (OuterVolumeSpecName: "kube-api-access-468jg") pod "4b422a50-036b-4eb4-80ff-51abb27f06a1" (UID: "4b422a50-036b-4eb4-80ff-51abb27f06a1"). InnerVolumeSpecName "kube-api-access-468jg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.000886 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4b422a50-036b-4eb4-80ff-51abb27f06a1" (UID: "4b422a50-036b-4eb4-80ff-51abb27f06a1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.035021 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4b422a50-036b-4eb4-80ff-51abb27f06a1" (UID: "4b422a50-036b-4eb4-80ff-51abb27f06a1"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.068203 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b422a50-036b-4eb4-80ff-51abb27f06a1" (UID: "4b422a50-036b-4eb4-80ff-51abb27f06a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.068752 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-468jg\" (UniqueName: \"kubernetes.io/projected/4b422a50-036b-4eb4-80ff-51abb27f06a1-kube-api-access-468jg\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.068857 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.068929 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.068986 4747 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.069044 4747 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.078662 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-config-data" (OuterVolumeSpecName: "config-data") pod "4b422a50-036b-4eb4-80ff-51abb27f06a1" (UID: "4b422a50-036b-4eb4-80ff-51abb27f06a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.170795 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b422a50-036b-4eb4-80ff-51abb27f06a1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.403388 4747 generic.go:334] "Generic (PLEG): container finished" podID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerID="84940231c8572658c1c2581286a966a00ff886abac689f581ac998df34d5cb05" exitCode=0 Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.403437 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b422a50-036b-4eb4-80ff-51abb27f06a1","Type":"ContainerDied","Data":"84940231c8572658c1c2581286a966a00ff886abac689f581ac998df34d5cb05"} Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.403491 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b422a50-036b-4eb4-80ff-51abb27f06a1","Type":"ContainerDied","Data":"6a2139dae433917b57bc67d84eb6c878304e1e8f7f1b9f236a83d01cacd1b7a0"} Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.403513 4747 scope.go:117] "RemoveContainer" containerID="dc7cef177ff8bdfdbb377d9cfdf75ad442accbc4d74c20dc93afc3ca3d5ed861" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.403525 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.431176 4747 scope.go:117] "RemoveContainer" containerID="c47550d896276695d00dc5c8500b283c78da471e0b43b0604139cded2dc7d413" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.458120 4747 scope.go:117] "RemoveContainer" containerID="84940231c8572658c1c2581286a966a00ff886abac689f581ac998df34d5cb05" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.471779 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.486048 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.495339 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:04:03 crc kubenswrapper[4747]: E1205 21:04:03.495751 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="proxy-httpd" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.495762 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="proxy-httpd" Dec 05 21:04:03 crc kubenswrapper[4747]: E1205 21:04:03.495791 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="ceilometer-central-agent" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.495797 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="ceilometer-central-agent" Dec 05 21:04:03 crc kubenswrapper[4747]: E1205 21:04:03.495808 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="sg-core" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.495814 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="sg-core" Dec 05 21:04:03 crc kubenswrapper[4747]: E1205 
21:04:03.495832 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="ceilometer-notification-agent" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.495837 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="ceilometer-notification-agent" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.496020 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="ceilometer-notification-agent" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.496033 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="sg-core" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.496053 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="proxy-httpd" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.496059 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" containerName="ceilometer-central-agent" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.498229 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.507013 4747 scope.go:117] "RemoveContainer" containerID="5ee0d27883cfbcca5a7c8a734f3caecd08245261d8b1edc14cb7b4e7d8370ced" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.507255 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.507448 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.507947 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.519294 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.578881 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9da10f9e-c9d3-4d5e-888a-774080f417f8-log-httpd\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.578951 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qklfk\" (UniqueName: \"kubernetes.io/projected/9da10f9e-c9d3-4d5e-888a-774080f417f8-kube-api-access-qklfk\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.578988 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-scripts\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.579052 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9da10f9e-c9d3-4d5e-888a-774080f417f8-run-httpd\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.579075 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-config-data\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.579094 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.579202 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.579234 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.649914 4747 scope.go:117] "RemoveContainer" containerID="dc7cef177ff8bdfdbb377d9cfdf75ad442accbc4d74c20dc93afc3ca3d5ed861" Dec 05 21:04:03 crc kubenswrapper[4747]: E1205 21:04:03.650378 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc7cef177ff8bdfdbb377d9cfdf75ad442accbc4d74c20dc93afc3ca3d5ed861\": container with ID starting with dc7cef177ff8bdfdbb377d9cfdf75ad442accbc4d74c20dc93afc3ca3d5ed861 not found: ID does not exist" containerID="dc7cef177ff8bdfdbb377d9cfdf75ad442accbc4d74c20dc93afc3ca3d5ed861" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.650420 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc7cef177ff8bdfdbb377d9cfdf75ad442accbc4d74c20dc93afc3ca3d5ed861"} err="failed to get container status \"dc7cef177ff8bdfdbb377d9cfdf75ad442accbc4d74c20dc93afc3ca3d5ed861\": rpc error: code = NotFound desc = could not find container \"dc7cef177ff8bdfdbb377d9cfdf75ad442accbc4d74c20dc93afc3ca3d5ed861\": container with ID starting with dc7cef177ff8bdfdbb377d9cfdf75ad442accbc4d74c20dc93afc3ca3d5ed861 not found: ID does not exist" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.650440 4747 scope.go:117] "RemoveContainer" containerID="c47550d896276695d00dc5c8500b283c78da471e0b43b0604139cded2dc7d413" Dec 05 21:04:03 crc kubenswrapper[4747]: E1205 21:04:03.650828 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c47550d896276695d00dc5c8500b283c78da471e0b43b0604139cded2dc7d413\": container with ID starting with c47550d896276695d00dc5c8500b283c78da471e0b43b0604139cded2dc7d413 not found: ID does not exist" 
containerID="c47550d896276695d00dc5c8500b283c78da471e0b43b0604139cded2dc7d413" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.650870 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c47550d896276695d00dc5c8500b283c78da471e0b43b0604139cded2dc7d413"} err="failed to get container status \"c47550d896276695d00dc5c8500b283c78da471e0b43b0604139cded2dc7d413\": rpc error: code = NotFound desc = could not find container \"c47550d896276695d00dc5c8500b283c78da471e0b43b0604139cded2dc7d413\": container with ID starting with c47550d896276695d00dc5c8500b283c78da471e0b43b0604139cded2dc7d413 not found: ID does not exist" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.650904 4747 scope.go:117] "RemoveContainer" containerID="84940231c8572658c1c2581286a966a00ff886abac689f581ac998df34d5cb05" Dec 05 21:04:03 crc kubenswrapper[4747]: E1205 21:04:03.651176 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84940231c8572658c1c2581286a966a00ff886abac689f581ac998df34d5cb05\": container with ID starting with 84940231c8572658c1c2581286a966a00ff886abac689f581ac998df34d5cb05 not found: ID does not exist" containerID="84940231c8572658c1c2581286a966a00ff886abac689f581ac998df34d5cb05" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.651217 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84940231c8572658c1c2581286a966a00ff886abac689f581ac998df34d5cb05"} err="failed to get container status \"84940231c8572658c1c2581286a966a00ff886abac689f581ac998df34d5cb05\": rpc error: code = NotFound desc = could not find container \"84940231c8572658c1c2581286a966a00ff886abac689f581ac998df34d5cb05\": container with ID starting with 84940231c8572658c1c2581286a966a00ff886abac689f581ac998df34d5cb05 not found: ID does not exist" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.651242 4747 scope.go:117] "RemoveContainer" containerID="5ee0d27883cfbcca5a7c8a734f3caecd08245261d8b1edc14cb7b4e7d8370ced" Dec 05 21:04:03 crc kubenswrapper[4747]: E1205 21:04:03.651499 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ee0d27883cfbcca5a7c8a734f3caecd08245261d8b1edc14cb7b4e7d8370ced\": container with ID starting with 5ee0d27883cfbcca5a7c8a734f3caecd08245261d8b1edc14cb7b4e7d8370ced not found: ID does not exist" containerID="5ee0d27883cfbcca5a7c8a734f3caecd08245261d8b1edc14cb7b4e7d8370ced" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.651515 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee0d27883cfbcca5a7c8a734f3caecd08245261d8b1edc14cb7b4e7d8370ced"} err="failed to get container status \"5ee0d27883cfbcca5a7c8a734f3caecd08245261d8b1edc14cb7b4e7d8370ced\": rpc error: code = NotFound desc = could not find container \"5ee0d27883cfbcca5a7c8a734f3caecd08245261d8b1edc14cb7b4e7d8370ced\": container with ID starting with 5ee0d27883cfbcca5a7c8a734f3caecd08245261d8b1edc14cb7b4e7d8370ced not found: ID does not exist" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.680833 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-scripts\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.680898 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9da10f9e-c9d3-4d5e-888a-774080f417f8-run-httpd\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.680924 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-config-data\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.680947 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.681646 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9da10f9e-c9d3-4d5e-888a-774080f417f8-run-httpd\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.682131 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.682173 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.682271 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9da10f9e-c9d3-4d5e-888a-774080f417f8-log-httpd\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.682320 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qklfk\" (UniqueName: \"kubernetes.io/projected/9da10f9e-c9d3-4d5e-888a-774080f417f8-kube-api-access-qklfk\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.683038 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9da10f9e-c9d3-4d5e-888a-774080f417f8-log-httpd\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.686784 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-config-data\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.687465 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.694624 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-scripts\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.696782 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.702455 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qklfk\" (UniqueName: \"kubernetes.io/projected/9da10f9e-c9d3-4d5e-888a-774080f417f8-kube-api-access-qklfk\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.704309 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.852388 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b422a50-036b-4eb4-80ff-51abb27f06a1" path="/var/lib/kubelet/pods/4b422a50-036b-4eb4-80ff-51abb27f06a1/volumes" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.930993 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.943521 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.988833 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea339e9-bd2c-405e-8d52-49238be6a72b-config-data\") pod \"dea339e9-bd2c-405e-8d52-49238be6a72b\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.989631 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dea339e9-bd2c-405e-8d52-49238be6a72b-logs\") pod \"dea339e9-bd2c-405e-8d52-49238be6a72b\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.989780 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdn9z\" (UniqueName: \"kubernetes.io/projected/dea339e9-bd2c-405e-8d52-49238be6a72b-kube-api-access-pdn9z\") pod \"dea339e9-bd2c-405e-8d52-49238be6a72b\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.989894 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea339e9-bd2c-405e-8d52-49238be6a72b-combined-ca-bundle\") pod \"dea339e9-bd2c-405e-8d52-49238be6a72b\" (UID: \"dea339e9-bd2c-405e-8d52-49238be6a72b\") " Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.990204 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dea339e9-bd2c-405e-8d52-49238be6a72b-logs" (OuterVolumeSpecName: "logs") pod "dea339e9-bd2c-405e-8d52-49238be6a72b" (UID: "dea339e9-bd2c-405e-8d52-49238be6a72b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.990518 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dea339e9-bd2c-405e-8d52-49238be6a72b-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:03 crc kubenswrapper[4747]: I1205 21:04:03.995615 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dea339e9-bd2c-405e-8d52-49238be6a72b-kube-api-access-pdn9z" (OuterVolumeSpecName: "kube-api-access-pdn9z") pod "dea339e9-bd2c-405e-8d52-49238be6a72b" (UID: "dea339e9-bd2c-405e-8d52-49238be6a72b"). InnerVolumeSpecName "kube-api-access-pdn9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.063738 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dea339e9-bd2c-405e-8d52-49238be6a72b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dea339e9-bd2c-405e-8d52-49238be6a72b" (UID: "dea339e9-bd2c-405e-8d52-49238be6a72b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.064725 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dea339e9-bd2c-405e-8d52-49238be6a72b-config-data" (OuterVolumeSpecName: "config-data") pod "dea339e9-bd2c-405e-8d52-49238be6a72b" (UID: "dea339e9-bd2c-405e-8d52-49238be6a72b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.092331 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdn9z\" (UniqueName: \"kubernetes.io/projected/dea339e9-bd2c-405e-8d52-49238be6a72b-kube-api-access-pdn9z\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.092367 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea339e9-bd2c-405e-8d52-49238be6a72b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.092383 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea339e9-bd2c-405e-8d52-49238be6a72b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.388392 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:04:04 crc kubenswrapper[4747]: W1205 21:04:04.391185 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9da10f9e_c9d3_4d5e_888a_774080f417f8.slice/crio-53a6a28aa82b52a7c750d642ef9a0bfb6b383e264098f48634cc0205322e678f WatchSource:0}: Error finding container 53a6a28aa82b52a7c750d642ef9a0bfb6b383e264098f48634cc0205322e678f: Status 404 returned error can't find the container with id 53a6a28aa82b52a7c750d642ef9a0bfb6b383e264098f48634cc0205322e678f Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.411254 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9da10f9e-c9d3-4d5e-888a-774080f417f8","Type":"ContainerStarted","Data":"53a6a28aa82b52a7c750d642ef9a0bfb6b383e264098f48634cc0205322e678f"} Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.413002 4747 generic.go:334] "Generic (PLEG): container finished" podID="dea339e9-bd2c-405e-8d52-49238be6a72b" containerID="36e24cba429af81f4d604dcf109efd617f2beab641c8de30d112cfd3e1243075" exitCode=0 Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.413056 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.413104 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dea339e9-bd2c-405e-8d52-49238be6a72b","Type":"ContainerDied","Data":"36e24cba429af81f4d604dcf109efd617f2beab641c8de30d112cfd3e1243075"} Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.413140 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dea339e9-bd2c-405e-8d52-49238be6a72b","Type":"ContainerDied","Data":"2793580de01cacd9543fb213d66f1a90b0be3b4909d33bb2a486302dedc605b1"} Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.413169 4747 scope.go:117] "RemoveContainer" containerID="36e24cba429af81f4d604dcf109efd617f2beab641c8de30d112cfd3e1243075" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.447517 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.448709 4747 scope.go:117] "RemoveContainer" containerID="4e4cff9b3bacdbf7f3acc1916d2ff5625f677d7e4fb5faff3cb25171f9744d40" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.467651 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.473374 4747 scope.go:117] "RemoveContainer" containerID="36e24cba429af81f4d604dcf109efd617f2beab641c8de30d112cfd3e1243075" Dec 05 21:04:04 crc kubenswrapper[4747]: E1205 21:04:04.473957 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36e24cba429af81f4d604dcf109efd617f2beab641c8de30d112cfd3e1243075\": container with ID starting with 36e24cba429af81f4d604dcf109efd617f2beab641c8de30d112cfd3e1243075 not found: ID does not exist" containerID="36e24cba429af81f4d604dcf109efd617f2beab641c8de30d112cfd3e1243075" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.473996 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36e24cba429af81f4d604dcf109efd617f2beab641c8de30d112cfd3e1243075"} err="failed to get container status \"36e24cba429af81f4d604dcf109efd617f2beab641c8de30d112cfd3e1243075\": rpc error: code = NotFound desc = could not find container \"36e24cba429af81f4d604dcf109efd617f2beab641c8de30d112cfd3e1243075\": container with ID starting with 36e24cba429af81f4d604dcf109efd617f2beab641c8de30d112cfd3e1243075 not found: ID does not exist" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.474054 4747 scope.go:117] "RemoveContainer" containerID="4e4cff9b3bacdbf7f3acc1916d2ff5625f677d7e4fb5faff3cb25171f9744d40" Dec 05 21:04:04 crc kubenswrapper[4747]: E1205 21:04:04.474619 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e4cff9b3bacdbf7f3acc1916d2ff5625f677d7e4fb5faff3cb25171f9744d40\": container with ID starting with 4e4cff9b3bacdbf7f3acc1916d2ff5625f677d7e4fb5faff3cb25171f9744d40 not found: ID does not exist" containerID="4e4cff9b3bacdbf7f3acc1916d2ff5625f677d7e4fb5faff3cb25171f9744d40" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.474922 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e4cff9b3bacdbf7f3acc1916d2ff5625f677d7e4fb5faff3cb25171f9744d40"} err="failed to get container status \"4e4cff9b3bacdbf7f3acc1916d2ff5625f677d7e4fb5faff3cb25171f9744d40\": rpc error: code = NotFound desc = could not 
find container \"4e4cff9b3bacdbf7f3acc1916d2ff5625f677d7e4fb5faff3cb25171f9744d40\": container with ID starting with 4e4cff9b3bacdbf7f3acc1916d2ff5625f677d7e4fb5faff3cb25171f9744d40 not found: ID does not exist" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.520111 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 21:04:04 crc kubenswrapper[4747]: E1205 21:04:04.520524 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea339e9-bd2c-405e-8d52-49238be6a72b" containerName="nova-api-api" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.520541 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea339e9-bd2c-405e-8d52-49238be6a72b" containerName="nova-api-api" Dec 05 21:04:04 crc kubenswrapper[4747]: E1205 21:04:04.520553 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea339e9-bd2c-405e-8d52-49238be6a72b" containerName="nova-api-log" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.520559 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea339e9-bd2c-405e-8d52-49238be6a72b" containerName="nova-api-log" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.520731 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea339e9-bd2c-405e-8d52-49238be6a72b" containerName="nova-api-api" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.520753 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea339e9-bd2c-405e-8d52-49238be6a72b" containerName="nova-api-log" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.522149 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.526408 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.526850 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.531633 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.543439 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.619680 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.620016 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqntk\" (UniqueName: \"kubernetes.io/projected/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-kube-api-access-pqntk\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.620139 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-logs\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.620171 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.620244 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.620277 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-config-data\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.723186 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.723329 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqntk\" (UniqueName: \"kubernetes.io/projected/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-kube-api-access-pqntk\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.723735 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-logs\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.723862 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.724076 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.724224 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-config-data\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.724564 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-logs\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.728356 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-config-data\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.728466 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.730314 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.730383 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.750704 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqntk\" (UniqueName: \"kubernetes.io/projected/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-kube-api-access-pqntk\") pod \"nova-api-0\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " pod="openstack/nova-api-0" Dec 05 21:04:04 crc kubenswrapper[4747]: I1205 21:04:04.859130 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:04:05 crc kubenswrapper[4747]: I1205 21:04:05.312116 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:04:05 crc kubenswrapper[4747]: W1205 21:04:05.316143 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e45aac1_6d4b_49c5_aad8_3e4246d62fa7.slice/crio-dd2c78bc24c360f37f7c4973dc9ed1688bb61d54c97afeb52c41e86e741570a6 WatchSource:0}: Error finding container dd2c78bc24c360f37f7c4973dc9ed1688bb61d54c97afeb52c41e86e741570a6: Status 404 returned error can't find the container with id dd2c78bc24c360f37f7c4973dc9ed1688bb61d54c97afeb52c41e86e741570a6 Dec 05 21:04:05 crc kubenswrapper[4747]: I1205 21:04:05.426893 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7","Type":"ContainerStarted","Data":"dd2c78bc24c360f37f7c4973dc9ed1688bb61d54c97afeb52c41e86e741570a6"} Dec 05 21:04:05 crc kubenswrapper[4747]: I1205 21:04:05.428911 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9da10f9e-c9d3-4d5e-888a-774080f417f8","Type":"ContainerStarted","Data":"fc429219dd2c874c7566f484cc0af9643eb08ad7677e4bfb2ed48f5441db885b"} Dec 05 21:04:05 crc kubenswrapper[4747]: I1205 21:04:05.699160 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:04:05 crc kubenswrapper[4747]: I1205 21:04:05.737367 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:04:05 crc kubenswrapper[4747]: I1205 21:04:05.850377 4747 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="dea339e9-bd2c-405e-8d52-49238be6a72b" path="/var/lib/kubelet/pods/dea339e9-bd2c-405e-8d52-49238be6a72b/volumes" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.222116 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.222446 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.458484 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7","Type":"ContainerStarted","Data":"4bf6f239755456ef0adabfe0b4e1a322b83be0779d8293bd7b3d1997a03e93b6"} Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.458551 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7","Type":"ContainerStarted","Data":"55d7b1348f26b9761653a77f2e9ab1b5ad07bc8168ec754b06eb37a888a14b0d"} Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.473192 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9da10f9e-c9d3-4d5e-888a-774080f417f8","Type":"ContainerStarted","Data":"443c78cd3406277fd83faca12b46a2e8a254fa280e61add86bbe8dd8b77f93bc"} Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.473262 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9da10f9e-c9d3-4d5e-888a-774080f417f8","Type":"ContainerStarted","Data":"39298752a8b51d1caae7019505b33e7f0b9b3bf5239de952b9fe17241b3c89ac"} Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.488892 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.488867578 podStartE2EDuration="2.488867578s" podCreationTimestamp="2025-12-05 21:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:04:06.480166013 +0000 UTC m=+1316.947473521" watchObservedRunningTime="2025-12-05 21:04:06.488867578 +0000 UTC m=+1316.956175076" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.500054 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.644798 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-zqqvw"] Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.646636 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.652160 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.652340 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.656709 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zqqvw"] Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.768515 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-config-data\") pod \"nova-cell1-cell-mapping-zqqvw\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.768663 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zqqvw\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.768716 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jssp6\" (UniqueName: \"kubernetes.io/projected/f98ab795-6f74-421b-bce4-52128f0d7431-kube-api-access-jssp6\") pod \"nova-cell1-cell-mapping-zqqvw\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.768753 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-scripts\") pod \"nova-cell1-cell-mapping-zqqvw\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.869927 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-scripts\") pod \"nova-cell1-cell-mapping-zqqvw\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.870027 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-config-data\") pod \"nova-cell1-cell-mapping-zqqvw\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.870150 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zqqvw\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.870223 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jssp6\" (UniqueName: 
\"kubernetes.io/projected/f98ab795-6f74-421b-bce4-52128f0d7431-kube-api-access-jssp6\") pod \"nova-cell1-cell-mapping-zqqvw\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.875905 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-config-data\") pod \"nova-cell1-cell-mapping-zqqvw\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.875900 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-scripts\") pod \"nova-cell1-cell-mapping-zqqvw\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.876098 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zqqvw\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.903635 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jssp6\" (UniqueName: \"kubernetes.io/projected/f98ab795-6f74-421b-bce4-52128f0d7431-kube-api-access-jssp6\") pod \"nova-cell1-cell-mapping-zqqvw\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:06 crc kubenswrapper[4747]: I1205 21:04:06.966093 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:07 crc kubenswrapper[4747]: I1205 21:04:07.425443 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zqqvw"] Dec 05 21:04:07 crc kubenswrapper[4747]: I1205 21:04:07.488742 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zqqvw" event={"ID":"f98ab795-6f74-421b-bce4-52128f0d7431","Type":"ContainerStarted","Data":"00e75ced6e632c79f3e6aaad093a99e8867f1077c3f0717e4df29376caea679d"} Dec 05 21:04:07 crc kubenswrapper[4747]: I1205 21:04:07.855752 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:04:07 crc kubenswrapper[4747]: I1205 21:04:07.939384 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-f4f8q"] Dec 05 21:04:07 crc kubenswrapper[4747]: I1205 21:04:07.939683 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" podUID="dd1da9c6-85af-4559-a2e8-f52210b7d5c4" containerName="dnsmasq-dns" containerID="cri-o://d52e1c1c17b0c70a8eadb64326c0b596768d88b892f57e969f55647e6f07bb4d" gracePeriod=10 Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.492782 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.509522 4747 generic.go:334] "Generic (PLEG): container finished" podID="dd1da9c6-85af-4559-a2e8-f52210b7d5c4" containerID="d52e1c1c17b0c70a8eadb64326c0b596768d88b892f57e969f55647e6f07bb4d" exitCode=0 Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.509612 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" event={"ID":"dd1da9c6-85af-4559-a2e8-f52210b7d5c4","Type":"ContainerDied","Data":"d52e1c1c17b0c70a8eadb64326c0b596768d88b892f57e969f55647e6f07bb4d"} Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.509640 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" event={"ID":"dd1da9c6-85af-4559-a2e8-f52210b7d5c4","Type":"ContainerDied","Data":"8f6077c39476fce72553ab3537821dd3a5d22fe329fa0f87673ff61e8e51a2de"} Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.509658 4747 scope.go:117] "RemoveContainer" containerID="d52e1c1c17b0c70a8eadb64326c0b596768d88b892f57e969f55647e6f07bb4d" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.509778 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c4475fdfc-f4f8q" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.512674 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zqqvw" event={"ID":"f98ab795-6f74-421b-bce4-52128f0d7431","Type":"ContainerStarted","Data":"2be5a45020495919c8df1a96894165864e0b5d600e6abb5eb0ba83d8570f477b"} Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.516264 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9da10f9e-c9d3-4d5e-888a-774080f417f8","Type":"ContainerStarted","Data":"4b45c5494c9cb7801c132de4820fe650677d60312b4084726a8d83cfd974bcba"} Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.516943 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.541240 4747 scope.go:117] "RemoveContainer" containerID="00e7ce80f37c69a3249eb133f51c3d9d0ef9248981ff9d69faeeefb5e13ca207" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.562300 4747 scope.go:117] "RemoveContainer" containerID="d52e1c1c17b0c70a8eadb64326c0b596768d88b892f57e969f55647e6f07bb4d" Dec 05 21:04:08 crc kubenswrapper[4747]: E1205 21:04:08.564315 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d52e1c1c17b0c70a8eadb64326c0b596768d88b892f57e969f55647e6f07bb4d\": container with ID starting with d52e1c1c17b0c70a8eadb64326c0b596768d88b892f57e969f55647e6f07bb4d not found: ID does not exist" containerID="d52e1c1c17b0c70a8eadb64326c0b596768d88b892f57e969f55647e6f07bb4d" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.564353 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d52e1c1c17b0c70a8eadb64326c0b596768d88b892f57e969f55647e6f07bb4d"} err="failed to get container status \"d52e1c1c17b0c70a8eadb64326c0b596768d88b892f57e969f55647e6f07bb4d\": rpc error: code = NotFound desc = could not find container \"d52e1c1c17b0c70a8eadb64326c0b596768d88b892f57e969f55647e6f07bb4d\": container with ID starting with d52e1c1c17b0c70a8eadb64326c0b596768d88b892f57e969f55647e6f07bb4d not found: ID does not exist" Dec 05 21:04:08 crc kubenswrapper[4747]: 
I1205 21:04:08.564380 4747 scope.go:117] "RemoveContainer" containerID="00e7ce80f37c69a3249eb133f51c3d9d0ef9248981ff9d69faeeefb5e13ca207" Dec 05 21:04:08 crc kubenswrapper[4747]: E1205 21:04:08.567098 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00e7ce80f37c69a3249eb133f51c3d9d0ef9248981ff9d69faeeefb5e13ca207\": container with ID starting with 00e7ce80f37c69a3249eb133f51c3d9d0ef9248981ff9d69faeeefb5e13ca207 not found: ID does not exist" containerID="00e7ce80f37c69a3249eb133f51c3d9d0ef9248981ff9d69faeeefb5e13ca207" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.567163 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e7ce80f37c69a3249eb133f51c3d9d0ef9248981ff9d69faeeefb5e13ca207"} err="failed to get container status \"00e7ce80f37c69a3249eb133f51c3d9d0ef9248981ff9d69faeeefb5e13ca207\": rpc error: code = NotFound desc = could not find container \"00e7ce80f37c69a3249eb133f51c3d9d0ef9248981ff9d69faeeefb5e13ca207\": container with ID starting with 00e7ce80f37c69a3249eb133f51c3d9d0ef9248981ff9d69faeeefb5e13ca207 not found: ID does not exist" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.578320 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-zqqvw" podStartSLOduration=2.578301948 podStartE2EDuration="2.578301948s" podCreationTimestamp="2025-12-05 21:04:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:04:08.54156608 +0000 UTC m=+1319.008873568" watchObservedRunningTime="2025-12-05 21:04:08.578301948 +0000 UTC m=+1319.045609436" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.582066 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.414820007 podStartE2EDuration="5.58205521s" podCreationTimestamp="2025-12-05 21:04:03 +0000 UTC" firstStartedPulling="2025-12-05 21:04:04.39343982 +0000 UTC m=+1314.860747308" lastFinishedPulling="2025-12-05 21:04:07.560675023 +0000 UTC m=+1318.027982511" observedRunningTime="2025-12-05 21:04:08.560766404 +0000 UTC m=+1319.028073892" watchObservedRunningTime="2025-12-05 21:04:08.58205521 +0000 UTC m=+1319.049362698" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.622366 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkz6g\" (UniqueName: \"kubernetes.io/projected/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-kube-api-access-qkz6g\") pod \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.622416 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-ovsdbserver-sb\") pod \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.622502 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-dns-svc\") pod \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.622550 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-ovsdbserver-nb\") pod \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.622653 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-dns-swift-storage-0\") pod \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.622705 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-config\") pod \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\" (UID: \"dd1da9c6-85af-4559-a2e8-f52210b7d5c4\") " Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.631825 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-kube-api-access-qkz6g" (OuterVolumeSpecName: "kube-api-access-qkz6g") pod "dd1da9c6-85af-4559-a2e8-f52210b7d5c4" (UID: "dd1da9c6-85af-4559-a2e8-f52210b7d5c4"). InnerVolumeSpecName "kube-api-access-qkz6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.688057 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dd1da9c6-85af-4559-a2e8-f52210b7d5c4" (UID: "dd1da9c6-85af-4559-a2e8-f52210b7d5c4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.690844 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dd1da9c6-85af-4559-a2e8-f52210b7d5c4" (UID: "dd1da9c6-85af-4559-a2e8-f52210b7d5c4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.693032 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-config" (OuterVolumeSpecName: "config") pod "dd1da9c6-85af-4559-a2e8-f52210b7d5c4" (UID: "dd1da9c6-85af-4559-a2e8-f52210b7d5c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.709947 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dd1da9c6-85af-4559-a2e8-f52210b7d5c4" (UID: "dd1da9c6-85af-4559-a2e8-f52210b7d5c4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.710961 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd1da9c6-85af-4559-a2e8-f52210b7d5c4" (UID: "dd1da9c6-85af-4559-a2e8-f52210b7d5c4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.724891 4747 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.724937 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.724952 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.724966 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkz6g\" (UniqueName: \"kubernetes.io/projected/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-kube-api-access-qkz6g\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.724981 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.724993 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd1da9c6-85af-4559-a2e8-f52210b7d5c4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.841058 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-f4f8q"] Dec 05 21:04:08 crc kubenswrapper[4747]: I1205 21:04:08.850637 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-f4f8q"] Dec 05 21:04:09 crc kubenswrapper[4747]: I1205 21:04:09.853674 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd1da9c6-85af-4559-a2e8-f52210b7d5c4" path="/var/lib/kubelet/pods/dd1da9c6-85af-4559-a2e8-f52210b7d5c4/volumes" Dec 05 21:04:13 crc kubenswrapper[4747]: I1205 21:04:13.573009 4747 generic.go:334] "Generic (PLEG): container finished" podID="f98ab795-6f74-421b-bce4-52128f0d7431" containerID="2be5a45020495919c8df1a96894165864e0b5d600e6abb5eb0ba83d8570f477b" exitCode=0 Dec 05 21:04:13 crc kubenswrapper[4747]: I1205 21:04:13.573089 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zqqvw" event={"ID":"f98ab795-6f74-421b-bce4-52128f0d7431","Type":"ContainerDied","Data":"2be5a45020495919c8df1a96894165864e0b5d600e6abb5eb0ba83d8570f477b"} Dec 05 21:04:14 crc kubenswrapper[4747]: I1205 21:04:14.860346 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 21:04:14 crc kubenswrapper[4747]: I1205 21:04:14.860674 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.030761 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.053488 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-combined-ca-bundle\") pod \"f98ab795-6f74-421b-bce4-52128f0d7431\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.053595 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-config-data\") pod \"f98ab795-6f74-421b-bce4-52128f0d7431\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.053645 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-scripts\") pod \"f98ab795-6f74-421b-bce4-52128f0d7431\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.053695 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jssp6\" (UniqueName: \"kubernetes.io/projected/f98ab795-6f74-421b-bce4-52128f0d7431-kube-api-access-jssp6\") pod \"f98ab795-6f74-421b-bce4-52128f0d7431\" (UID: \"f98ab795-6f74-421b-bce4-52128f0d7431\") " Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.062070 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f98ab795-6f74-421b-bce4-52128f0d7431-kube-api-access-jssp6" (OuterVolumeSpecName: "kube-api-access-jssp6") pod "f98ab795-6f74-421b-bce4-52128f0d7431" (UID: "f98ab795-6f74-421b-bce4-52128f0d7431"). InnerVolumeSpecName "kube-api-access-jssp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.070111 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-scripts" (OuterVolumeSpecName: "scripts") pod "f98ab795-6f74-421b-bce4-52128f0d7431" (UID: "f98ab795-6f74-421b-bce4-52128f0d7431"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.100145 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-config-data" (OuterVolumeSpecName: "config-data") pod "f98ab795-6f74-421b-bce4-52128f0d7431" (UID: "f98ab795-6f74-421b-bce4-52128f0d7431"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.116892 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f98ab795-6f74-421b-bce4-52128f0d7431" (UID: "f98ab795-6f74-421b-bce4-52128f0d7431"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.156082 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.156145 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jssp6\" (UniqueName: \"kubernetes.io/projected/f98ab795-6f74-421b-bce4-52128f0d7431-kube-api-access-jssp6\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.156166 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.156185 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98ab795-6f74-421b-bce4-52128f0d7431-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.598021 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zqqvw" event={"ID":"f98ab795-6f74-421b-bce4-52128f0d7431","Type":"ContainerDied","Data":"00e75ced6e632c79f3e6aaad093a99e8867f1077c3f0717e4df29376caea679d"} Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.598067 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00e75ced6e632c79f3e6aaad093a99e8867f1077c3f0717e4df29376caea679d" Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.598114 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zqqvw" Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.787077 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.787462 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" containerName="nova-api-log" containerID="cri-o://55d7b1348f26b9761653a77f2e9ab1b5ad07bc8168ec754b06eb37a888a14b0d" gracePeriod=30 Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.787666 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" containerName="nova-api-api" containerID="cri-o://4bf6f239755456ef0adabfe0b4e1a322b83be0779d8293bd7b3d1997a03e93b6" gracePeriod=30 Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.800974 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.801197 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7f2e4720-4c72-4ed6-a648-2d59f09d3137" containerName="nova-scheduler-scheduler" containerID="cri-o://8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a" gracePeriod=30 Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.804908 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.196:8774/\": EOF" Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.805110 4747 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.196:8774/\": EOF" Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.849913 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.850404 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" containerName="nova-metadata-log" containerID="cri-o://46730cdd52d8afe7d6574648638970a2e8a396b15f1ecd335f0f402b9616d01a" gracePeriod=30 Dec 05 21:04:15 crc kubenswrapper[4747]: I1205 21:04:15.850476 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" containerName="nova-metadata-metadata" containerID="cri-o://102a058923a0d74f8a7d122621d1bf3d609e270bfaddd20e89feb6f1674c5fad" gracePeriod=30 Dec 05 21:04:16 crc kubenswrapper[4747]: E1205 21:04:16.501898 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 21:04:16 crc kubenswrapper[4747]: E1205 21:04:16.503395 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 21:04:16 crc kubenswrapper[4747]: E1205 21:04:16.504702 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 21:04:16 crc kubenswrapper[4747]: E1205 21:04:16.504744 4747 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="7f2e4720-4c72-4ed6-a648-2d59f09d3137" containerName="nova-scheduler-scheduler" Dec 05 21:04:16 crc kubenswrapper[4747]: I1205 21:04:16.609004 4747 generic.go:334] "Generic (PLEG): container finished" podID="fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" containerID="46730cdd52d8afe7d6574648638970a2e8a396b15f1ecd335f0f402b9616d01a" exitCode=143 Dec 05 21:04:16 crc kubenswrapper[4747]: I1205 21:04:16.609061 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da","Type":"ContainerDied","Data":"46730cdd52d8afe7d6574648638970a2e8a396b15f1ecd335f0f402b9616d01a"} Dec 05 21:04:16 crc kubenswrapper[4747]: I1205 21:04:16.611143 4747 generic.go:334] "Generic (PLEG): container finished" podID="0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" containerID="55d7b1348f26b9761653a77f2e9ab1b5ad07bc8168ec754b06eb37a888a14b0d" exitCode=143 Dec 05 21:04:16 crc kubenswrapper[4747]: I1205 21:04:16.611181 4747 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7","Type":"ContainerDied","Data":"55d7b1348f26b9761653a77f2e9ab1b5ad07bc8168ec754b06eb37a888a14b0d"} Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.000569 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": read tcp 10.217.0.2:52744->10.217.0.189:8775: read: connection reset by peer" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.000731 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": read tcp 10.217.0.2:52748->10.217.0.189:8775: read: connection reset by peer" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.506295 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.548150 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgff4\" (UniqueName: \"kubernetes.io/projected/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-kube-api-access-hgff4\") pod \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.548489 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-logs\") pod \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.548557 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-combined-ca-bundle\") pod \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.548742 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-nova-metadata-tls-certs\") pod \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.548831 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-config-data\") pod \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\" (UID: \"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da\") " Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.551205 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-logs" (OuterVolumeSpecName: "logs") pod "fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" (UID: "fbfcf4ac-4895-4724-a9a9-596ad8a3a6da"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.556782 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-kube-api-access-hgff4" (OuterVolumeSpecName: "kube-api-access-hgff4") pod "fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" (UID: "fbfcf4ac-4895-4724-a9a9-596ad8a3a6da"). InnerVolumeSpecName "kube-api-access-hgff4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.585919 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" (UID: "fbfcf4ac-4895-4724-a9a9-596ad8a3a6da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.586068 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-config-data" (OuterVolumeSpecName: "config-data") pod "fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" (UID: "fbfcf4ac-4895-4724-a9a9-596ad8a3a6da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.624114 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" (UID: "fbfcf4ac-4895-4724-a9a9-596ad8a3a6da"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.651315 4747 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.651354 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.651365 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgff4\" (UniqueName: \"kubernetes.io/projected/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-kube-api-access-hgff4\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.651374 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.651384 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.651560 4747 generic.go:334] "Generic (PLEG): container finished" podID="fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" containerID="102a058923a0d74f8a7d122621d1bf3d609e270bfaddd20e89feb6f1674c5fad" exitCode=0 Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.651613 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da","Type":"ContainerDied","Data":"102a058923a0d74f8a7d122621d1bf3d609e270bfaddd20e89feb6f1674c5fad"} Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.651640 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbfcf4ac-4895-4724-a9a9-596ad8a3a6da","Type":"ContainerDied","Data":"adb12b3f142882492ef7d6254b98a09d631553794082f9009e504b1076967ac2"} Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.651657 4747 scope.go:117] "RemoveContainer" containerID="102a058923a0d74f8a7d122621d1bf3d609e270bfaddd20e89feb6f1674c5fad" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.651662 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.677273 4747 scope.go:117] "RemoveContainer" containerID="46730cdd52d8afe7d6574648638970a2e8a396b15f1ecd335f0f402b9616d01a" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.696702 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.704963 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.705104 4747 scope.go:117] "RemoveContainer" containerID="102a058923a0d74f8a7d122621d1bf3d609e270bfaddd20e89feb6f1674c5fad" Dec 05 21:04:19 crc kubenswrapper[4747]: E1205 21:04:19.705656 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"102a058923a0d74f8a7d122621d1bf3d609e270bfaddd20e89feb6f1674c5fad\": container with ID starting with 102a058923a0d74f8a7d122621d1bf3d609e270bfaddd20e89feb6f1674c5fad not found: ID does not exist" containerID="102a058923a0d74f8a7d122621d1bf3d609e270bfaddd20e89feb6f1674c5fad" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.705705 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"102a058923a0d74f8a7d122621d1bf3d609e270bfaddd20e89feb6f1674c5fad"} err="failed to get container status \"102a058923a0d74f8a7d122621d1bf3d609e270bfaddd20e89feb6f1674c5fad\": rpc error: code = NotFound desc = could not find container \"102a058923a0d74f8a7d122621d1bf3d609e270bfaddd20e89feb6f1674c5fad\": container with ID starting with 102a058923a0d74f8a7d122621d1bf3d609e270bfaddd20e89feb6f1674c5fad not found: ID does not exist" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.705724 4747 scope.go:117] "RemoveContainer" containerID="46730cdd52d8afe7d6574648638970a2e8a396b15f1ecd335f0f402b9616d01a" Dec 05 21:04:19 crc kubenswrapper[4747]: E1205 21:04:19.706030 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46730cdd52d8afe7d6574648638970a2e8a396b15f1ecd335f0f402b9616d01a\": container with ID starting with 46730cdd52d8afe7d6574648638970a2e8a396b15f1ecd335f0f402b9616d01a not found: ID does not exist" containerID="46730cdd52d8afe7d6574648638970a2e8a396b15f1ecd335f0f402b9616d01a" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.706050 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46730cdd52d8afe7d6574648638970a2e8a396b15f1ecd335f0f402b9616d01a"} err="failed to get container status \"46730cdd52d8afe7d6574648638970a2e8a396b15f1ecd335f0f402b9616d01a\": rpc error: code = NotFound desc = could not find container \"46730cdd52d8afe7d6574648638970a2e8a396b15f1ecd335f0f402b9616d01a\": container with ID starting with 46730cdd52d8afe7d6574648638970a2e8a396b15f1ecd335f0f402b9616d01a not found: ID does not exist" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.716492 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:04:19 crc kubenswrapper[4747]: E1205 21:04:19.716956 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" containerName="nova-metadata-log" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.716973 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" 
containerName="nova-metadata-log" Dec 05 21:04:19 crc kubenswrapper[4747]: E1205 21:04:19.716985 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f98ab795-6f74-421b-bce4-52128f0d7431" containerName="nova-manage" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.716993 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f98ab795-6f74-421b-bce4-52128f0d7431" containerName="nova-manage" Dec 05 21:04:19 crc kubenswrapper[4747]: E1205 21:04:19.717000 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" containerName="nova-metadata-metadata" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.717007 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" containerName="nova-metadata-metadata" Dec 05 21:04:19 crc kubenswrapper[4747]: E1205 21:04:19.717039 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1da9c6-85af-4559-a2e8-f52210b7d5c4" containerName="init" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.717045 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1da9c6-85af-4559-a2e8-f52210b7d5c4" containerName="init" Dec 05 21:04:19 crc kubenswrapper[4747]: E1205 21:04:19.717058 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1da9c6-85af-4559-a2e8-f52210b7d5c4" containerName="dnsmasq-dns" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.717066 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1da9c6-85af-4559-a2e8-f52210b7d5c4" containerName="dnsmasq-dns" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.717457 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1da9c6-85af-4559-a2e8-f52210b7d5c4" containerName="dnsmasq-dns" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.717476 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" containerName="nova-metadata-metadata" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.717489 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f98ab795-6f74-421b-bce4-52128f0d7431" containerName="nova-manage" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.717499 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" containerName="nova-metadata-log" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.718453 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.720118 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.720401 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.724609 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.765392 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.766697 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70527d7e-feb4-4821-b20d-74d9634ab124-logs\") pod \"nova-metadata-0\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.766738 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.766784 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-config-data\") pod \"nova-metadata-0\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.766977 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq882\" (UniqueName: \"kubernetes.io/projected/70527d7e-feb4-4821-b20d-74d9634ab124-kube-api-access-nq882\") pod \"nova-metadata-0\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.856509 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbfcf4ac-4895-4724-a9a9-596ad8a3a6da" path="/var/lib/kubelet/pods/fbfcf4ac-4895-4724-a9a9-596ad8a3a6da/volumes" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.868249 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.868640 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70527d7e-feb4-4821-b20d-74d9634ab124-logs\") pod \"nova-metadata-0\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.868777 4747 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.868892 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-config-data\") pod \"nova-metadata-0\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.869057 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq882\" (UniqueName: \"kubernetes.io/projected/70527d7e-feb4-4821-b20d-74d9634ab124-kube-api-access-nq882\") pod \"nova-metadata-0\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.869110 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70527d7e-feb4-4821-b20d-74d9634ab124-logs\") pod \"nova-metadata-0\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.874349 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.874423 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.874603 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-config-data\") pod \"nova-metadata-0\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " pod="openstack/nova-metadata-0" Dec 05 21:04:19 crc kubenswrapper[4747]: I1205 21:04:19.886515 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq882\" (UniqueName: \"kubernetes.io/projected/70527d7e-feb4-4821-b20d-74d9634ab124-kube-api-access-nq882\") pod \"nova-metadata-0\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " pod="openstack/nova-metadata-0" Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.047355 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.507535 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:04:20 crc kubenswrapper[4747]: W1205 21:04:20.508099 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70527d7e_feb4_4821_b20d_74d9634ab124.slice/crio-15b3ca332b9ebb3bd1e9a69ad76916c072b50d1a01360ce6e7c84bdb064b0fb6 WatchSource:0}: Error finding container 15b3ca332b9ebb3bd1e9a69ad76916c072b50d1a01360ce6e7c84bdb064b0fb6: Status 404 returned error can't find the container with id 15b3ca332b9ebb3bd1e9a69ad76916c072b50d1a01360ce6e7c84bdb064b0fb6 Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.643963 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.681077 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70527d7e-feb4-4821-b20d-74d9634ab124","Type":"ContainerStarted","Data":"15b3ca332b9ebb3bd1e9a69ad76916c072b50d1a01360ce6e7c84bdb064b0fb6"} Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.686415 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-622s8\" (UniqueName: \"kubernetes.io/projected/7f2e4720-4c72-4ed6-a648-2d59f09d3137-kube-api-access-622s8\") pod \"7f2e4720-4c72-4ed6-a648-2d59f09d3137\" (UID: \"7f2e4720-4c72-4ed6-a648-2d59f09d3137\") " Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.686567 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f2e4720-4c72-4ed6-a648-2d59f09d3137-config-data\") pod \"7f2e4720-4c72-4ed6-a648-2d59f09d3137\" (UID: \"7f2e4720-4c72-4ed6-a648-2d59f09d3137\") " Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.686851 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f2e4720-4c72-4ed6-a648-2d59f09d3137-combined-ca-bundle\") pod \"7f2e4720-4c72-4ed6-a648-2d59f09d3137\" (UID: \"7f2e4720-4c72-4ed6-a648-2d59f09d3137\") " Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.691410 4747 generic.go:334] "Generic (PLEG): container finished" podID="7f2e4720-4c72-4ed6-a648-2d59f09d3137" containerID="8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a" exitCode=0 Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.691532 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7f2e4720-4c72-4ed6-a648-2d59f09d3137","Type":"ContainerDied","Data":"8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a"} Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.691625 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7f2e4720-4c72-4ed6-a648-2d59f09d3137","Type":"ContainerDied","Data":"e385e14d19b70518d6fc8f31c2c66da16d4457342929167e57042c948cac47aa"} Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.691696 4747 scope.go:117] "RemoveContainer" containerID="8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a" Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.691850 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.697904 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f2e4720-4c72-4ed6-a648-2d59f09d3137-kube-api-access-622s8" (OuterVolumeSpecName: "kube-api-access-622s8") pod "7f2e4720-4c72-4ed6-a648-2d59f09d3137" (UID: "7f2e4720-4c72-4ed6-a648-2d59f09d3137"). InnerVolumeSpecName "kube-api-access-622s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.724878 4747 scope.go:117] "RemoveContainer" containerID="8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a" Dec 05 21:04:20 crc kubenswrapper[4747]: E1205 21:04:20.725493 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a\": container with ID starting with 8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a not found: ID does not exist" containerID="8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a" Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.725524 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a"} err="failed to get container status \"8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a\": rpc error: code = NotFound desc = could not find container \"8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a\": container with ID starting with 8603c1ee01c3b6ce6d2fa5841663b76212744e032d810dddc1da65ac14eacf9a not found: ID does not exist" Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.730822 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f2e4720-4c72-4ed6-a648-2d59f09d3137-config-data" (OuterVolumeSpecName: "config-data") pod "7f2e4720-4c72-4ed6-a648-2d59f09d3137" (UID: "7f2e4720-4c72-4ed6-a648-2d59f09d3137"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.737754 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f2e4720-4c72-4ed6-a648-2d59f09d3137-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f2e4720-4c72-4ed6-a648-2d59f09d3137" (UID: "7f2e4720-4c72-4ed6-a648-2d59f09d3137"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.792767 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-622s8\" (UniqueName: \"kubernetes.io/projected/7f2e4720-4c72-4ed6-a648-2d59f09d3137-kube-api-access-622s8\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.792795 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f2e4720-4c72-4ed6-a648-2d59f09d3137-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:20 crc kubenswrapper[4747]: I1205 21:04:20.792808 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f2e4720-4c72-4ed6-a648-2d59f09d3137-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.032673 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.054711 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.065066 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:04:21 crc kubenswrapper[4747]: E1205 21:04:21.065537 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2e4720-4c72-4ed6-a648-2d59f09d3137" containerName="nova-scheduler-scheduler" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.065567 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2e4720-4c72-4ed6-a648-2d59f09d3137" containerName="nova-scheduler-scheduler" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.065837 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f2e4720-4c72-4ed6-a648-2d59f09d3137" containerName="nova-scheduler-scheduler" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.066668 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.074090 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.093359 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.097394 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-config-data\") pod \"nova-scheduler-0\" (UID: \"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9\") " pod="openstack/nova-scheduler-0" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.097470 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgwzn\" (UniqueName: \"kubernetes.io/projected/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-kube-api-access-mgwzn\") pod \"nova-scheduler-0\" (UID: \"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9\") " pod="openstack/nova-scheduler-0" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.097553 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9\") " pod="openstack/nova-scheduler-0" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.199556 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-config-data\") pod \"nova-scheduler-0\" (UID: \"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9\") " pod="openstack/nova-scheduler-0" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.199653 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgwzn\" (UniqueName: \"kubernetes.io/projected/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-kube-api-access-mgwzn\") pod \"nova-scheduler-0\" (UID: \"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9\") " pod="openstack/nova-scheduler-0" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.199738 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9\") " pod="openstack/nova-scheduler-0" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.204040 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9\") " pod="openstack/nova-scheduler-0" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.204187 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-config-data\") pod \"nova-scheduler-0\" (UID: \"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9\") " pod="openstack/nova-scheduler-0" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.218820 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgwzn\" (UniqueName: 
\"kubernetes.io/projected/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-kube-api-access-mgwzn\") pod \"nova-scheduler-0\" (UID: \"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9\") " pod="openstack/nova-scheduler-0" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.398713 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.656057 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.704149 4747 generic.go:334] "Generic (PLEG): container finished" podID="0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" containerID="4bf6f239755456ef0adabfe0b4e1a322b83be0779d8293bd7b3d1997a03e93b6" exitCode=0 Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.704219 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7","Type":"ContainerDied","Data":"4bf6f239755456ef0adabfe0b4e1a322b83be0779d8293bd7b3d1997a03e93b6"} Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.704260 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7","Type":"ContainerDied","Data":"dd2c78bc24c360f37f7c4973dc9ed1688bb61d54c97afeb52c41e86e741570a6"} Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.704283 4747 scope.go:117] "RemoveContainer" containerID="4bf6f239755456ef0adabfe0b4e1a322b83be0779d8293bd7b3d1997a03e93b6" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.704227 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.706487 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-logs\") pod \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.706534 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-config-data\") pod \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.706622 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-combined-ca-bundle\") pod \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.706685 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqntk\" (UniqueName: \"kubernetes.io/projected/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-kube-api-access-pqntk\") pod \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.706769 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-public-tls-certs\") pod \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 
21:04:21.706822 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-internal-tls-certs\") pod \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\" (UID: \"0e45aac1-6d4b-49c5-aad8-3e4246d62fa7\") " Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.708201 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70527d7e-feb4-4821-b20d-74d9634ab124","Type":"ContainerStarted","Data":"a0b910eeeb5d4311cefe6de469772978a62f3bc013456ed000b893110cb13617"} Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.708237 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70527d7e-feb4-4821-b20d-74d9634ab124","Type":"ContainerStarted","Data":"5609f29b50c7c1c36727d3f9b37fe61125617bad5be7ef04dd273321702420ba"} Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.708212 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-logs" (OuterVolumeSpecName: "logs") pod "0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" (UID: "0e45aac1-6d4b-49c5-aad8-3e4246d62fa7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.712507 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-kube-api-access-pqntk" (OuterVolumeSpecName: "kube-api-access-pqntk") pod "0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" (UID: "0e45aac1-6d4b-49c5-aad8-3e4246d62fa7"). InnerVolumeSpecName "kube-api-access-pqntk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.729050 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.729030319 podStartE2EDuration="2.729030319s" podCreationTimestamp="2025-12-05 21:04:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:04:21.725862471 +0000 UTC m=+1332.193169969" watchObservedRunningTime="2025-12-05 21:04:21.729030319 +0000 UTC m=+1332.196337807" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.732991 4747 scope.go:117] "RemoveContainer" containerID="55d7b1348f26b9761653a77f2e9ab1b5ad07bc8168ec754b06eb37a888a14b0d" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.736672 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-config-data" (OuterVolumeSpecName: "config-data") pod "0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" (UID: "0e45aac1-6d4b-49c5-aad8-3e4246d62fa7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.747658 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" (UID: "0e45aac1-6d4b-49c5-aad8-3e4246d62fa7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.752944 4747 scope.go:117] "RemoveContainer" containerID="4bf6f239755456ef0adabfe0b4e1a322b83be0779d8293bd7b3d1997a03e93b6" Dec 05 21:04:21 crc kubenswrapper[4747]: E1205 21:04:21.753812 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bf6f239755456ef0adabfe0b4e1a322b83be0779d8293bd7b3d1997a03e93b6\": container with ID starting with 4bf6f239755456ef0adabfe0b4e1a322b83be0779d8293bd7b3d1997a03e93b6 not found: ID does not exist" containerID="4bf6f239755456ef0adabfe0b4e1a322b83be0779d8293bd7b3d1997a03e93b6" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.753861 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf6f239755456ef0adabfe0b4e1a322b83be0779d8293bd7b3d1997a03e93b6"} err="failed to get container status \"4bf6f239755456ef0adabfe0b4e1a322b83be0779d8293bd7b3d1997a03e93b6\": rpc error: code = NotFound desc = could not find container \"4bf6f239755456ef0adabfe0b4e1a322b83be0779d8293bd7b3d1997a03e93b6\": container with ID starting with 4bf6f239755456ef0adabfe0b4e1a322b83be0779d8293bd7b3d1997a03e93b6 not found: ID does not exist" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.753893 4747 scope.go:117] "RemoveContainer" containerID="55d7b1348f26b9761653a77f2e9ab1b5ad07bc8168ec754b06eb37a888a14b0d" Dec 05 21:04:21 crc kubenswrapper[4747]: E1205 21:04:21.754265 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55d7b1348f26b9761653a77f2e9ab1b5ad07bc8168ec754b06eb37a888a14b0d\": container with ID starting with 55d7b1348f26b9761653a77f2e9ab1b5ad07bc8168ec754b06eb37a888a14b0d not found: ID does not exist" containerID="55d7b1348f26b9761653a77f2e9ab1b5ad07bc8168ec754b06eb37a888a14b0d" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.754304 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55d7b1348f26b9761653a77f2e9ab1b5ad07bc8168ec754b06eb37a888a14b0d"} err="failed to get container status \"55d7b1348f26b9761653a77f2e9ab1b5ad07bc8168ec754b06eb37a888a14b0d\": rpc error: code = NotFound desc = could not find container \"55d7b1348f26b9761653a77f2e9ab1b5ad07bc8168ec754b06eb37a888a14b0d\": container with ID starting with 55d7b1348f26b9761653a77f2e9ab1b5ad07bc8168ec754b06eb37a888a14b0d not found: ID does not exist" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.762461 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" (UID: "0e45aac1-6d4b-49c5-aad8-3e4246d62fa7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.773166 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" (UID: "0e45aac1-6d4b-49c5-aad8-3e4246d62fa7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.809269 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.809302 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.809345 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqntk\" (UniqueName: \"kubernetes.io/projected/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-kube-api-access-pqntk\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.809355 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.809363 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.809371 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.836452 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:04:21 crc kubenswrapper[4747]: I1205 21:04:21.858239 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f2e4720-4c72-4ed6-a648-2d59f09d3137" path="/var/lib/kubelet/pods/7f2e4720-4c72-4ed6-a648-2d59f09d3137/volumes" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.032655 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.051956 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.078103 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 21:04:22 crc kubenswrapper[4747]: E1205 21:04:22.078557 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" containerName="nova-api-log" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.078572 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" containerName="nova-api-log" Dec 05 21:04:22 crc kubenswrapper[4747]: E1205 21:04:22.078618 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" containerName="nova-api-api" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.078628 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" containerName="nova-api-api" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.078836 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" containerName="nova-api-api" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.078860 4747 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" containerName="nova-api-log" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.080150 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.080250 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.083018 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.083284 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.083342 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.113336 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnsdv\" (UniqueName: \"kubernetes.io/projected/60c19006-e7b7-4c36-847a-a52358ae6a99-kube-api-access-bnsdv\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.113420 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-config-data\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.113463 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.113500 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-internal-tls-certs\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.113523 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60c19006-e7b7-4c36-847a-a52358ae6a99-logs\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.113606 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-public-tls-certs\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.216202 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-config-data\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: 
I1205 21:04:22.216288 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.216340 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-internal-tls-certs\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.216371 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60c19006-e7b7-4c36-847a-a52358ae6a99-logs\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.216674 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-public-tls-certs\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.216722 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnsdv\" (UniqueName: \"kubernetes.io/projected/60c19006-e7b7-4c36-847a-a52358ae6a99-kube-api-access-bnsdv\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.217254 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60c19006-e7b7-4c36-847a-a52358ae6a99-logs\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.221320 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-config-data\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.223852 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-internal-tls-certs\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.229657 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-public-tls-certs\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.230390 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.232722 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bnsdv\" (UniqueName: \"kubernetes.io/projected/60c19006-e7b7-4c36-847a-a52358ae6a99-kube-api-access-bnsdv\") pod \"nova-api-0\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.407041 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.725698 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9","Type":"ContainerStarted","Data":"1ca555e9c34cfc96cf0c94575486e1a9582744067ce3a91bdb99945c6f7243b5"} Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.726017 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9","Type":"ContainerStarted","Data":"ff16ee1bcbefc5b718ca75caa07bd7eb87765d146569669e8a795a2be1beefd2"} Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.754348 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.754328173 podStartE2EDuration="1.754328173s" podCreationTimestamp="2025-12-05 21:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:04:22.744285125 +0000 UTC m=+1333.211592633" watchObservedRunningTime="2025-12-05 21:04:22.754328173 +0000 UTC m=+1333.221635651" Dec 05 21:04:22 crc kubenswrapper[4747]: W1205 21:04:22.875750 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60c19006_e7b7_4c36_847a_a52358ae6a99.slice/crio-2c84aa6a831fb3f2d3ce2ddc737b5bfc3edfce5147112056eced45bfd85b4b4d WatchSource:0}: Error finding container 2c84aa6a831fb3f2d3ce2ddc737b5bfc3edfce5147112056eced45bfd85b4b4d: Status 404 returned error can't find the container with id 2c84aa6a831fb3f2d3ce2ddc737b5bfc3edfce5147112056eced45bfd85b4b4d Dec 05 21:04:22 crc kubenswrapper[4747]: I1205 21:04:22.878971 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:04:23 crc kubenswrapper[4747]: I1205 21:04:23.737852 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60c19006-e7b7-4c36-847a-a52358ae6a99","Type":"ContainerStarted","Data":"b4ea3b3b35eb186a6415e9ce4ffe9c01f8217ca9dfbb6942810043a66549038a"} Dec 05 21:04:23 crc kubenswrapper[4747]: I1205 21:04:23.738161 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60c19006-e7b7-4c36-847a-a52358ae6a99","Type":"ContainerStarted","Data":"830bd4d5341560a621378cd94f40e4c4b0401787664fe8a6b8244fa2e35b90f2"} Dec 05 21:04:23 crc kubenswrapper[4747]: I1205 21:04:23.738179 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60c19006-e7b7-4c36-847a-a52358ae6a99","Type":"ContainerStarted","Data":"2c84aa6a831fb3f2d3ce2ddc737b5bfc3edfce5147112056eced45bfd85b4b4d"} Dec 05 21:04:23 crc kubenswrapper[4747]: I1205 21:04:23.767177 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.76713385 podStartE2EDuration="1.76713385s" podCreationTimestamp="2025-12-05 21:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-05 21:04:23.755918412 +0000 UTC m=+1334.223225920" watchObservedRunningTime="2025-12-05 21:04:23.76713385 +0000 UTC m=+1334.234441348" Dec 05 21:04:23 crc kubenswrapper[4747]: I1205 21:04:23.850747 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e45aac1-6d4b-49c5-aad8-3e4246d62fa7" path="/var/lib/kubelet/pods/0e45aac1-6d4b-49c5-aad8-3e4246d62fa7/volumes" Dec 05 21:04:25 crc kubenswrapper[4747]: I1205 21:04:25.047924 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 21:04:25 crc kubenswrapper[4747]: I1205 21:04:25.048340 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 21:04:26 crc kubenswrapper[4747]: I1205 21:04:26.400401 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 21:04:30 crc kubenswrapper[4747]: I1205 21:04:30.048022 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 21:04:30 crc kubenswrapper[4747]: I1205 21:04:30.048554 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 21:04:31 crc kubenswrapper[4747]: I1205 21:04:31.062866 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="70527d7e-feb4-4821-b20d-74d9634ab124" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 21:04:31 crc kubenswrapper[4747]: I1205 21:04:31.062867 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="70527d7e-feb4-4821-b20d-74d9634ab124" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 21:04:31 crc kubenswrapper[4747]: I1205 21:04:31.400214 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 21:04:31 crc kubenswrapper[4747]: I1205 21:04:31.446277 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 21:04:31 crc kubenswrapper[4747]: I1205 21:04:31.911441 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 21:04:32 crc kubenswrapper[4747]: I1205 21:04:32.409246 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 21:04:32 crc kubenswrapper[4747]: I1205 21:04:32.410141 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 21:04:33 crc kubenswrapper[4747]: I1205 21:04:33.407976 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="60c19006-e7b7-4c36-847a-a52358ae6a99" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 21:04:33 crc kubenswrapper[4747]: I1205 21:04:33.412902 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="60c19006-e7b7-4c36-847a-a52358ae6a99" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Dec 05 21:04:33 crc kubenswrapper[4747]: I1205 21:04:33.958286 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 21:04:36 crc kubenswrapper[4747]: I1205 21:04:36.222542 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:04:36 crc kubenswrapper[4747]: I1205 21:04:36.222949 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:04:40 crc kubenswrapper[4747]: I1205 21:04:40.055491 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 21:04:40 crc kubenswrapper[4747]: I1205 21:04:40.058337 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 21:04:40 crc kubenswrapper[4747]: I1205 21:04:40.067699 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 21:04:40 crc kubenswrapper[4747]: I1205 21:04:40.950297 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 21:04:42 crc kubenswrapper[4747]: I1205 21:04:42.415852 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 21:04:42 crc kubenswrapper[4747]: I1205 21:04:42.416409 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 21:04:42 crc kubenswrapper[4747]: I1205 21:04:42.420187 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 21:04:42 crc kubenswrapper[4747]: I1205 21:04:42.422957 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 21:04:42 crc kubenswrapper[4747]: I1205 21:04:42.958727 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 21:04:42 crc kubenswrapper[4747]: I1205 21:04:42.964064 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 21:04:59 crc kubenswrapper[4747]: I1205 21:04:59.746500 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 05 21:04:59 crc kubenswrapper[4747]: I1205 21:04:59.758894 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="9a37ca14-4557-4c5c-b8b8-be6776516c83" containerName="openstackclient" containerID="cri-o://50b55da7432690c6f59d396246de6b3e8a89b3871ba09c858ae1541632412c92" gracePeriod=2 Dec 05 21:04:59 crc kubenswrapper[4747]: I1205 21:04:59.768835 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 05 21:04:59 crc kubenswrapper[4747]: E1205 21:04:59.961422 4747 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Dec 05 21:04:59 crc kubenswrapper[4747]: E1205 21:04:59.961453 4747 projected.go:194] Error preparing data for projected volume etc-swift for pod 
openstack/swift-proxy-77d8776c8f-zkk2b: secret "swift-conf" not found Dec 05 21:04:59 crc kubenswrapper[4747]: E1205 21:04:59.963177 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-etc-swift podName:f5bfe363-0d37-4380-93a8-dc7ea1ad3392 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:00.463161274 +0000 UTC m=+1370.930468762 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-etc-swift") pod "swift-proxy-77d8776c8f-zkk2b" (UID: "f5bfe363-0d37-4380-93a8-dc7ea1ad3392") : secret "swift-conf" not found Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.026644 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder067d-account-delete-tcx7x"] Dec 05 21:05:00 crc kubenswrapper[4747]: E1205 21:05:00.027090 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a37ca14-4557-4c5c-b8b8-be6776516c83" containerName="openstackclient" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.027110 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a37ca14-4557-4c5c-b8b8-be6776516c83" containerName="openstackclient" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.027304 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a37ca14-4557-4c5c-b8b8-be6776516c83" containerName="openstackclient" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.040400 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder067d-account-delete-tcx7x" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.062895 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60728349-9aa2-474c-b935-6fc915822d4e-operator-scripts\") pod \"cinder067d-account-delete-tcx7x\" (UID: \"60728349-9aa2-474c-b935-6fc915822d4e\") " pod="openstack/cinder067d-account-delete-tcx7x" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.062975 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lr48\" (UniqueName: \"kubernetes.io/projected/60728349-9aa2-474c-b935-6fc915822d4e-kube-api-access-4lr48\") pod \"cinder067d-account-delete-tcx7x\" (UID: \"60728349-9aa2-474c-b935-6fc915822d4e\") " pod="openstack/cinder067d-account-delete-tcx7x" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.073681 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.103814 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder067d-account-delete-tcx7x"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.120149 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.120481 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="9cc30174-a5ca-454e-b049-59a62d358d8a" containerName="openstack-network-exporter" containerID="cri-o://c2b5ad7c71ac902cf882c6a823d0b022b018b3f0655877bb3292fa03d68e3bae" gracePeriod=300 Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.165973 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/60728349-9aa2-474c-b935-6fc915822d4e-operator-scripts\") pod \"cinder067d-account-delete-tcx7x\" (UID: \"60728349-9aa2-474c-b935-6fc915822d4e\") " pod="openstack/cinder067d-account-delete-tcx7x" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.166052 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lr48\" (UniqueName: \"kubernetes.io/projected/60728349-9aa2-474c-b935-6fc915822d4e-kube-api-access-4lr48\") pod \"cinder067d-account-delete-tcx7x\" (UID: \"60728349-9aa2-474c-b935-6fc915822d4e\") " pod="openstack/cinder067d-account-delete-tcx7x" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.167678 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60728349-9aa2-474c-b935-6fc915822d4e-operator-scripts\") pod \"cinder067d-account-delete-tcx7x\" (UID: \"60728349-9aa2-474c-b935-6fc915822d4e\") " pod="openstack/cinder067d-account-delete-tcx7x" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.204161 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lr48\" (UniqueName: \"kubernetes.io/projected/60728349-9aa2-474c-b935-6fc915822d4e-kube-api-access-4lr48\") pod \"cinder067d-account-delete-tcx7x\" (UID: \"60728349-9aa2-474c-b935-6fc915822d4e\") " pod="openstack/cinder067d-account-delete-tcx7x" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.242032 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement8da4-account-delete-8nv8m"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.243414 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement8da4-account-delete-8nv8m" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.254313 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement8da4-account-delete-8nv8m"] Dec 05 21:05:00 crc kubenswrapper[4747]: E1205 21:05:00.272283 4747 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 05 21:05:00 crc kubenswrapper[4747]: E1205 21:05:00.272355 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data podName:70db507e-cc84-4722-8ac8-fd659c2803b8 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:00.772337316 +0000 UTC m=+1371.239644804 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data") pod "rabbitmq-cell1-server-0" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8") : configmap "rabbitmq-cell1-config-data" not found Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.289675 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glancee01d-account-delete-6jnvw"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.291088 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glancee01d-account-delete-6jnvw" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.343760 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="9cc30174-a5ca-454e-b049-59a62d358d8a" containerName="ovsdbserver-sb" containerID="cri-o://0e2fa2cf0521e9e0a9ee440e9496bbd60e565938c9bcb968eb76b83f8a5212ad" gracePeriod=300 Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.396922 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/722451d6-f405-4bdc-b418-bfdaa2000197-operator-scripts\") pod \"glancee01d-account-delete-6jnvw\" (UID: \"722451d6-f405-4bdc-b418-bfdaa2000197\") " pod="openstack/glancee01d-account-delete-6jnvw" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.396983 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42hwk\" (UniqueName: \"kubernetes.io/projected/722451d6-f405-4bdc-b418-bfdaa2000197-kube-api-access-42hwk\") pod \"glancee01d-account-delete-6jnvw\" (UID: \"722451d6-f405-4bdc-b418-bfdaa2000197\") " pod="openstack/glancee01d-account-delete-6jnvw" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.397060 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpq9n\" (UniqueName: \"kubernetes.io/projected/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-kube-api-access-vpq9n\") pod \"placement8da4-account-delete-8nv8m\" (UID: \"67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0\") " pod="openstack/placement8da4-account-delete-8nv8m" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.397544 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder067d-account-delete-tcx7x" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.399408 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts\") pod \"placement8da4-account-delete-8nv8m\" (UID: \"67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0\") " pod="openstack/placement8da4-account-delete-8nv8m" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.433949 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glancee01d-account-delete-6jnvw"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.470171 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-rmcvl"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.480314 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-rmcvl"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.506921 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts\") pod \"placement8da4-account-delete-8nv8m\" (UID: \"67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0\") " pod="openstack/placement8da4-account-delete-8nv8m" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.506998 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/722451d6-f405-4bdc-b418-bfdaa2000197-operator-scripts\") pod \"glancee01d-account-delete-6jnvw\" (UID: \"722451d6-f405-4bdc-b418-bfdaa2000197\") " pod="openstack/glancee01d-account-delete-6jnvw" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.507025 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42hwk\" (UniqueName: \"kubernetes.io/projected/722451d6-f405-4bdc-b418-bfdaa2000197-kube-api-access-42hwk\") pod \"glancee01d-account-delete-6jnvw\" (UID: \"722451d6-f405-4bdc-b418-bfdaa2000197\") " pod="openstack/glancee01d-account-delete-6jnvw" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.507073 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpq9n\" (UniqueName: \"kubernetes.io/projected/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-kube-api-access-vpq9n\") pod \"placement8da4-account-delete-8nv8m\" (UID: \"67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0\") " pod="openstack/placement8da4-account-delete-8nv8m" Dec 05 21:05:00 crc kubenswrapper[4747]: E1205 21:05:00.507260 4747 projected.go:263] Couldn't get secret openstack/swift-proxy-config-data: secret "swift-proxy-config-data" not found Dec 05 21:05:00 crc kubenswrapper[4747]: E1205 21:05:00.507281 4747 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Dec 05 21:05:00 crc kubenswrapper[4747]: E1205 21:05:00.507291 4747 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 21:05:00 crc kubenswrapper[4747]: E1205 21:05:00.507302 4747 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-77d8776c8f-zkk2b: [secret "swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 05 21:05:00 crc kubenswrapper[4747]: E1205 21:05:00.507349 4747 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-etc-swift podName:f5bfe363-0d37-4380-93a8-dc7ea1ad3392 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:01.507331035 +0000 UTC m=+1371.974638523 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-etc-swift") pod "swift-proxy-77d8776c8f-zkk2b" (UID: "f5bfe363-0d37-4380-93a8-dc7ea1ad3392") : [secret "swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.507659 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts\") pod \"placement8da4-account-delete-8nv8m\" (UID: \"67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0\") " pod="openstack/placement8da4-account-delete-8nv8m" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.508035 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/722451d6-f405-4bdc-b418-bfdaa2000197-operator-scripts\") pod \"glancee01d-account-delete-6jnvw\" (UID: \"722451d6-f405-4bdc-b418-bfdaa2000197\") " pod="openstack/glancee01d-account-delete-6jnvw" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.537834 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbicancd71-account-delete-gjncb"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.539002 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbicancd71-account-delete-gjncb" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.548518 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpq9n\" (UniqueName: \"kubernetes.io/projected/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-kube-api-access-vpq9n\") pod \"placement8da4-account-delete-8nv8m\" (UID: \"67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0\") " pod="openstack/placement8da4-account-delete-8nv8m" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.558858 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.571963 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-x4x47"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.580411 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42hwk\" (UniqueName: \"kubernetes.io/projected/722451d6-f405-4bdc-b418-bfdaa2000197-kube-api-access-42hwk\") pod \"glancee01d-account-delete-6jnvw\" (UID: \"722451d6-f405-4bdc-b418-bfdaa2000197\") " pod="openstack/glancee01d-account-delete-6jnvw" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.616151 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement8da4-account-delete-8nv8m" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.619892 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-x4x47"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.667157 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glancee01d-account-delete-6jnvw" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.697540 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicancd71-account-delete-gjncb"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.713346 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gkkr\" (UniqueName: \"kubernetes.io/projected/b2f34a29-aa43-46a7-9fa8-8eead0c7ba07-kube-api-access-9gkkr\") pod \"barbicancd71-account-delete-gjncb\" (UID: \"b2f34a29-aa43-46a7-9fa8-8eead0c7ba07\") " pod="openstack/barbicancd71-account-delete-gjncb" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.713474 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2f34a29-aa43-46a7-9fa8-8eead0c7ba07-operator-scripts\") pod \"barbicancd71-account-delete-gjncb\" (UID: \"b2f34a29-aa43-46a7-9fa8-8eead0c7ba07\") " pod="openstack/barbicancd71-account-delete-gjncb" Dec 05 21:05:00 crc kubenswrapper[4747]: E1205 21:05:00.713748 4747 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 05 21:05:00 crc kubenswrapper[4747]: E1205 21:05:00.713810 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data podName:d49bd09d-90af-4f00-a333-0e292c581525 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:01.213793999 +0000 UTC m=+1371.681101477 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data") pod "rabbitmq-server-0" (UID: "d49bd09d-90af-4f00-a333-0e292c581525") : configmap "rabbitmq-config-data" not found Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.736161 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron7680-account-delete-f82hm"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.737680 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron7680-account-delete-f82hm" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.756286 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron7680-account-delete-f82hm"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.795819 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.796032 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="7528af7d-8d58-4f20-ac03-b9b62e14c9e2" containerName="ovn-northd" containerID="cri-o://264a34a52b2bcb52aa5db288abbbf43e65fd0f92091e50d839d9c4ffaab07034" gracePeriod=30 Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.796378 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="7528af7d-8d58-4f20-ac03-b9b62e14c9e2" containerName="openstack-network-exporter" containerID="cri-o://a9dcbb54d496caf2f805d60df04f610d71d46e34c98b40703cbee95f6436ad6a" gracePeriod=30 Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.819469 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2f34a29-aa43-46a7-9fa8-8eead0c7ba07-operator-scripts\") pod \"barbicancd71-account-delete-gjncb\" (UID: \"b2f34a29-aa43-46a7-9fa8-8eead0c7ba07\") " pod="openstack/barbicancd71-account-delete-gjncb" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.820274 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl7cl\" (UniqueName: \"kubernetes.io/projected/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-kube-api-access-dl7cl\") pod \"neutron7680-account-delete-f82hm\" (UID: \"6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a\") " pod="openstack/neutron7680-account-delete-f82hm" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.820324 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gkkr\" (UniqueName: \"kubernetes.io/projected/b2f34a29-aa43-46a7-9fa8-8eead0c7ba07-kube-api-access-9gkkr\") pod \"barbicancd71-account-delete-gjncb\" (UID: \"b2f34a29-aa43-46a7-9fa8-8eead0c7ba07\") " pod="openstack/barbicancd71-account-delete-gjncb" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.820355 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts\") pod \"neutron7680-account-delete-f82hm\" (UID: \"6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a\") " pod="openstack/neutron7680-account-delete-f82hm" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.821483 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2f34a29-aa43-46a7-9fa8-8eead0c7ba07-operator-scripts\") pod \"barbicancd71-account-delete-gjncb\" (UID: \"b2f34a29-aa43-46a7-9fa8-8eead0c7ba07\") " pod="openstack/barbicancd71-account-delete-gjncb" Dec 05 21:05:00 crc kubenswrapper[4747]: E1205 21:05:00.821548 4747 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.829206 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-8wqh8"] Dec 05 21:05:00 crc kubenswrapper[4747]: 
E1205 21:05:00.835091 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data podName:70db507e-cc84-4722-8ac8-fd659c2803b8 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:01.835038696 +0000 UTC m=+1372.302346184 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data") pod "rabbitmq-cell1-server-0" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8") : configmap "rabbitmq-cell1-config-data" not found Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.835136 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" podUID="bf40d0cc-0d14-4c55-986d-2809df27c4fd" containerName="dnsmasq-dns" containerID="cri-o://ebc61242a667918cfc3cf1987e66b55a1c850c340da8529513e354533f3f843e" gracePeriod=10 Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.919668 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi48b7-account-delete-25wjk"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.921050 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi48b7-account-delete-25wjk" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.922819 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl7cl\" (UniqueName: \"kubernetes.io/projected/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-kube-api-access-dl7cl\") pod \"neutron7680-account-delete-f82hm\" (UID: \"6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a\") " pod="openstack/neutron7680-account-delete-f82hm" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.922882 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts\") pod \"neutron7680-account-delete-f82hm\" (UID: \"6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a\") " pod="openstack/neutron7680-account-delete-f82hm" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.943965 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts\") pod \"neutron7680-account-delete-f82hm\" (UID: \"6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a\") " pod="openstack/neutron7680-account-delete-f82hm" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.950305 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gkkr\" (UniqueName: \"kubernetes.io/projected/b2f34a29-aa43-46a7-9fa8-8eead0c7ba07-kube-api-access-9gkkr\") pod \"barbicancd71-account-delete-gjncb\" (UID: \"b2f34a29-aa43-46a7-9fa8-8eead0c7ba07\") " pod="openstack/barbicancd71-account-delete-gjncb" Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.952005 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi48b7-account-delete-25wjk"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.966281 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-ncncg"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.974651 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-5jxck"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.981989 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-db-sync-ncncg"] Dec 05 21:05:00 crc kubenswrapper[4747]: I1205 21:05:00.991833 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-5jxck"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.002375 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbicancd71-account-delete-gjncb" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.024850 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47sc2\" (UniqueName: \"kubernetes.io/projected/bcb08da7-0b5d-437e-8882-4d352f3c2d47-kube-api-access-47sc2\") pod \"novaapi48b7-account-delete-25wjk\" (UID: \"bcb08da7-0b5d-437e-8882-4d352f3c2d47\") " pod="openstack/novaapi48b7-account-delete-25wjk" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.024942 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts\") pod \"novaapi48b7-account-delete-25wjk\" (UID: \"bcb08da7-0b5d-437e-8882-4d352f3c2d47\") " pod="openstack/novaapi48b7-account-delete-25wjk" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.026240 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl7cl\" (UniqueName: \"kubernetes.io/projected/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-kube-api-access-dl7cl\") pod \"neutron7680-account-delete-f82hm\" (UID: \"6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a\") " pod="openstack/neutron7680-account-delete-f82hm" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.026966 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell033f5-account-delete-dk9n6"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.028675 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell033f5-account-delete-dk9n6" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.037977 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell033f5-account-delete-dk9n6"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.055654 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron7680-account-delete-f82hm" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.107701 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fvckl"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.130125 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47sc2\" (UniqueName: \"kubernetes.io/projected/bcb08da7-0b5d-437e-8882-4d352f3c2d47-kube-api-access-47sc2\") pod \"novaapi48b7-account-delete-25wjk\" (UID: \"bcb08da7-0b5d-437e-8882-4d352f3c2d47\") " pod="openstack/novaapi48b7-account-delete-25wjk" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.130233 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2pwv\" (UniqueName: \"kubernetes.io/projected/51241ef0-cffe-40ba-bea9-1c01b7cc4184-kube-api-access-c2pwv\") pod \"novacell033f5-account-delete-dk9n6\" (UID: \"51241ef0-cffe-40ba-bea9-1c01b7cc4184\") " pod="openstack/novacell033f5-account-delete-dk9n6" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.130286 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts\") pod \"novaapi48b7-account-delete-25wjk\" (UID: \"bcb08da7-0b5d-437e-8882-4d352f3c2d47\") " pod="openstack/novaapi48b7-account-delete-25wjk" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.130439 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts\") pod \"novacell033f5-account-delete-dk9n6\" (UID: \"51241ef0-cffe-40ba-bea9-1c01b7cc4184\") " pod="openstack/novacell033f5-account-delete-dk9n6" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.130952 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts\") pod \"novaapi48b7-account-delete-25wjk\" (UID: \"bcb08da7-0b5d-437e-8882-4d352f3c2d47\") " pod="openstack/novaapi48b7-account-delete-25wjk" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.146559 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-fvckl"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.166105 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ns2k6"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.183783 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.184254 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="075d1135-1337-43d3-83e1-97b942a03786" containerName="cinder-scheduler" containerID="cri-o://f8bc67466c94b3c50d9c65f0be95a52aaf42b7cbc0b7be9125d7111ca8f375c9" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.184377 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="075d1135-1337-43d3-83e1-97b942a03786" containerName="probe" containerID="cri-o://b7d08e884c92fb4e03551bbd9a3a91bac164d801a2b9b5ac7e804cc1a663aa21" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.209535 4747 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/ovn-controller-ovs-4d2dp"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.220319 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47sc2\" (UniqueName: \"kubernetes.io/projected/bcb08da7-0b5d-437e-8882-4d352f3c2d47-kube-api-access-47sc2\") pod \"novaapi48b7-account-delete-25wjk\" (UID: \"bcb08da7-0b5d-437e-8882-4d352f3c2d47\") " pod="openstack/novaapi48b7-account-delete-25wjk" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.232880 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts\") pod \"novacell033f5-account-delete-dk9n6\" (UID: \"51241ef0-cffe-40ba-bea9-1c01b7cc4184\") " pod="openstack/novacell033f5-account-delete-dk9n6" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.233008 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2pwv\" (UniqueName: \"kubernetes.io/projected/51241ef0-cffe-40ba-bea9-1c01b7cc4184-kube-api-access-c2pwv\") pod \"novacell033f5-account-delete-dk9n6\" (UID: \"51241ef0-cffe-40ba-bea9-1c01b7cc4184\") " pod="openstack/novacell033f5-account-delete-dk9n6" Dec 05 21:05:01 crc kubenswrapper[4747]: E1205 21:05:01.233330 4747 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 05 21:05:01 crc kubenswrapper[4747]: E1205 21:05:01.233376 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data podName:d49bd09d-90af-4f00-a333-0e292c581525 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:02.233361093 +0000 UTC m=+1372.700668581 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data") pod "rabbitmq-server-0" (UID: "d49bd09d-90af-4f00-a333-0e292c581525") : configmap "rabbitmq-config-data" not found Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.241767 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts\") pod \"novacell033f5-account-delete-dk9n6\" (UID: \"51241ef0-cffe-40ba-bea9-1c01b7cc4184\") " pod="openstack/novacell033f5-account-delete-dk9n6" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.245002 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-5jh4k"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.245246 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-5jh4k" podUID="8bc768ee-26e3-43fb-b64d-7fc3f3c627c5" containerName="openstack-network-exporter" containerID="cri-o://5e62b0a320297c30e0688db6c5b330d66aadcb57092284da879891e956764e37" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.270106 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi48b7-account-delete-25wjk" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.278006 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2pwv\" (UniqueName: \"kubernetes.io/projected/51241ef0-cffe-40ba-bea9-1c01b7cc4184-kube-api-access-c2pwv\") pod \"novacell033f5-account-delete-dk9n6\" (UID: \"51241ef0-cffe-40ba-bea9-1c01b7cc4184\") " pod="openstack/novacell033f5-account-delete-dk9n6" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.293089 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9cc30174-a5ca-454e-b049-59a62d358d8a/ovsdbserver-sb/0.log" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.293279 4747 generic.go:334] "Generic (PLEG): container finished" podID="9cc30174-a5ca-454e-b049-59a62d358d8a" containerID="c2b5ad7c71ac902cf882c6a823d0b022b018b3f0655877bb3292fa03d68e3bae" exitCode=2 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.293352 4747 generic.go:334] "Generic (PLEG): container finished" podID="9cc30174-a5ca-454e-b049-59a62d358d8a" containerID="0e2fa2cf0521e9e0a9ee440e9496bbd60e565938c9bcb968eb76b83f8a5212ad" exitCode=143 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.293438 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9cc30174-a5ca-454e-b049-59a62d358d8a","Type":"ContainerDied","Data":"c2b5ad7c71ac902cf882c6a823d0b022b018b3f0655877bb3292fa03d68e3bae"} Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.293520 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9cc30174-a5ca-454e-b049-59a62d358d8a","Type":"ContainerDied","Data":"0e2fa2cf0521e9e0a9ee440e9496bbd60e565938c9bcb968eb76b83f8a5212ad"} Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.303099 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-2ns9l"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.333083 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-2ns9l"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.373039 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.377685 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="62399e6a-577a-4d10-b057-49c4bae7a172" containerName="cinder-api-log" containerID="cri-o://454cb64b4de071bc5408ffd45978d6472ff63dcb85495e7423c6ead0953090f8" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.381541 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="62399e6a-577a-4d10-b057-49c4bae7a172" containerName="cinder-api" containerID="cri-o://7ad7169da0407cc9fa1fb2412f29f1fddcb02d5a765a42193125782a41cbb169" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.387028 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell033f5-account-delete-dk9n6" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.481639 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-646c94b678-5975b"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.482406 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-646c94b678-5975b" podUID="466afd16-6e7d-42fd-bd82-cabab660b344" containerName="placement-log" containerID="cri-o://44bc0bd3c6f0a47cd14c827e342132010b773d5ce7c6430fcd671f4d1d9ea58a" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.482757 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-646c94b678-5975b" podUID="466afd16-6e7d-42fd-bd82-cabab660b344" containerName="placement-api" containerID="cri-o://450bad02541ca9f2045559c849365b5204f10379fb98f7f20ee63d82eaedc639" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.514573 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.515171 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="6d835978-9804-4427-9dd4-48c40ad829c5" containerName="openstack-network-exporter" containerID="cri-o://63ad63b88143db69d914307efba21c320e1a83fe77f4e07c2a03bf733e63c430" gracePeriod=300 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.529222 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_7528af7d-8d58-4f20-ac03-b9b62e14c9e2/ovn-northd/0.log" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.529275 4747 generic.go:334] "Generic (PLEG): container finished" podID="7528af7d-8d58-4f20-ac03-b9b62e14c9e2" containerID="264a34a52b2bcb52aa5db288abbbf43e65fd0f92091e50d839d9c4ffaab07034" exitCode=143 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.529311 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7528af7d-8d58-4f20-ac03-b9b62e14c9e2","Type":"ContainerDied","Data":"264a34a52b2bcb52aa5db288abbbf43e65fd0f92091e50d839d9c4ffaab07034"} Dec 05 21:05:01 crc kubenswrapper[4747]: E1205 21:05:01.549139 4747 projected.go:263] Couldn't get secret openstack/swift-proxy-config-data: secret "swift-proxy-config-data" not found Dec 05 21:05:01 crc kubenswrapper[4747]: E1205 21:05:01.549169 4747 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Dec 05 21:05:01 crc kubenswrapper[4747]: E1205 21:05:01.549186 4747 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 21:05:01 crc kubenswrapper[4747]: E1205 21:05:01.549198 4747 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-77d8776c8f-zkk2b: [secret "swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 05 21:05:01 crc kubenswrapper[4747]: E1205 21:05:01.549256 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-etc-swift podName:f5bfe363-0d37-4380-93a8-dc7ea1ad3392 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:03.54923927 +0000 UTC m=+1374.016546758 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-etc-swift") pod "swift-proxy-77d8776c8f-zkk2b" (UID: "f5bfe363-0d37-4380-93a8-dc7ea1ad3392") : [secret "swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.566441 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-zqqvw"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.646189 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ns2k6" podUID="c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" containerName="ovn-controller" probeResult="failure" output=< Dec 05 21:05:01 crc kubenswrapper[4747]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 21:05:01 crc kubenswrapper[4747]: > Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.667702 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-6pxgq"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.695637 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-zqqvw"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.715391 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-6pxgq"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.751346 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.752146 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-server" containerID="cri-o://dc4ee1b50dd1745ed998bec26be98d427e85f0f14c52da0a5a910ef4a7470be0" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.752500 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="swift-recon-cron" containerID="cri-o://486c4e14776dcac9c407e97103f33e69775d09dadf5c06010aefbcfb88eedabf" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.752551 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="rsync" containerID="cri-o://b38b92392e194449e12d772885bfbe0631d28151913d40ee252a46ebfcac23a9" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.752616 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-expirer" containerID="cri-o://0da635b7389cbe492ef8068ecb92ce39082021af205f7581dc37d0416b62d81d" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.752661 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-updater" containerID="cri-o://2ed7bcbfbf04d5a9964de2fcbd02ac674b695f20ec08f744e393a02dbb2319a1" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.752708 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" 
containerName="container-server" containerID="cri-o://a9b9bd53a229a7fb0fd49713a723ca5f20b761cdb3553b0052642a04bcd05a2b" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.752712 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="container-replicator" containerID="cri-o://d7cf0106124fcf9a17a54c633a63891b2f79f3e809e2de080af793585eed32c2" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.752760 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-reaper" containerID="cri-o://e06576b6cbbf857bd731def8848ccf981cbbccf87721760f51553e2b820943bf" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.752804 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-auditor" containerID="cri-o://b1cda5f48990643299a63d3ae2da74d21f7ef5b4082ffa29af733c5ee782a4e4" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.752830 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-auditor" containerID="cri-o://c58946c3b64e9f0075c91ec12003bcf8230437fdc679cb562ed1bcfde6656f17" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.752843 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-replicator" containerID="cri-o://a606768279b9932799614e2c0267efce661411007308c7fc421cef56fc7ab98d" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.752874 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-replicator" containerID="cri-o://8df9c3dc8ff772526cc2df3f4cf0b95ab3c61db6ed0714a109c7c21fc522f7ca" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.752906 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-server" containerID="cri-o://5e1fecd6cc1598b9e147bbd014d5d212dbafa891089d5772bdb6d3d1d8de6c53" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.752924 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="container-updater" containerID="cri-o://9d583a1e65e3cc8bb6a32ad2fd1c4bf96b0bd47ed6506d82064a31bc2655c7b0" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.752943 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="container-auditor" containerID="cri-o://50395557a436afbba90e0822ef360ef212729acd9b1d1016e656ed537b583273" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.759979 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder067d-account-delete-tcx7x"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.766526 4747 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.766813 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="33e4dab2-e2a7-4fe3-949a-6a31460e11ba" containerName="glance-log" containerID="cri-o://d7464171649f3f66dfa83f9f9dffbb9397e5a9c9592b2d221f844436a374209d" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.767235 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="33e4dab2-e2a7-4fe3-949a-6a31460e11ba" containerName="glance-httpd" containerID="cri-o://90486f36ff75a740556644fedc7ad1ada74d9386b6373ae4001a5f28f3f94e44" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.774502 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-855575d477-mmmm7"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.774995 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-855575d477-mmmm7" podUID="8ded45d9-c7e1-4429-982b-4f7c10e43473" containerName="neutron-api" containerID="cri-o://5ff576908ae6828159afb6725bcd8f5262dbf45a7c1d981f6bf781737d654133" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.775452 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-855575d477-mmmm7" podUID="8ded45d9-c7e1-4429-982b-4f7c10e43473" containerName="neutron-httpd" containerID="cri-o://0828c77a89d68fa7ed5a9d98a5ec301c313e92a36be991d033d8a0ce476a2939" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.803111 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.803898 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ac080b76-32d3-498c-9832-d31494c1d21f" containerName="glance-httpd" containerID="cri-o://1bc4fa29702116af6fcd59bcbbca74418678e340a4d1a0b7cd112308e8bc0b4e" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.803695 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ac080b76-32d3-498c-9832-d31494c1d21f" containerName="glance-log" containerID="cri-o://541e9e17ac501f698eab7f861a294ddf49e04de6ec4df244e43f29b858e30226" gracePeriod=30 Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.817897 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 21:05:01 crc kubenswrapper[4747]: E1205 21:05:01.881770 4747 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 05 21:05:01 crc kubenswrapper[4747]: E1205 21:05:01.881828 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data podName:70db507e-cc84-4722-8ac8-fd659c2803b8 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:03.881814462 +0000 UTC m=+1374.349121950 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data") pod "rabbitmq-cell1-server-0" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8") : configmap "rabbitmq-cell1-config-data" not found Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.928161 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07b59f33-a929-4857-9d4d-d15a58667ba2" path="/var/lib/kubelet/pods/07b59f33-a929-4857-9d4d-d15a58667ba2/volumes" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.930944 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24ec3037-277a-454c-b807-ceb5e626e724" path="/var/lib/kubelet/pods/24ec3037-277a-454c-b807-ceb5e626e724/volumes" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.931705 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f4e99ae-15b6-4c55-924b-84ed464c130c" path="/var/lib/kubelet/pods/5f4e99ae-15b6-4c55-924b-84ed464c130c/volumes" Dec 05 21:05:01 crc kubenswrapper[4747]: I1205 21:05:01.932323 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7264db97-0a43-4984-94a2-1805f3aec313" path="/var/lib/kubelet/pods/7264db97-0a43-4984-94a2-1805f3aec313/volumes" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:01.954635 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="749c2c93-a078-47ab-b6f9-673cef723e20" path="/var/lib/kubelet/pods/749c2c93-a078-47ab-b6f9-673cef723e20/volumes" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:01.955371 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e08375df-ebe2-42ed-a2ac-19c6365b9f87" path="/var/lib/kubelet/pods/e08375df-ebe2-42ed-a2ac-19c6365b9f87/volumes" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:01.955961 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3e26fb4-e5eb-4a54-abaa-b33101f25f61" path="/var/lib/kubelet/pods/f3e26fb4-e5eb-4a54-abaa-b33101f25f61/volumes" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:01.968923 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f98ab795-6f74-421b-bce4-52128f0d7431" path="/var/lib/kubelet/pods/f98ab795-6f74-421b-bce4-52128f0d7431/volumes" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:01.969682 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-77d8776c8f-zkk2b"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:01.969722 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:01.969737 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:01.969991 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="70527d7e-feb4-4821-b20d-74d9634ab124" containerName="nova-metadata-log" containerID="cri-o://5609f29b50c7c1c36727d3f9b37fe61125617bad5be7ef04dd273321702420ba" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:01.970207 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-77d8776c8f-zkk2b" podUID="f5bfe363-0d37-4380-93a8-dc7ea1ad3392" containerName="proxy-httpd" containerID="cri-o://19016e2b01b2316bb76c719117800a4ec68aa6923aec37cad94c18d149837111" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:01.970368 4747 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-api-0" podUID="60c19006-e7b7-4c36-847a-a52358ae6a99" containerName="nova-api-log" containerID="cri-o://830bd4d5341560a621378cd94f40e4c4b0401787664fe8a6b8244fa2e35b90f2" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:01.972443 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="70527d7e-feb4-4821-b20d-74d9634ab124" containerName="nova-metadata-metadata" containerID="cri-o://a0b910eeeb5d4311cefe6de469772978a62f3bc013456ed000b893110cb13617" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:01.972659 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="60c19006-e7b7-4c36-847a-a52358ae6a99" containerName="nova-api-api" containerID="cri-o://b4ea3b3b35eb186a6415e9ce4ffe9c01f8217ca9dfbb6942810043a66549038a" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:01.972655 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-77d8776c8f-zkk2b" podUID="f5bfe363-0d37-4380-93a8-dc7ea1ad3392" containerName="proxy-server" containerID="cri-o://0c9cfa29c3405f6a5527f6d5e246534de9a5f12a953fd035914d4286c56ce569" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.013716 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.029978 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-c8ddf7b47-hbwzt"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.030301 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-c8ddf7b47-hbwzt" podUID="d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" containerName="barbican-worker-log" containerID="cri-o://fc1b59248e10c987bf66294f5fd98a70e8177f8a073667376335d7e63c392cde" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.030459 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-c8ddf7b47-hbwzt" podUID="d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" containerName="barbican-worker" containerID="cri-o://0ee3d3f658fc75a010904f01ce363f8a9cb5878b73525714b1b5443773182ede" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.055701 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-76c7dd97d4-lfdpd"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.055966 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-76c7dd97d4-lfdpd" podUID="7d3c569a-33c9-46a0-8461-e69315fbd20b" containerName="barbican-api-log" containerID="cri-o://e81f5c9fbda1eb9ab9778805ee52bf5015c30d3532093b0e7bc22a3a44f23a55" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.056096 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-76c7dd97d4-lfdpd" podUID="7d3c569a-33c9-46a0-8461-e69315fbd20b" containerName="barbican-api" containerID="cri-o://da5fc8501fba7bc707db385bce6aba964f09ddb107d30488526ab95f96cb2e21" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.078649 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5c486969dd-92bj4"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.078955 4747 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" podUID="1d350f3b-2497-4941-a006-84a503604020" containerName="barbican-keystone-listener-log" containerID="cri-o://0bf651a4e254735c050f3d0a785a6647dbfeab955170d1824e08ea2a7638c8eb" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.079184 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" podUID="1d350f3b-2497-4941-a006-84a503604020" containerName="barbican-keystone-listener" containerID="cri-o://0aaabbe10c0f61cef65c493fabec256d8c68d50a96544ca2c165bc1b095b1cad" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.096117 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-sv2s2"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.109410 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-a003-account-create-update-8kxhx"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.126901 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-sv2s2"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.160557 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-a003-account-create-update-8kxhx"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.172004 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.172197 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="82c51dab-4948-4ca3-94ba-c25cb3a4e280" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f2f08789b47d9395ef4e7b2716ddbb8bc89f26728c733a0c9efe56c15ad7b799" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.184211 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.189386 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.204975 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2b545cc7-5bb8-499c-83da-93fe7cbbd6f9" containerName="nova-scheduler-scheduler" containerID="cri-o://1ca555e9c34cfc96cf0c94575486e1a9582744067ce3a91bdb99945c6f7243b5" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: E1205 21:05:02.293918 4747 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 05 21:05:02 crc kubenswrapper[4747]: E1205 21:05:02.293986 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data podName:d49bd09d-90af-4f00-a333-0e292c581525 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:04.29397126 +0000 UTC m=+1374.761278748 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data") pod "rabbitmq-server-0" (UID: "d49bd09d-90af-4f00-a333-0e292c581525") : configmap "rabbitmq-config-data" not found Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.350266 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d49bd09d-90af-4f00-a333-0e292c581525" containerName="rabbitmq" containerID="cri-o://a1384543807536a88f9af6a29ae8e6d52ea9ff61702daf36b45ab2099a6bce9c" gracePeriod=604800 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.408490 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="70db507e-cc84-4722-8ac8-fd659c2803b8" containerName="rabbitmq" containerID="cri-o://591c966cbd67e65bbd7f9df16799bddb9e1c1aeba658fa805c2349bf30d4dd79" gracePeriod=604800 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.443813 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="6d835978-9804-4427-9dd4-48c40ad829c5" containerName="ovsdbserver-nb" containerID="cri-o://57da84d46bdfb0f78fea3ada748a543f9123b6b1b9676bdaef82d4fc2586a305" gracePeriod=300 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.445144 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccdpj"] Dec 05 21:05:02 crc kubenswrapper[4747]: E1205 21:05:02.461921 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 264a34a52b2bcb52aa5db288abbbf43e65fd0f92091e50d839d9c4ffaab07034 is running failed: container process not found" containerID="264a34a52b2bcb52aa5db288abbbf43e65fd0f92091e50d839d9c4ffaab07034" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 05 21:05:02 crc kubenswrapper[4747]: E1205 21:05:02.462775 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 264a34a52b2bcb52aa5db288abbbf43e65fd0f92091e50d839d9c4ffaab07034 is running failed: container process not found" containerID="264a34a52b2bcb52aa5db288abbbf43e65fd0f92091e50d839d9c4ffaab07034" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 05 21:05:02 crc kubenswrapper[4747]: E1205 21:05:02.467147 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 264a34a52b2bcb52aa5db288abbbf43e65fd0f92091e50d839d9c4ffaab07034 is running failed: container process not found" containerID="264a34a52b2bcb52aa5db288abbbf43e65fd0f92091e50d839d9c4ffaab07034" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 05 21:05:02 crc kubenswrapper[4747]: E1205 21:05:02.467184 4747 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 264a34a52b2bcb52aa5db288abbbf43e65fd0f92091e50d839d9c4ffaab07034 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="7528af7d-8d58-4f20-ac03-b9b62e14c9e2" containerName="ovn-northd" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.535456 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccdpj"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.572620 4747 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.572828 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="ee758f70-0c00-471e-85bf-2d4a96646d15" containerName="nova-cell1-conductor-conductor" containerID="cri-o://bf9fa6b0cf24ff77f4a354a1cad76271ec89c26ec39baabf45982f119b448d4e" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.581680 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qlx4b"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.597569 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.597795 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="0ae4d823-7941-4e9e-ae9d-cce0297e278d" containerName="nova-cell0-conductor-conductor" containerID="cri-o://558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.615828 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qlx4b"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.635871 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_7528af7d-8d58-4f20-ac03-b9b62e14c9e2/ovn-northd/0.log" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.635918 4747 generic.go:334] "Generic (PLEG): container finished" podID="7528af7d-8d58-4f20-ac03-b9b62e14c9e2" containerID="a9dcbb54d496caf2f805d60df04f610d71d46e34c98b40703cbee95f6436ad6a" exitCode=2 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.635963 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7528af7d-8d58-4f20-ac03-b9b62e14c9e2","Type":"ContainerDied","Data":"a9dcbb54d496caf2f805d60df04f610d71d46e34c98b40703cbee95f6436ad6a"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.636011 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7528af7d-8d58-4f20-ac03-b9b62e14c9e2","Type":"ContainerDied","Data":"92f9a0595c63e82902393256176a9846afa062eec48ddb12c721354934118909"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.636025 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92f9a0595c63e82902393256176a9846afa062eec48ddb12c721354934118909" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.665862 4747 generic.go:334] "Generic (PLEG): container finished" podID="70527d7e-feb4-4821-b20d-74d9634ab124" containerID="5609f29b50c7c1c36727d3f9b37fe61125617bad5be7ef04dd273321702420ba" exitCode=143 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.666226 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70527d7e-feb4-4821-b20d-74d9634ab124","Type":"ContainerDied","Data":"5609f29b50c7c1c36727d3f9b37fe61125617bad5be7ef04dd273321702420ba"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.673788 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="53f0706f-507e-4360-b83e-9dbfacee3144" containerName="galera" containerID="cri-o://911dc2af5532c982dccb86d6364ac90a2fcc902c0d91f743b41d97c4cf784943" gracePeriod=30 Dec 05 21:05:02 crc kubenswrapper[4747]: 
I1205 21:05:02.698015 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="70db507e-cc84-4722-8ac8-fd659c2803b8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.698066 4747 generic.go:334] "Generic (PLEG): container finished" podID="d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" containerID="fc1b59248e10c987bf66294f5fd98a70e8177f8a073667376335d7e63c392cde" exitCode=143 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.698126 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c8ddf7b47-hbwzt" event={"ID":"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb","Type":"ContainerDied","Data":"fc1b59248e10c987bf66294f5fd98a70e8177f8a073667376335d7e63c392cde"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.713358 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6d835978-9804-4427-9dd4-48c40ad829c5/ovsdbserver-nb/0.log" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.713402 4747 generic.go:334] "Generic (PLEG): container finished" podID="6d835978-9804-4427-9dd4-48c40ad829c5" containerID="63ad63b88143db69d914307efba21c320e1a83fe77f4e07c2a03bf733e63c430" exitCode=2 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.713496 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6d835978-9804-4427-9dd4-48c40ad829c5","Type":"ContainerDied","Data":"63ad63b88143db69d914307efba21c320e1a83fe77f4e07c2a03bf733e63c430"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.715155 4747 generic.go:334] "Generic (PLEG): container finished" podID="7d3c569a-33c9-46a0-8461-e69315fbd20b" containerID="e81f5c9fbda1eb9ab9778805ee52bf5015c30d3532093b0e7bc22a3a44f23a55" exitCode=143 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.715193 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c7dd97d4-lfdpd" event={"ID":"7d3c569a-33c9-46a0-8461-e69315fbd20b","Type":"ContainerDied","Data":"e81f5c9fbda1eb9ab9778805ee52bf5015c30d3532093b0e7bc22a3a44f23a55"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.717109 4747 generic.go:334] "Generic (PLEG): container finished" podID="62399e6a-577a-4d10-b057-49c4bae7a172" containerID="454cb64b4de071bc5408ffd45978d6472ff63dcb85495e7423c6ead0953090f8" exitCode=143 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.717148 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"62399e6a-577a-4d10-b057-49c4bae7a172","Type":"ContainerDied","Data":"454cb64b4de071bc5408ffd45978d6472ff63dcb85495e7423c6ead0953090f8"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.722249 4747 generic.go:334] "Generic (PLEG): container finished" podID="1d350f3b-2497-4941-a006-84a503604020" containerID="0bf651a4e254735c050f3d0a785a6647dbfeab955170d1824e08ea2a7638c8eb" exitCode=143 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.722288 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" event={"ID":"1d350f3b-2497-4941-a006-84a503604020","Type":"ContainerDied","Data":"0bf651a4e254735c050f3d0a785a6647dbfeab955170d1824e08ea2a7638c8eb"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.740349 4747 generic.go:334] "Generic (PLEG): container finished" podID="8ded45d9-c7e1-4429-982b-4f7c10e43473" 
containerID="0828c77a89d68fa7ed5a9d98a5ec301c313e92a36be991d033d8a0ce476a2939" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.740417 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-855575d477-mmmm7" event={"ID":"8ded45d9-c7e1-4429-982b-4f7c10e43473","Type":"ContainerDied","Data":"0828c77a89d68fa7ed5a9d98a5ec301c313e92a36be991d033d8a0ce476a2939"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.746099 4747 generic.go:334] "Generic (PLEG): container finished" podID="466afd16-6e7d-42fd-bd82-cabab660b344" containerID="44bc0bd3c6f0a47cd14c827e342132010b773d5ce7c6430fcd671f4d1d9ea58a" exitCode=143 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.746144 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-646c94b678-5975b" event={"ID":"466afd16-6e7d-42fd-bd82-cabab660b344","Type":"ContainerDied","Data":"44bc0bd3c6f0a47cd14c827e342132010b773d5ce7c6430fcd671f4d1d9ea58a"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.747721 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-77d8776c8f-zkk2b" podUID="f5bfe363-0d37-4380-93a8-dc7ea1ad3392" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.166:8080/healthcheck\": dial tcp 10.217.0.166:8080: connect: connection refused" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.748050 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-77d8776c8f-zkk2b" podUID="f5bfe363-0d37-4380-93a8-dc7ea1ad3392" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.166:8080/healthcheck\": dial tcp 10.217.0.166:8080: connect: connection refused" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.749284 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder067d-account-delete-tcx7x" event={"ID":"60728349-9aa2-474c-b935-6fc915822d4e","Type":"ContainerStarted","Data":"3ed26b9f21173682f330768516bbaab79a529d29a49fdb0d3221bf16c57a0d4c"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.750789 4747 generic.go:334] "Generic (PLEG): container finished" podID="f5bfe363-0d37-4380-93a8-dc7ea1ad3392" containerID="19016e2b01b2316bb76c719117800a4ec68aa6923aec37cad94c18d149837111" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.750832 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77d8776c8f-zkk2b" event={"ID":"f5bfe363-0d37-4380-93a8-dc7ea1ad3392","Type":"ContainerDied","Data":"19016e2b01b2316bb76c719117800a4ec68aa6923aec37cad94c18d149837111"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767096 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerID="b38b92392e194449e12d772885bfbe0631d28151913d40ee252a46ebfcac23a9" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767137 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerID="0da635b7389cbe492ef8068ecb92ce39082021af205f7581dc37d0416b62d81d" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767146 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerID="2ed7bcbfbf04d5a9964de2fcbd02ac674b695f20ec08f744e393a02dbb2319a1" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767155 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f06375c-a008-4d1f-b2ae-516549bcd438" 
containerID="c58946c3b64e9f0075c91ec12003bcf8230437fdc679cb562ed1bcfde6656f17" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767163 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerID="8df9c3dc8ff772526cc2df3f4cf0b95ab3c61db6ed0714a109c7c21fc522f7ca" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767171 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerID="5e1fecd6cc1598b9e147bbd014d5d212dbafa891089d5772bdb6d3d1d8de6c53" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767179 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerID="9d583a1e65e3cc8bb6a32ad2fd1c4bf96b0bd47ed6506d82064a31bc2655c7b0" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767187 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerID="50395557a436afbba90e0822ef360ef212729acd9b1d1016e656ed537b583273" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767196 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerID="d7cf0106124fcf9a17a54c633a63891b2f79f3e809e2de080af793585eed32c2" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767202 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerID="a9b9bd53a229a7fb0fd49713a723ca5f20b761cdb3553b0052642a04bcd05a2b" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767209 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerID="e06576b6cbbf857bd731def8848ccf981cbbccf87721760f51553e2b820943bf" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767214 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerID="b1cda5f48990643299a63d3ae2da74d21f7ef5b4082ffa29af733c5ee782a4e4" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767220 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerID="a606768279b9932799614e2c0267efce661411007308c7fc421cef56fc7ab98d" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767227 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerID="dc4ee1b50dd1745ed998bec26be98d427e85f0f14c52da0a5a910ef4a7470be0" exitCode=0 Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767267 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"b38b92392e194449e12d772885bfbe0631d28151913d40ee252a46ebfcac23a9"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767291 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"0da635b7389cbe492ef8068ecb92ce39082021af205f7581dc37d0416b62d81d"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767301 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"2ed7bcbfbf04d5a9964de2fcbd02ac674b695f20ec08f744e393a02dbb2319a1"} Dec 05 
21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767310 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"c58946c3b64e9f0075c91ec12003bcf8230437fdc679cb562ed1bcfde6656f17"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767318 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"8df9c3dc8ff772526cc2df3f4cf0b95ab3c61db6ed0714a109c7c21fc522f7ca"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767327 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"5e1fecd6cc1598b9e147bbd014d5d212dbafa891089d5772bdb6d3d1d8de6c53"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767335 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"9d583a1e65e3cc8bb6a32ad2fd1c4bf96b0bd47ed6506d82064a31bc2655c7b0"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767344 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"50395557a436afbba90e0822ef360ef212729acd9b1d1016e656ed537b583273"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767352 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"d7cf0106124fcf9a17a54c633a63891b2f79f3e809e2de080af793585eed32c2"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767360 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"a9b9bd53a229a7fb0fd49713a723ca5f20b761cdb3553b0052642a04bcd05a2b"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767371 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"e06576b6cbbf857bd731def8848ccf981cbbccf87721760f51553e2b820943bf"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767379 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"b1cda5f48990643299a63d3ae2da74d21f7ef5b4082ffa29af733c5ee782a4e4"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767387 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"a606768279b9932799614e2c0267efce661411007308c7fc421cef56fc7ab98d"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.767394 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"dc4ee1b50dd1745ed998bec26be98d427e85f0f14c52da0a5a910ef4a7470be0"} Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.771749 4747 generic.go:334] "Generic (PLEG): container finished" podID="bf40d0cc-0d14-4c55-986d-2809df27c4fd" containerID="ebc61242a667918cfc3cf1987e66b55a1c850c340da8529513e354533f3f843e" exitCode=0 Dec 05 21:05:02 
crc kubenswrapper[4747]: I1205 21:05:02.771787 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" event={"ID":"bf40d0cc-0d14-4c55-986d-2809df27c4fd","Type":"ContainerDied","Data":"ebc61242a667918cfc3cf1987e66b55a1c850c340da8529513e354533f3f843e"}
Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.773121 4747 generic.go:334] "Generic (PLEG): container finished" podID="9a37ca14-4557-4c5c-b8b8-be6776516c83" containerID="50b55da7432690c6f59d396246de6b3e8a89b3871ba09c858ae1541632412c92" exitCode=137
Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.774603 4747 generic.go:334] "Generic (PLEG): container finished" podID="60c19006-e7b7-4c36-847a-a52358ae6a99" containerID="830bd4d5341560a621378cd94f40e4c4b0401787664fe8a6b8244fa2e35b90f2" exitCode=143
Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.774637 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60c19006-e7b7-4c36-847a-a52358ae6a99","Type":"ContainerDied","Data":"830bd4d5341560a621378cd94f40e4c4b0401787664fe8a6b8244fa2e35b90f2"}
Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.776179 4747 generic.go:334] "Generic (PLEG): container finished" podID="33e4dab2-e2a7-4fe3-949a-6a31460e11ba" containerID="d7464171649f3f66dfa83f9f9dffbb9397e5a9c9592b2d221f844436a374209d" exitCode=143
Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.776215 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"33e4dab2-e2a7-4fe3-949a-6a31460e11ba","Type":"ContainerDied","Data":"d7464171649f3f66dfa83f9f9dffbb9397e5a9c9592b2d221f844436a374209d"}
Dec 05 21:05:02 crc kubenswrapper[4747]: E1205 21:05:02.796564 4747 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Dec 05 21:05:02 crc kubenswrapper[4747]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Dec 05 21:05:02 crc kubenswrapper[4747]: + source /usr/local/bin/container-scripts/functions
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ OVNBridge=br-int
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ OVNRemote=tcp:localhost:6642
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ OVNEncapType=geneve
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ OVNAvailabilityZones=
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ EnableChassisAsGateway=true
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ PhysicalNetworks=
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ OVNHostName=
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ DB_FILE=/etc/openvswitch/conf.db
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ ovs_dir=/var/lib/openvswitch
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 05 21:05:02 crc kubenswrapper[4747]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 05 21:05:02 crc kubenswrapper[4747]: + sleep 0.5
Dec 05 21:05:02 crc kubenswrapper[4747]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 05 21:05:02 crc kubenswrapper[4747]: + sleep 0.5
Dec 05 21:05:02 crc kubenswrapper[4747]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 05 21:05:02 crc kubenswrapper[4747]: + cleanup_ovsdb_server_semaphore
Dec 05 21:05:02 crc kubenswrapper[4747]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 05 21:05:02 crc kubenswrapper[4747]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Dec 05 21:05:02 crc kubenswrapper[4747]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-4d2dp" message=<
Dec 05 21:05:02 crc kubenswrapper[4747]: Exiting ovsdb-server (5) [ OK ]
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Dec 05 21:05:02 crc kubenswrapper[4747]: + source /usr/local/bin/container-scripts/functions
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ OVNBridge=br-int
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ OVNRemote=tcp:localhost:6642
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ OVNEncapType=geneve
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ OVNAvailabilityZones=
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ EnableChassisAsGateway=true
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ PhysicalNetworks=
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ OVNHostName=
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ DB_FILE=/etc/openvswitch/conf.db
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ ovs_dir=/var/lib/openvswitch
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 05 21:05:02 crc kubenswrapper[4747]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 05 21:05:02 crc kubenswrapper[4747]: + sleep 0.5
Dec 05 21:05:02 crc kubenswrapper[4747]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 05 21:05:02 crc kubenswrapper[4747]: + sleep 0.5
Dec 05 21:05:02 crc kubenswrapper[4747]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 05 21:05:02 crc kubenswrapper[4747]: + cleanup_ovsdb_server_semaphore
Dec 05 21:05:02 crc kubenswrapper[4747]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 05 21:05:02 crc kubenswrapper[4747]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Dec 05 21:05:02 crc kubenswrapper[4747]: >
Dec 05 21:05:02 crc kubenswrapper[4747]: E1205 21:05:02.796618 4747 kuberuntime_container.go:691] "PreStop hook failed" err=<
Dec 05 21:05:02 crc kubenswrapper[4747]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Dec 05 21:05:02 crc kubenswrapper[4747]: + source /usr/local/bin/container-scripts/functions
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ OVNBridge=br-int
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ OVNRemote=tcp:localhost:6642
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ OVNEncapType=geneve
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ OVNAvailabilityZones=
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ EnableChassisAsGateway=true
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ PhysicalNetworks=
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ OVNHostName=
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ DB_FILE=/etc/openvswitch/conf.db
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ ovs_dir=/var/lib/openvswitch
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Dec 05 21:05:02 crc kubenswrapper[4747]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 05 21:05:02 crc kubenswrapper[4747]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 05 21:05:02 crc kubenswrapper[4747]: + sleep 0.5
Dec 05 21:05:02 crc kubenswrapper[4747]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 05 21:05:02 crc kubenswrapper[4747]: + sleep 0.5
Dec 05 21:05:02 crc kubenswrapper[4747]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 05 21:05:02 crc kubenswrapper[4747]: + cleanup_ovsdb_server_semaphore
Dec 05 21:05:02 crc kubenswrapper[4747]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 05 21:05:02 crc kubenswrapper[4747]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Dec 05 21:05:02 crc kubenswrapper[4747]: > pod="openstack/ovn-controller-ovs-4d2dp" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovsdb-server" containerID="cri-o://5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6"
Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.796651 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-4d2dp" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovsdb-server" containerID="cri-o://5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" gracePeriod=29
Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.800690 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-4d2dp" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovs-vswitchd" containerID="cri-o://ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" gracePeriod=29
Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.807259 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9cc30174-a5ca-454e-b049-59a62d358d8a/ovsdbserver-sb/0.log"
Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.807340 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9cc30174-a5ca-454e-b049-59a62d358d8a","Type":"ContainerDied","Data":"0b7f872b85ae4681314c080e585d2933ebe9cf9bf437dd3d7c82642455a65b7b"}
Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.807361 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b7f872b85ae4681314c080e585d2933ebe9cf9bf437dd3d7c82642455a65b7b"
Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.814705 4747 generic.go:334] "Generic (PLEG): container finished" podID="ac080b76-32d3-498c-9832-d31494c1d21f" containerID="541e9e17ac501f698eab7f861a294ddf49e04de6ec4df244e43f29b858e30226" exitCode=143
Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.814765 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac080b76-32d3-498c-9832-d31494c1d21f","Type":"ContainerDied","Data":"541e9e17ac501f698eab7f861a294ddf49e04de6ec4df244e43f29b858e30226"}
Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.816788 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5jh4k_8bc768ee-26e3-43fb-b64d-7fc3f3c627c5/openstack-network-exporter/0.log"
Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.816820 4747 generic.go:334] "Generic (PLEG): container finished" podID="8bc768ee-26e3-43fb-b64d-7fc3f3c627c5" containerID="5e62b0a320297c30e0688db6c5b330d66aadcb57092284da879891e956764e37" exitCode=2
Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.816839 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5jh4k" event={"ID":"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5","Type":"ContainerDied","Data":"5e62b0a320297c30e0688db6c5b330d66aadcb57092284da879891e956764e37"}
Dec 05 21:05:02 crc kubenswrapper[4747]: E1205 21:05:02.863992 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2"
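Journald renders the multi-line PreStop output above with one prefixed line per xtrace line, and the trace gives away the hook's whole control flow: poll for a semaphore file, remove it once present, then stop ovsdb-server via ovs-ctl. A sketch reconstructed from that trace, not the actual /usr/local/bin/container-scripts/stop-ovsdb-server.sh (the sourced functions file is not shown in this log, so its contents here are assumptions):

    #!/bin/bash
    set -x    # the ++/+ lines in the journal are this xtrace output
    source "$(dirname "$0")/functions"   # sets SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE and friends
    # Poll until a peer container (presumably ovn-controller, after saving its
    # flows to FLOWS_RESTORE_DIR) marks it safe to take ovsdb-server down.
    while [ ! -f "$SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE" ]; do
        sleep 0.5
    done
    cleanup_ovsdb_server_semaphore       # per the trace: rm -f the semaphore file
    /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd

Note the 137 again: ovs-ctl printed "Exiting ovsdb-server (5) [ OK ]", so the stop itself succeeded, but the hook process apparently got SIGKILLed before it could return, so the kubelet records the hook as failed and proceeds to kill both OVS containers with the remaining gracePeriod=29.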
err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f06375c_a008_4d1f_b2ae_516549bcd438.slice/crio-c58946c3b64e9f0075c91ec12003bcf8230437fdc679cb562ed1bcfde6656f17.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5bfe363_0d37_4380_93a8_dc7ea1ad3392.slice/crio-conmon-0c9cfa29c3405f6a5527f6d5e246534de9a5f12a953fd035914d4286c56ce569.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f06375c_a008_4d1f_b2ae_516549bcd438.slice/crio-8df9c3dc8ff772526cc2df3f4cf0b95ab3c61db6ed0714a109c7c21fc522f7ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d835978_9804_4427_9dd4_48c40ad829c5.slice/crio-57da84d46bdfb0f78fea3ada748a543f9123b6b1b9676bdaef82d4fc2586a305.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70527d7e_feb4_4821_b20d_74d9634ab124.slice/crio-conmon-5609f29b50c7c1c36727d3f9b37fe61125617bad5be7ef04dd273321702420ba.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5bfe363_0d37_4380_93a8_dc7ea1ad3392.slice/crio-19016e2b01b2316bb76c719117800a4ec68aa6923aec37cad94c18d149837111.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ded45d9_c7e1_4429_982b_4f7c10e43473.slice/crio-0828c77a89d68fa7ed5a9d98a5ec301c313e92a36be991d033d8a0ce476a2939.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f06375c_a008_4d1f_b2ae_516549bcd438.slice/crio-conmon-9d583a1e65e3cc8bb6a32ad2fd1c4bf96b0bd47ed6506d82064a31bc2655c7b0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ded45d9_c7e1_4429_982b_4f7c10e43473.slice/crio-conmon-0828c77a89d68fa7ed5a9d98a5ec301c313e92a36be991d033d8a0ce476a2939.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f06375c_a008_4d1f_b2ae_516549bcd438.slice/crio-dc4ee1b50dd1745ed998bec26be98d427e85f0f14c52da0a5a910ef4a7470be0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5bfe363_0d37_4380_93a8_dc7ea1ad3392.slice/crio-0c9cfa29c3405f6a5527f6d5e246534de9a5f12a953fd035914d4286c56ce569.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d350f3b_2497_4941_a006_84a503604020.slice/crio-conmon-0bf651a4e254735c050f3d0a785a6647dbfeab955170d1824e08ea2a7638c8eb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f06375c_a008_4d1f_b2ae_516549bcd438.slice/crio-5e1fecd6cc1598b9e147bbd014d5d212dbafa891089d5772bdb6d3d1d8de6c53.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb37900c0_058e_4be6_9b81_c67d5f05b7a5.slice/crio-5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f06375c_a008_4d1f_b2ae_516549bcd438.slice/crio-conmon-b38b92392e194449e12d772885bfbe0631d28151913d40ee252a46ebfcac23a9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f06375c_a008_4d1f_b2ae_516549bcd438.slice/crio-conmon-2ed7bcbfbf04d5a9964de2fcbd02ac674b695f20ec08f744e393a02dbb2319a1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb37900c0_058e_4be6_9b81_c67d5f05b7a5.slice/crio-conmon-5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f06375c_a008_4d1f_b2ae_516549bcd438.slice/crio-conmon-dc4ee1b50dd1745ed998bec26be98d427e85f0f14c52da0a5a910ef4a7470be0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33e4dab2_e2a7_4fe3_949a_6a31460e11ba.slice/crio-conmon-d7464171649f3f66dfa83f9f9dffbb9397e5a9c9592b2d221f844436a374209d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6b073c9_c8ff_4de9_b2ee_605d37c94ffb.slice/crio-conmon-fc1b59248e10c987bf66294f5fd98a70e8177f8a073667376335d7e63c392cde.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d835978_9804_4427_9dd4_48c40ad829c5.slice/crio-conmon-57da84d46bdfb0f78fea3ada748a543f9123b6b1b9676bdaef82d4fc2586a305.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82c51dab_4948_4ca3_94ba_c25cb3a4e280.slice/crio-conmon-f2f08789b47d9395ef4e7b2716ddbb8bc89f26728c733a0c9efe56c15ad7b799.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f06375c_a008_4d1f_b2ae_516549bcd438.slice/crio-conmon-5e1fecd6cc1598b9e147bbd014d5d212dbafa891089d5772bdb6d3d1d8de6c53.scope\": RecentStats: unable to find data in memory cache]" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.875937 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glancee01d-account-delete-6jnvw"] Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.910950 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9cc30174-a5ca-454e-b049-59a62d358d8a/ovsdbserver-sb/0.log" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.911013 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.932884 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_7528af7d-8d58-4f20-ac03-b9b62e14c9e2/ovn-northd/0.log" Dec 05 21:05:02 crc kubenswrapper[4747]: I1205 21:05:02.932970 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.012430 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014258 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-config\") pod \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014290 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-metrics-certs-tls-certs\") pod \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014308 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-ovn-rundir\") pod \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014346 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-ovsdbserver-sb-tls-certs\") pod \"9cc30174-a5ca-454e-b049-59a62d358d8a\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014374 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-scripts\") pod \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014408 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cc30174-a5ca-454e-b049-59a62d358d8a-config\") pod \"9cc30174-a5ca-454e-b049-59a62d358d8a\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014447 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9cc30174-a5ca-454e-b049-59a62d358d8a-ovsdb-rundir\") pod \"9cc30174-a5ca-454e-b049-59a62d358d8a\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014494 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"9cc30174-a5ca-454e-b049-59a62d358d8a\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014513 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-metrics-certs-tls-certs\") pod \"9cc30174-a5ca-454e-b049-59a62d358d8a\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014557 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-combined-ca-bundle\") pod \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\" (UID: 
\"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014574 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-combined-ca-bundle\") pod \"9cc30174-a5ca-454e-b049-59a62d358d8a\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014677 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ppsc\" (UniqueName: \"kubernetes.io/projected/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-kube-api-access-8ppsc\") pod \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014696 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cc30174-a5ca-454e-b049-59a62d358d8a-scripts\") pod \"9cc30174-a5ca-454e-b049-59a62d358d8a\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014721 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-ovn-northd-tls-certs\") pod \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\" (UID: \"7528af7d-8d58-4f20-ac03-b9b62e14c9e2\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014738 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhvtw\" (UniqueName: \"kubernetes.io/projected/9cc30174-a5ca-454e-b049-59a62d358d8a-kube-api-access-lhvtw\") pod \"9cc30174-a5ca-454e-b049-59a62d358d8a\" (UID: \"9cc30174-a5ca-454e-b049-59a62d358d8a\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.014788 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "7528af7d-8d58-4f20-ac03-b9b62e14c9e2" (UID: "7528af7d-8d58-4f20-ac03-b9b62e14c9e2"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.015344 4747 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.015806 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cc30174-a5ca-454e-b049-59a62d358d8a-config" (OuterVolumeSpecName: "config") pod "9cc30174-a5ca-454e-b049-59a62d358d8a" (UID: "9cc30174-a5ca-454e-b049-59a62d358d8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.015908 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-config" (OuterVolumeSpecName: "config") pod "7528af7d-8d58-4f20-ac03-b9b62e14c9e2" (UID: "7528af7d-8d58-4f20-ac03-b9b62e14c9e2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.016241 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-scripts" (OuterVolumeSpecName: "scripts") pod "7528af7d-8d58-4f20-ac03-b9b62e14c9e2" (UID: "7528af7d-8d58-4f20-ac03-b9b62e14c9e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.016888 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cc30174-a5ca-454e-b049-59a62d358d8a-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "9cc30174-a5ca-454e-b049-59a62d358d8a" (UID: "9cc30174-a5ca-454e-b049-59a62d358d8a"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.022125 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="d49bd09d-90af-4f00-a333-0e292c581525" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.023901 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cc30174-a5ca-454e-b049-59a62d358d8a-scripts" (OuterVolumeSpecName: "scripts") pod "9cc30174-a5ca-454e-b049-59a62d358d8a" (UID: "9cc30174-a5ca-454e-b049-59a62d358d8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.026738 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-kube-api-access-8ppsc" (OuterVolumeSpecName: "kube-api-access-8ppsc") pod "7528af7d-8d58-4f20-ac03-b9b62e14c9e2" (UID: "7528af7d-8d58-4f20-ac03-b9b62e14c9e2"). InnerVolumeSpecName "kube-api-access-8ppsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.031842 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc30174-a5ca-454e-b049-59a62d358d8a-kube-api-access-lhvtw" (OuterVolumeSpecName: "kube-api-access-lhvtw") pod "9cc30174-a5ca-454e-b049-59a62d358d8a" (UID: "9cc30174-a5ca-454e-b049-59a62d358d8a"). InnerVolumeSpecName "kube-api-access-lhvtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.032285 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5jh4k_8bc768ee-26e3-43fb-b64d-7fc3f3c627c5/openstack-network-exporter/0.log" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.032351 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.040536 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "9cc30174-a5ca-454e-b049-59a62d358d8a" (UID: "9cc30174-a5ca-454e-b049-59a62d358d8a"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.044819 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 21:05:03 crc kubenswrapper[4747]: E1205 21:05:03.100027 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 21:05:03 crc kubenswrapper[4747]: E1205 21:05:03.101590 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 21:05:03 crc kubenswrapper[4747]: E1205 21:05:03.103682 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 21:05:03 crc kubenswrapper[4747]: E1205 21:05:03.103720 4747 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="0ae4d823-7941-4e9e-ae9d-cce0297e278d" containerName="nova-cell0-conductor-conductor" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.107830 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7528af7d-8d58-4f20-ac03-b9b62e14c9e2" (UID: "7528af7d-8d58-4f20-ac03-b9b62e14c9e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.112948 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cc30174-a5ca-454e-b049-59a62d358d8a" (UID: "9cc30174-a5ca-454e-b049-59a62d358d8a"). InnerVolumeSpecName "combined-ca-bundle". 
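The three identical ExecSync failures above are retries of the nova-cell0-conductor readiness probe, which execs pgrep inside the container; the -r/--runstates flag of procps-ng pgrep restricts matches to processes in the listed states, so the probe effectively asks "is there a nova-conductor process in state D, R, S or T?". An illustrative rerun of the command from the cmd=[...] field:

    /usr/bin/pgrep -r DRST nova-conductor && echo ready   # exit 0 only while a conductor process runs

It errors here with "cannot register an exec PID: container is stopping" because CRI-O refuses to start new exec sessions in a container that is already being torn down, which is why this surfaces as a probe error rather than a plain not-ready result.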
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.116602 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-ovs-rundir\") pod \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.116723 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpsc6\" (UniqueName: \"kubernetes.io/projected/9a37ca14-4557-4c5c-b8b8-be6776516c83-kube-api-access-dpsc6\") pod \"9a37ca14-4557-4c5c-b8b8-be6776516c83\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.116762 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-dns-svc\") pod \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.116879 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9a37ca14-4557-4c5c-b8b8-be6776516c83-openstack-config\") pod \"9a37ca14-4557-4c5c-b8b8-be6776516c83\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.116908 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-combined-ca-bundle\") pod \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.116939 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-config\") pod \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.116983 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a37ca14-4557-4c5c-b8b8-be6776516c83-combined-ca-bundle\") pod \"9a37ca14-4557-4c5c-b8b8-be6776516c83\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.117102 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-metrics-certs-tls-certs\") pod \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.117140 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-ovsdbserver-nb\") pod \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.117166 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-dns-swift-storage-0\") pod 
\"bf40d0cc-0d14-4c55-986d-2809df27c4fd\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.117209 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-config\") pod \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.117719 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-ovn-rundir\") pod \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.117784 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfmrp\" (UniqueName: \"kubernetes.io/projected/bf40d0cc-0d14-4c55-986d-2809df27c4fd-kube-api-access-lfmrp\") pod \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.117821 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-ovsdbserver-sb\") pod \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\" (UID: \"bf40d0cc-0d14-4c55-986d-2809df27c4fd\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.117843 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnzrm\" (UniqueName: \"kubernetes.io/projected/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-kube-api-access-rnzrm\") pod \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\" (UID: \"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.117879 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9a37ca14-4557-4c5c-b8b8-be6776516c83-openstack-config-secret\") pod \"9a37ca14-4557-4c5c-b8b8-be6776516c83\" (UID: \"9a37ca14-4557-4c5c-b8b8-be6776516c83\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.118505 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.118528 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.118540 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cc30174-a5ca-454e-b049-59a62d358d8a-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.118552 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9cc30174-a5ca-454e-b049-59a62d358d8a-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.118608 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 05 21:05:03 crc kubenswrapper[4747]: 
I1205 21:05:03.118622 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.118635 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.118648 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ppsc\" (UniqueName: \"kubernetes.io/projected/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-kube-api-access-8ppsc\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.118660 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cc30174-a5ca-454e-b049-59a62d358d8a-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.118671 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhvtw\" (UniqueName: \"kubernetes.io/projected/9cc30174-a5ca-454e-b049-59a62d358d8a-kube-api-access-lhvtw\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.120857 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "8bc768ee-26e3-43fb-b64d-7fc3f3c627c5" (UID: "8bc768ee-26e3-43fb-b64d-7fc3f3c627c5"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.127935 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-config" (OuterVolumeSpecName: "config") pod "8bc768ee-26e3-43fb-b64d-7fc3f3c627c5" (UID: "8bc768ee-26e3-43fb-b64d-7fc3f3c627c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.128002 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "8bc768ee-26e3-43fb-b64d-7fc3f3c627c5" (UID: "8bc768ee-26e3-43fb-b64d-7fc3f3c627c5"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.169632 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-kube-api-access-rnzrm" (OuterVolumeSpecName: "kube-api-access-rnzrm") pod "8bc768ee-26e3-43fb-b64d-7fc3f3c627c5" (UID: "8bc768ee-26e3-43fb-b64d-7fc3f3c627c5"). InnerVolumeSpecName "kube-api-access-rnzrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.170224 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf40d0cc-0d14-4c55-986d-2809df27c4fd-kube-api-access-lfmrp" (OuterVolumeSpecName: "kube-api-access-lfmrp") pod "bf40d0cc-0d14-4c55-986d-2809df27c4fd" (UID: "bf40d0cc-0d14-4c55-986d-2809df27c4fd"). InnerVolumeSpecName "kube-api-access-lfmrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.182427 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a37ca14-4557-4c5c-b8b8-be6776516c83-kube-api-access-dpsc6" (OuterVolumeSpecName: "kube-api-access-dpsc6") pod "9a37ca14-4557-4c5c-b8b8-be6776516c83" (UID: "9a37ca14-4557-4c5c-b8b8-be6776516c83"). InnerVolumeSpecName "kube-api-access-dpsc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.229346 4747 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.229380 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfmrp\" (UniqueName: \"kubernetes.io/projected/bf40d0cc-0d14-4c55-986d-2809df27c4fd-kube-api-access-lfmrp\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.229390 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnzrm\" (UniqueName: \"kubernetes.io/projected/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-kube-api-access-rnzrm\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.229398 4747 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-ovs-rundir\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.229408 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpsc6\" (UniqueName: \"kubernetes.io/projected/9a37ca14-4557-4c5c-b8b8-be6776516c83-kube-api-access-dpsc6\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.229416 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.294935 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6d835978-9804-4427-9dd4-48c40ad829c5/ovsdbserver-nb/0.log" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.295012 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.348116 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.441667 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d835978-9804-4427-9dd4-48c40ad829c5-config\") pod \"6d835978-9804-4427-9dd4-48c40ad829c5\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.441822 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d835978-9804-4427-9dd4-48c40ad829c5-scripts\") pod \"6d835978-9804-4427-9dd4-48c40ad829c5\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.441896 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"6d835978-9804-4427-9dd4-48c40ad829c5\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.441935 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tnbg\" (UniqueName: \"kubernetes.io/projected/6d835978-9804-4427-9dd4-48c40ad829c5-kube-api-access-9tnbg\") pod \"6d835978-9804-4427-9dd4-48c40ad829c5\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.441964 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d835978-9804-4427-9dd4-48c40ad829c5-ovsdb-rundir\") pod \"6d835978-9804-4427-9dd4-48c40ad829c5\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.441987 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-metrics-certs-tls-certs\") pod \"6d835978-9804-4427-9dd4-48c40ad829c5\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.445022 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-combined-ca-bundle\") pod \"6d835978-9804-4427-9dd4-48c40ad829c5\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.445095 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-ovsdbserver-nb-tls-certs\") pod \"6d835978-9804-4427-9dd4-48c40ad829c5\" (UID: \"6d835978-9804-4427-9dd4-48c40ad829c5\") " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.445664 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d835978-9804-4427-9dd4-48c40ad829c5-config" (OuterVolumeSpecName: "config") pod "6d835978-9804-4427-9dd4-48c40ad829c5" (UID: "6d835978-9804-4427-9dd4-48c40ad829c5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.446060 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.446090 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d835978-9804-4427-9dd4-48c40ad829c5-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.446778 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d835978-9804-4427-9dd4-48c40ad829c5-scripts" (OuterVolumeSpecName: "scripts") pod "6d835978-9804-4427-9dd4-48c40ad829c5" (UID: "6d835978-9804-4427-9dd4-48c40ad829c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.447490 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d835978-9804-4427-9dd4-48c40ad829c5-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "6d835978-9804-4427-9dd4-48c40ad829c5" (UID: "6d835978-9804-4427-9dd4-48c40ad829c5"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.453838 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "6d835978-9804-4427-9dd4-48c40ad829c5" (UID: "6d835978-9804-4427-9dd4-48c40ad829c5"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.453952 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a37ca14-4557-4c5c-b8b8-be6776516c83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a37ca14-4557-4c5c-b8b8-be6776516c83" (UID: "9a37ca14-4557-4c5c-b8b8-be6776516c83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.462350 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d835978-9804-4427-9dd4-48c40ad829c5-kube-api-access-9tnbg" (OuterVolumeSpecName: "kube-api-access-9tnbg") pod "6d835978-9804-4427-9dd4-48c40ad829c5" (UID: "6d835978-9804-4427-9dd4-48c40ad829c5"). InnerVolumeSpecName "kube-api-access-9tnbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.485346 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-config" (OuterVolumeSpecName: "config") pod "bf40d0cc-0d14-4c55-986d-2809df27c4fd" (UID: "bf40d0cc-0d14-4c55-986d-2809df27c4fd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.524855 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron7680-account-delete-f82hm"] Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.528933 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicancd71-account-delete-gjncb"] Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.531378 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d835978-9804-4427-9dd4-48c40ad829c5" (UID: "6d835978-9804-4427-9dd4-48c40ad829c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.540290 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement8da4-account-delete-8nv8m"] Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.547948 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bc768ee-26e3-43fb-b64d-7fc3f3c627c5" (UID: "8bc768ee-26e3-43fb-b64d-7fc3f3c627c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.548029 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.548042 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a37ca14-4557-4c5c-b8b8-be6776516c83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.548050 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d835978-9804-4427-9dd4-48c40ad829c5-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.548067 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.548078 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tnbg\" (UniqueName: \"kubernetes.io/projected/6d835978-9804-4427-9dd4-48c40ad829c5-kube-api-access-9tnbg\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.548087 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d835978-9804-4427-9dd4-48c40ad829c5-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.548094 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: W1205 21:05:03.557130 4747 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d610aeb_7ead_4ea7_ae34_e8f9c2a14c6a.slice/crio-fc7c312259b26c8c83bf7ae119aa88af38f2954e57c3a4d553391b924f68bc6d WatchSource:0}: Error finding container fc7c312259b26c8c83bf7ae119aa88af38f2954e57c3a4d553391b924f68bc6d: Status 404 returned error can't find the container with id fc7c312259b26c8c83bf7ae119aa88af38f2954e57c3a4d553391b924f68bc6d Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.616747 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a37ca14-4557-4c5c-b8b8-be6776516c83-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9a37ca14-4557-4c5c-b8b8-be6776516c83" (UID: "9a37ca14-4557-4c5c-b8b8-be6776516c83"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.656333 4747 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9a37ca14-4557-4c5c-b8b8-be6776516c83-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.656374 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: E1205 21:05:03.656472 4747 projected.go:263] Couldn't get secret openstack/swift-proxy-config-data: secret "swift-proxy-config-data" not found Dec 05 21:05:03 crc kubenswrapper[4747]: E1205 21:05:03.656487 4747 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Dec 05 21:05:03 crc kubenswrapper[4747]: E1205 21:05:03.656498 4747 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 21:05:03 crc kubenswrapper[4747]: E1205 21:05:03.656511 4747 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-77d8776c8f-zkk2b: [secret "swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 05 21:05:03 crc kubenswrapper[4747]: E1205 21:05:03.656565 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-etc-swift podName:f5bfe363-0d37-4380-93a8-dc7ea1ad3392 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:07.656548153 +0000 UTC m=+1378.123855641 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-etc-swift") pod "swift-proxy-77d8776c8f-zkk2b" (UID: "f5bfe363-0d37-4380-93a8-dc7ea1ad3392") : [secret "swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.657996 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi48b7-account-delete-25wjk"] Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.659834 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9cc30174-a5ca-454e-b049-59a62d358d8a" (UID: "9cc30174-a5ca-454e-b049-59a62d358d8a"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.663396 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "9cc30174-a5ca-454e-b049-59a62d358d8a" (UID: "9cc30174-a5ca-454e-b049-59a62d358d8a"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.672793 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell033f5-account-delete-dk9n6"] Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.758787 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.759956 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.759972 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.759982 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cc30174-a5ca-454e-b049-59a62d358d8a-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.768042 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6d835978-9804-4427-9dd4-48c40ad829c5" (UID: "6d835978-9804-4427-9dd4-48c40ad829c5"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.768705 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bf40d0cc-0d14-4c55-986d-2809df27c4fd" (UID: "bf40d0cc-0d14-4c55-986d-2809df27c4fd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.780661 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7528af7d-8d58-4f20-ac03-b9b62e14c9e2" (UID: "7528af7d-8d58-4f20-ac03-b9b62e14c9e2"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.784185 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bf40d0cc-0d14-4c55-986d-2809df27c4fd" (UID: "bf40d0cc-0d14-4c55-986d-2809df27c4fd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.802056 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "7528af7d-8d58-4f20-ac03-b9b62e14c9e2" (UID: "7528af7d-8d58-4f20-ac03-b9b62e14c9e2"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.809947 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bf40d0cc-0d14-4c55-986d-2809df27c4fd" (UID: "bf40d0cc-0d14-4c55-986d-2809df27c4fd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.837101 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6d835978-9804-4427-9dd4-48c40ad829c5/ovsdbserver-nb/0.log" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.837140 4747 generic.go:334] "Generic (PLEG): container finished" podID="6d835978-9804-4427-9dd4-48c40ad829c5" containerID="57da84d46bdfb0f78fea3ada748a543f9123b6b1b9676bdaef82d4fc2586a305" exitCode=143 Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.837188 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6d835978-9804-4427-9dd4-48c40ad829c5","Type":"ContainerDied","Data":"57da84d46bdfb0f78fea3ada748a543f9123b6b1b9676bdaef82d4fc2586a305"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.837219 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6d835978-9804-4427-9dd4-48c40ad829c5","Type":"ContainerDied","Data":"bd28baee269e8d9a60f943abf8a78a62b371dd3c28051c376c73adbefc3a406d"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.837235 4747 scope.go:117] "RemoveContainer" containerID="63ad63b88143db69d914307efba21c320e1a83fe77f4e07c2a03bf733e63c430" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.837349 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.838615 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bf40d0cc-0d14-4c55-986d-2809df27c4fd" (UID: "bf40d0cc-0d14-4c55-986d-2809df27c4fd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.850483 4747 generic.go:334] "Generic (PLEG): container finished" podID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" exitCode=0 Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.851681 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "8bc768ee-26e3-43fb-b64d-7fc3f3c627c5" (UID: "8bc768ee-26e3-43fb-b64d-7fc3f3c627c5"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.852343 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a37ca14-4557-4c5c-b8b8-be6776516c83-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9a37ca14-4557-4c5c-b8b8-be6776516c83" (UID: "9a37ca14-4557-4c5c-b8b8-be6776516c83"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.856245 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a29110d-8922-4d83-97eb-7c12b0133b8d" path="/var/lib/kubelet/pods/0a29110d-8922-4d83-97eb-7c12b0133b8d/volumes" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.856740 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39885f1c-db92-4f28-a10c-36998e6fcda8" path="/var/lib/kubelet/pods/39885f1c-db92-4f28-a10c-36998e6fcda8/volumes" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.857199 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b42251a-f100-4233-91fa-be148b7c1665" path="/var/lib/kubelet/pods/8b42251a-f100-4233-91fa-be148b7c1665/volumes" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.858244 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a37ca14-4557-4c5c-b8b8-be6776516c83" path="/var/lib/kubelet/pods/9a37ca14-4557-4c5c-b8b8-be6776516c83/volumes" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.858760 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7" path="/var/lib/kubelet/pods/e3a789d1-bfc6-4d28-ac8a-ece9a5d700b7/volumes" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.862058 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.862144 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.862201 4747 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.862271 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.862329 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.862408 4747 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9a37ca14-4557-4c5c-b8b8-be6776516c83-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.862471 4747 reconciler_common.go:293] "Volume detached for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.862533 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf40d0cc-0d14-4c55-986d-2809df27c4fd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.862616 4747 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7528af7d-8d58-4f20-ac03-b9b62e14c9e2-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.866289 4747 generic.go:334] "Generic (PLEG): container finished" podID="075d1135-1337-43d3-83e1-97b942a03786" containerID="b7d08e884c92fb4e03551bbd9a3a91bac164d801a2b9b5ac7e804cc1a663aa21" exitCode=0 Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.868680 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5jh4k_8bc768ee-26e3-43fb-b64d-7fc3f3c627c5/openstack-network-exporter/0.log" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.868765 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-5jh4k" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.873528 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.879362 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "6d835978-9804-4427-9dd4-48c40ad829c5" (UID: "6d835978-9804-4427-9dd4-48c40ad829c5"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.882174 4747 generic.go:334] "Generic (PLEG): container finished" podID="82c51dab-4948-4ca3-94ba-c25cb3a4e280" containerID="f2f08789b47d9395ef4e7b2716ddbb8bc89f26728c733a0c9efe56c15ad7b799" exitCode=0 Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.886137 4747 generic.go:334] "Generic (PLEG): container finished" podID="f5bfe363-0d37-4380-93a8-dc7ea1ad3392" containerID="0c9cfa29c3405f6a5527f6d5e246534de9a5f12a953fd035914d4286c56ce569" exitCode=0 Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.897044 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.906696 4747 generic.go:334] "Generic (PLEG): container finished" podID="53f0706f-507e-4360-b83e-9dbfacee3144" containerID="911dc2af5532c982dccb86d6364ac90a2fcc902c0d91f743b41d97c4cf784943" exitCode=0 Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.909616 4747 generic.go:334] "Generic (PLEG): container finished" podID="722451d6-f405-4bdc-b418-bfdaa2000197" containerID="f26116136a982eef2f31e0c94046144a04ef07ee2a4e9f404d513de905d257da" exitCode=0 Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.913017 4747 generic.go:334] "Generic (PLEG): container finished" podID="60728349-9aa2-474c-b935-6fc915822d4e" containerID="6359e7cae45a7f65faa12fd0c5d2077e88173e760b5b42b8279224e5571f8ba5" exitCode=0 Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.913088 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.913228 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.964834 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d835978-9804-4427-9dd4-48c40ad829c5-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:03 crc kubenswrapper[4747]: E1205 21:05:03.966161 4747 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 05 21:05:03 crc kubenswrapper[4747]: E1205 21:05:03.966209 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data podName:70db507e-cc84-4722-8ac8-fd659c2803b8 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:07.966193477 +0000 UTC m=+1378.433500965 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data") pod "rabbitmq-cell1-server-0" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8") : configmap "rabbitmq-cell1-config-data" not found Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985242 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4d2dp" event={"ID":"b37900c0-058e-4be6-9b81-c67d5f05b7a5","Type":"ContainerDied","Data":"5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985299 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicancd71-account-delete-gjncb" event={"ID":"b2f34a29-aa43-46a7-9fa8-8eead0c7ba07","Type":"ContainerStarted","Data":"2932a232c6b2afc535d6d963d2f9fc066b0f730421b5d8318741cd8f7fdac370"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985313 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron7680-account-delete-f82hm" event={"ID":"6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a","Type":"ContainerStarted","Data":"fc7c312259b26c8c83bf7ae119aa88af38f2954e57c3a4d553391b924f68bc6d"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985323 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"075d1135-1337-43d3-83e1-97b942a03786","Type":"ContainerDied","Data":"b7d08e884c92fb4e03551bbd9a3a91bac164d801a2b9b5ac7e804cc1a663aa21"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985335 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement8da4-account-delete-8nv8m" event={"ID":"67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0","Type":"ContainerStarted","Data":"29e991cbd1540f9f03f9daca034528d86992204b17c2accb6807748e5e7a771b"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985347 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5jh4k" event={"ID":"8bc768ee-26e3-43fb-b64d-7fc3f3c627c5","Type":"ContainerDied","Data":"088d5f09d3868f9a5e726a677de7de1455e6e908ad3a39b66769c1544d4587df"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985365 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi48b7-account-delete-25wjk" event={"ID":"bcb08da7-0b5d-437e-8882-4d352f3c2d47","Type":"ContainerStarted","Data":"0ef88f0d042cdeeced868c538d16dd641308c3e7831857605543318252f517de"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985378 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"82c51dab-4948-4ca3-94ba-c25cb3a4e280","Type":"ContainerDied","Data":"f2f08789b47d9395ef4e7b2716ddbb8bc89f26728c733a0c9efe56c15ad7b799"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985392 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"82c51dab-4948-4ca3-94ba-c25cb3a4e280","Type":"ContainerDied","Data":"5aec99b6d9d29c377e6e7a9c1dc6f8b9354c00cfae8bf46f28d427c0337cae99"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985403 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aec99b6d9d29c377e6e7a9c1dc6f8b9354c00cfae8bf46f28d427c0337cae99" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985414 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77d8776c8f-zkk2b" 
event={"ID":"f5bfe363-0d37-4380-93a8-dc7ea1ad3392","Type":"ContainerDied","Data":"0c9cfa29c3405f6a5527f6d5e246534de9a5f12a953fd035914d4286c56ce569"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985427 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77d8776c8f-zkk2b" event={"ID":"f5bfe363-0d37-4380-93a8-dc7ea1ad3392","Type":"ContainerDied","Data":"aed25399bf16cd8b385ed6b41772900ad4514904a7bede7aa26a69da1fa56f94"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985435 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aed25399bf16cd8b385ed6b41772900ad4514904a7bede7aa26a69da1fa56f94" Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985442 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" event={"ID":"bf40d0cc-0d14-4c55-986d-2809df27c4fd","Type":"ContainerDied","Data":"4d28b6de58dd774c339108f10fcb306f62a9ef3ba472c445b80fa39ae8796875"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985453 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell033f5-account-delete-dk9n6" event={"ID":"51241ef0-cffe-40ba-bea9-1c01b7cc4184","Type":"ContainerStarted","Data":"a01582f4fdb4884d9667fdc3e8297aca14e43053509280408fb83e29abd93aed"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985465 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"53f0706f-507e-4360-b83e-9dbfacee3144","Type":"ContainerDied","Data":"911dc2af5532c982dccb86d6364ac90a2fcc902c0d91f743b41d97c4cf784943"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985477 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancee01d-account-delete-6jnvw" event={"ID":"722451d6-f405-4bdc-b418-bfdaa2000197","Type":"ContainerDied","Data":"f26116136a982eef2f31e0c94046144a04ef07ee2a4e9f404d513de905d257da"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985492 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancee01d-account-delete-6jnvw" event={"ID":"722451d6-f405-4bdc-b418-bfdaa2000197","Type":"ContainerStarted","Data":"186eb09751cd1680db4e433988b982d71f3ecb87d804d1c30e7d0bc5de14615b"} Dec 05 21:05:03 crc kubenswrapper[4747]: I1205 21:05:03.985500 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder067d-account-delete-tcx7x" event={"ID":"60728349-9aa2-474c-b935-6fc915822d4e","Type":"ContainerDied","Data":"6359e7cae45a7f65faa12fd0c5d2077e88173e760b5b42b8279224e5571f8ba5"} Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.007751 4747 scope.go:117] "RemoveContainer" containerID="57da84d46bdfb0f78fea3ada748a543f9123b6b1b9676bdaef82d4fc2586a305" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.007835 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.103866 4747 scope.go:117] "RemoveContainer" containerID="63ad63b88143db69d914307efba21c320e1a83fe77f4e07c2a03bf733e63c430" Dec 05 21:05:04 crc kubenswrapper[4747]: E1205 21:05:04.104233 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63ad63b88143db69d914307efba21c320e1a83fe77f4e07c2a03bf733e63c430\": container with ID starting with 63ad63b88143db69d914307efba21c320e1a83fe77f4e07c2a03bf733e63c430 not found: ID does not exist" containerID="63ad63b88143db69d914307efba21c320e1a83fe77f4e07c2a03bf733e63c430" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.104263 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63ad63b88143db69d914307efba21c320e1a83fe77f4e07c2a03bf733e63c430"} err="failed to get container status \"63ad63b88143db69d914307efba21c320e1a83fe77f4e07c2a03bf733e63c430\": rpc error: code = NotFound desc = could not find container \"63ad63b88143db69d914307efba21c320e1a83fe77f4e07c2a03bf733e63c430\": container with ID starting with 63ad63b88143db69d914307efba21c320e1a83fe77f4e07c2a03bf733e63c430 not found: ID does not exist" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.104284 4747 scope.go:117] "RemoveContainer" containerID="57da84d46bdfb0f78fea3ada748a543f9123b6b1b9676bdaef82d4fc2586a305" Dec 05 21:05:04 crc kubenswrapper[4747]: E1205 21:05:04.106993 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57da84d46bdfb0f78fea3ada748a543f9123b6b1b9676bdaef82d4fc2586a305\": container with ID starting with 57da84d46bdfb0f78fea3ada748a543f9123b6b1b9676bdaef82d4fc2586a305 not found: ID does not exist" containerID="57da84d46bdfb0f78fea3ada748a543f9123b6b1b9676bdaef82d4fc2586a305" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.107018 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57da84d46bdfb0f78fea3ada748a543f9123b6b1b9676bdaef82d4fc2586a305"} err="failed to get container status \"57da84d46bdfb0f78fea3ada748a543f9123b6b1b9676bdaef82d4fc2586a305\": rpc error: code = NotFound desc = could not find container \"57da84d46bdfb0f78fea3ada748a543f9123b6b1b9676bdaef82d4fc2586a305\": container with ID starting with 57da84d46bdfb0f78fea3ada748a543f9123b6b1b9676bdaef82d4fc2586a305 not found: ID does not exist" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.107033 4747 scope.go:117] "RemoveContainer" containerID="5e62b0a320297c30e0688db6c5b330d66aadcb57092284da879891e956764e37" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.117837 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-77d8776c8f-zkk2b" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.158998 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.173231 4747 scope.go:117] "RemoveContainer" containerID="50b55da7432690c6f59d396246de6b3e8a89b3871ba09c858ae1541632412c92" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.176435 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-5jh4k"] Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.183553 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-5jh4k"] Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.184042 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-nova-novncproxy-tls-certs\") pod \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.184248 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-vencrypt-tls-certs\") pod \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.184409 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qszzt\" (UniqueName: \"kubernetes.io/projected/82c51dab-4948-4ca3-94ba-c25cb3a4e280-kube-api-access-qszzt\") pod \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.184502 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-config-data\") pod \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.184617 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-combined-ca-bundle\") pod \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\" (UID: \"82c51dab-4948-4ca3-94ba-c25cb3a4e280\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.192457 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82c51dab-4948-4ca3-94ba-c25cb3a4e280-kube-api-access-qszzt" (OuterVolumeSpecName: "kube-api-access-qszzt") pod "82c51dab-4948-4ca3-94ba-c25cb3a4e280" (UID: "82c51dab-4948-4ca3-94ba-c25cb3a4e280"). InnerVolumeSpecName "kube-api-access-qszzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.233990 4747 scope.go:117] "RemoveContainer" containerID="ebc61242a667918cfc3cf1987e66b55a1c850c340da8529513e354533f3f843e" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.270882 4747 scope.go:117] "RemoveContainer" containerID="9b9b6204257884d5f197a391b88fab7960fa28bbc7447aca2b287e4eff66de65" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.287920 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/53f0706f-507e-4360-b83e-9dbfacee3144-config-data-generated\") pod \"53f0706f-507e-4360-b83e-9dbfacee3144\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.287973 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-etc-swift\") pod \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.288004 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-public-tls-certs\") pod \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.288034 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"53f0706f-507e-4360-b83e-9dbfacee3144\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.288056 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-run-httpd\") pod \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.288084 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f0706f-507e-4360-b83e-9dbfacee3144-galera-tls-certs\") pod \"53f0706f-507e-4360-b83e-9dbfacee3144\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.288120 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-log-httpd\") pod \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.288142 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-config-data-default\") pod \"53f0706f-507e-4360-b83e-9dbfacee3144\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.288159 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hr4z\" (UniqueName: \"kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-kube-api-access-9hr4z\") pod \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\" (UID: 
\"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.288232 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq5gw\" (UniqueName: \"kubernetes.io/projected/53f0706f-507e-4360-b83e-9dbfacee3144-kube-api-access-wq5gw\") pod \"53f0706f-507e-4360-b83e-9dbfacee3144\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.288251 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-combined-ca-bundle\") pod \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.288275 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f0706f-507e-4360-b83e-9dbfacee3144-combined-ca-bundle\") pod \"53f0706f-507e-4360-b83e-9dbfacee3144\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.288863 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-config-data\") pod \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.288950 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-internal-tls-certs\") pod \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\" (UID: \"f5bfe363-0d37-4380-93a8-dc7ea1ad3392\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.288972 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-operator-scripts\") pod \"53f0706f-507e-4360-b83e-9dbfacee3144\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.289007 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-kolla-config\") pod \"53f0706f-507e-4360-b83e-9dbfacee3144\" (UID: \"53f0706f-507e-4360-b83e-9dbfacee3144\") " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.289384 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qszzt\" (UniqueName: \"kubernetes.io/projected/82c51dab-4948-4ca3-94ba-c25cb3a4e280-kube-api-access-qszzt\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.289820 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "53f0706f-507e-4360-b83e-9dbfacee3144" (UID: "53f0706f-507e-4360-b83e-9dbfacee3144"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.290499 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53f0706f-507e-4360-b83e-9dbfacee3144-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "53f0706f-507e-4360-b83e-9dbfacee3144" (UID: "53f0706f-507e-4360-b83e-9dbfacee3144"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.291099 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f5bfe363-0d37-4380-93a8-dc7ea1ad3392" (UID: "f5bfe363-0d37-4380-93a8-dc7ea1ad3392"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.294740 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "53f0706f-507e-4360-b83e-9dbfacee3144" (UID: "53f0706f-507e-4360-b83e-9dbfacee3144"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.296136 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f5bfe363-0d37-4380-93a8-dc7ea1ad3392" (UID: "f5bfe363-0d37-4380-93a8-dc7ea1ad3392"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.301464 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-8wqh8"] Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.305928 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "53f0706f-507e-4360-b83e-9dbfacee3144" (UID: "53f0706f-507e-4360-b83e-9dbfacee3144"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.317283 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-8wqh8"] Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.327426 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53f0706f-507e-4360-b83e-9dbfacee3144-kube-api-access-wq5gw" (OuterVolumeSpecName: "kube-api-access-wq5gw") pod "53f0706f-507e-4360-b83e-9dbfacee3144" (UID: "53f0706f-507e-4360-b83e-9dbfacee3144"). InnerVolumeSpecName "kube-api-access-wq5gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.330940 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-kube-api-access-9hr4z" (OuterVolumeSpecName: "kube-api-access-9hr4z") pod "f5bfe363-0d37-4380-93a8-dc7ea1ad3392" (UID: "f5bfe363-0d37-4380-93a8-dc7ea1ad3392"). InnerVolumeSpecName "kube-api-access-9hr4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.349690 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.353393 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f5bfe363-0d37-4380-93a8-dc7ea1ad3392" (UID: "f5bfe363-0d37-4380-93a8-dc7ea1ad3392"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.366396 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.380460 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.386300 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.390919 4747 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.391195 4747 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.391205 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/53f0706f-507e-4360-b83e-9dbfacee3144-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.391213 4747 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.391221 4747 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.391229 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.391237 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hr4z\" (UniqueName: \"kubernetes.io/projected/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-kube-api-access-9hr4z\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.391245 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq5gw\" (UniqueName: \"kubernetes.io/projected/53f0706f-507e-4360-b83e-9dbfacee3144-kube-api-access-wq5gw\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.391254 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53f0706f-507e-4360-b83e-9dbfacee3144-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:04 crc 
kubenswrapper[4747]: E1205 21:05:04.391306 4747 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 05 21:05:04 crc kubenswrapper[4747]: E1205 21:05:04.391350 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data podName:d49bd09d-90af-4f00-a333-0e292c581525 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:08.391336987 +0000 UTC m=+1378.858644475 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data") pod "rabbitmq-server-0" (UID: "d49bd09d-90af-4f00-a333-0e292c581525") : configmap "rabbitmq-config-data" not found Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.402033 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "53f0706f-507e-4360-b83e-9dbfacee3144" (UID: "53f0706f-507e-4360-b83e-9dbfacee3144"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.494797 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.744216 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.744538 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="ceilometer-central-agent" containerID="cri-o://fc429219dd2c874c7566f484cc0af9643eb08ad7677e4bfb2ed48f5441db885b" gracePeriod=30 Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.745022 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="proxy-httpd" containerID="cri-o://4b45c5494c9cb7801c132de4820fe650677d60312b4084726a8d83cfd974bcba" gracePeriod=30 Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.745085 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="sg-core" containerID="cri-o://443c78cd3406277fd83faca12b46a2e8a254fa280e61add86bbe8dd8b77f93bc" gracePeriod=30 Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.745117 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="ceilometer-notification-agent" containerID="cri-o://39298752a8b51d1caae7019505b33e7f0b9b3bf5239de952b9fe17241b3c89ac" gracePeriod=30 Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.765428 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.765614 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="278246ea-04b9-4694-bb8c-5c67503c17e4" containerName="kube-state-metrics" containerID="cri-o://f2c7f412fda78eb2b77cfb8231215251635941d62951bcccf4806b1cdf12e6fe" gracePeriod=30 Dec 05 21:05:04 crc 
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.770874 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-config-data" (OuterVolumeSpecName: "config-data") pod "82c51dab-4948-4ca3-94ba-c25cb3a4e280" (UID: "82c51dab-4948-4ca3-94ba-c25cb3a4e280"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.780114 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82c51dab-4948-4ca3-94ba-c25cb3a4e280" (UID: "82c51dab-4948-4ca3-94ba-c25cb3a4e280"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.806013 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.830401 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.830433 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.830553 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.832268 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f0706f-507e-4360-b83e-9dbfacee3144-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53f0706f-507e-4360-b83e-9dbfacee3144" (UID: "53f0706f-507e-4360-b83e-9dbfacee3144"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.841716 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5bfe363-0d37-4380-93a8-dc7ea1ad3392" (UID: "f5bfe363-0d37-4380-93a8-dc7ea1ad3392"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.863888 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"]
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.923706 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f5bfe363-0d37-4380-93a8-dc7ea1ad3392" (UID: "f5bfe363-0d37-4380-93a8-dc7ea1ad3392"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.936417 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.936449 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f0706f-507e-4360-b83e-9dbfacee3144-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.936460 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.946179 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "82c51dab-4948-4ca3-94ba-c25cb3a4e280" (UID: "82c51dab-4948-4ca3-94ba-c25cb3a4e280"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.946307 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "82c51dab-4948-4ca3-94ba-c25cb3a4e280" (UID: "82c51dab-4948-4ca3-94ba-c25cb3a4e280"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.983924 4747 generic.go:334] "Generic (PLEG): container finished" podID="b2f34a29-aa43-46a7-9fa8-8eead0c7ba07" containerID="3ffa54ae9df9dadb9e8a91bcd057746909c3a6d881ece4200e599099ef8599db" exitCode=0
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.990794 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-config-data" (OuterVolumeSpecName: "config-data") pod "f5bfe363-0d37-4380-93a8-dc7ea1ad3392" (UID: "f5bfe363-0d37-4380-93a8-dc7ea1ad3392"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:05:04 crc kubenswrapper[4747]: I1205 21:05:04.991890 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f0706f-507e-4360-b83e-9dbfacee3144-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "53f0706f-507e-4360-b83e-9dbfacee3144" (UID: "53f0706f-507e-4360-b83e-9dbfacee3144"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:04.974836 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-b64kk"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:04.995717 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-b64kk"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:04.995744 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicancd71-account-delete-gjncb" event={"ID":"b2f34a29-aa43-46a7-9fa8-8eead0c7ba07","Type":"ContainerDied","Data":"3ffa54ae9df9dadb9e8a91bcd057746909c3a6d881ece4200e599099ef8599db"}
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.010996 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi48b7-account-delete-25wjk" event={"ID":"bcb08da7-0b5d-437e-8882-4d352f3c2d47","Type":"ContainerStarted","Data":"b0466ca59fc3ca16b0b944d473918bba58affb8a8c5a9027752be5d97b5355b9"}
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.011593 4747 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapi48b7-account-delete-25wjk" secret="" err="secret \"galera-openstack-dockercfg-kf8mz\" not found"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.023500 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-tj4z5"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.034569 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f5bfe363-0d37-4380-93a8-dc7ea1ad3392" (UID: "f5bfe363-0d37-4380-93a8-dc7ea1ad3392"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.038530 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.038556 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5bfe363-0d37-4380-93a8-dc7ea1ad3392-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.038566 4747 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f0706f-507e-4360-b83e-9dbfacee3144-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.038576 4747 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.038597 4747 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c51dab-4948-4ca3-94ba-c25cb3a4e280-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.055533 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-855575d477-mmmm7" podUID="8ded45d9-c7e1-4429-982b-4f7c10e43473" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.153:9696/\": dial tcp 10.217.0.153:9696: connect: connection refused"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.056155 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-tj4z5"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.066646 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.073016 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-79977d44cb-v5vtt"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.073235 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-79977d44cb-v5vtt" podUID="8be50198-f9c5-4e90-bfa0-d33b502278b7" containerName="keystone-api" containerID="cri-o://2331b1b6a8ba25cc12686ca0340fe77d732e59d87aa1e0ab064d2f3d4b62be8f" gracePeriod=30
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.089954 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapi48b7-account-delete-25wjk" podStartSLOduration=5.089932915 podStartE2EDuration="5.089932915s" podCreationTimestamp="2025-12-05 21:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:05:05.056758315 +0000 UTC m=+1375.524065793" watchObservedRunningTime="2025-12-05 21:05:05.089932915 +0000 UTC m=+1375.557240403"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.104966 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-j2b4b"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.106161 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell033f5-account-delete-dk9n6" event={"ID":"51241ef0-cffe-40ba-bea9-1c01b7cc4184","Type":"ContainerStarted","Data":"d2565391e687a35ba3735f19333fba1cc605e1fa9021ef94582bdc7581618a29"}
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.117683 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-j2b4b"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.122728 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron7680-account-delete-f82hm" event={"ID":"6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a","Type":"ContainerStarted","Data":"cafef2d2e736d3f1b68d75130f00554b04edc034c17ce240b5d3cb8bd345520d"}
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.133688 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7650-account-create-update-qkm7g"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.136710 4747 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell033f5-account-delete-dk9n6" secret="" err="secret \"galera-openstack-dockercfg-kf8mz\" not found"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.141208 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7650-account-create-update-qkm7g"]
Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.143917 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.143973 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts podName:bcb08da7-0b5d-437e-8882-4d352f3c2d47 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:05.64395763 +0000 UTC m=+1376.111265108 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts") pod "novaapi48b7-account-delete-25wjk" (UID: "bcb08da7-0b5d-437e-8882-4d352f3c2d47") : configmap "openstack-scripts" not found
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.145119 4747 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/neutron7680-account-delete-f82hm" secret="" err="secret \"galera-openstack-dockercfg-kf8mz\" not found"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.165135 4747 generic.go:334] "Generic (PLEG): container finished" podID="278246ea-04b9-4694-bb8c-5c67503c17e4" containerID="f2c7f412fda78eb2b77cfb8231215251635941d62951bcccf4806b1cdf12e6fe" exitCode=2
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.165223 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"278246ea-04b9-4694-bb8c-5c67503c17e4","Type":"ContainerDied","Data":"f2c7f412fda78eb2b77cfb8231215251635941d62951bcccf4806b1cdf12e6fe"}
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.181833 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-k58mr"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.244674 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement8da4-account-delete-8nv8m" event={"ID":"67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0","Type":"ContainerStarted","Data":"bc88be3138ad806a1fad25c0703151fdaa867a84929b19340a4803a430b8c8b6"}
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.245303 4747 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/placement8da4-account-delete-8nv8m" secret="" err="secret \"galera-openstack-dockercfg-kf8mz\" not found"
Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.247017 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.247060 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts podName:51241ef0-cffe-40ba-bea9-1c01b7cc4184 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:05.747048389 +0000 UTC m=+1376.214355867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts") pod "novacell033f5-account-delete-dk9n6" (UID: "51241ef0-cffe-40ba-bea9-1c01b7cc4184") : configmap "openstack-scripts" not found
Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.248332 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.248369 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts podName:6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a nodeName:}" failed. No retries permitted until 2025-12-05 21:05:05.748359551 +0000 UTC m=+1376.215667039 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts") pod "neutron7680-account-delete-f82hm" (UID: "6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a") : configmap "openstack-scripts" not found
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.281939 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-k58mr"]
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.311875 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"53f0706f-507e-4360-b83e-9dbfacee3144","Type":"ContainerDied","Data":"9352a6f64d89b762559f344694b4b0d367d9deef68a621a56dcc9ba2bb269208"} Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.311927 4747 scope.go:117] "RemoveContainer" containerID="911dc2af5532c982dccb86d6364ac90a2fcc902c0d91f743b41d97c4cf784943" Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.312201 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="03e0ca3b-083d-477b-a227-dc70e5204444" containerName="memcached" containerID="cri-o://2cb80132fac67dd10f92c4e06ec0d58fbd16b1a14243d5a1da9696abdd3d83b7" gracePeriod=30 Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.312435 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-77d8776c8f-zkk2b" Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.313387 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.328022 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder067d-account-delete-tcx7x"] Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.338629 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell033f5-account-delete-dk9n6" podStartSLOduration=5.3385315 podStartE2EDuration="5.3385315s" podCreationTimestamp="2025-12-05 21:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:05:05.14836998 +0000 UTC m=+1375.615677468" watchObservedRunningTime="2025-12-05 21:05:05.3385315 +0000 UTC m=+1375.805838988" Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.357079 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.357129 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts podName:67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:05.85711548 +0000 UTC m=+1376.324422968 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts") pod "placement8da4-account-delete-8nv8m" (UID: "67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0") : configmap "openstack-scripts" not found Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.417702 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-067d-account-create-update-jhm5h"] Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.430079 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-067d-account-create-update-jhm5h"] Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.435380 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron7680-account-delete-f82hm" podStartSLOduration=5.435363314 podStartE2EDuration="5.435363314s" podCreationTimestamp="2025-12-05 21:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:05:05.19934475 +0000 UTC m=+1375.666652238" watchObservedRunningTime="2025-12-05 21:05:05.435363314 +0000 UTC m=+1375.902670802" Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.476696 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6w6g9"] Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.526950 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="70527d7e-feb4-4821-b20d-74d9634ab124" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": read tcp 10.217.0.2:46202->10.217.0.198:8775: read: connection reset by peer" Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.527284 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="70527d7e-feb4-4821-b20d-74d9634ab124" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": read tcp 10.217.0.2:46206->10.217.0.198:8775: read: connection reset by peer" Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.527356 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="ac080b76-32d3-498c-9832-d31494c1d21f" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.175:9292/healthcheck\": read tcp 10.217.0.2:44382->10.217.0.175:9292: read: connection reset by peer" Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.527427 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="ac080b76-32d3-498c-9832-d31494c1d21f" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.175:9292/healthcheck\": read tcp 10.217.0.2:44398->10.217.0.175:9292: read: connection reset by peer" Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.529399 4747 scope.go:117] "RemoveContainer" containerID="a88a3625221a5fdb83dd406f3af85bff0a37692424ce8b21edd35f8c6077f232" Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.530826 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6w6g9"] Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.559470 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="33e4dab2-e2a7-4fe3-949a-6a31460e11ba" containerName="glance-httpd" probeResult="failure" output="Get 
\"https://10.217.0.176:9292/healthcheck\": read tcp 10.217.0.2:55142->10.217.0.176:9292: read: connection reset by peer" Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.559470 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="33e4dab2-e2a7-4fe3-949a-6a31460e11ba" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.176:9292/healthcheck\": read tcp 10.217.0.2:55132->10.217.0.176:9292: read: connection reset by peer" Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.561612 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8da4-account-create-update-llwr4"] Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.675777 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-76c7dd97d4-lfdpd" podUID="7d3c569a-33c9-46a0-8461-e69315fbd20b" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.159:9311/healthcheck\": read tcp 10.217.0.2:42350->10.217.0.159:9311: read: connection reset by peer" Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.675789 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-76c7dd97d4-lfdpd" podUID="7d3c569a-33c9-46a0-8461-e69315fbd20b" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.159:9311/healthcheck\": read tcp 10.217.0.2:42358->10.217.0.159:9311: read: connection reset by peer" Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.675972 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8da4-account-create-update-llwr4"] Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.687251 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.687318 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts podName:bcb08da7-0b5d-437e-8882-4d352f3c2d47 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:06.687296562 +0000 UTC m=+1377.154604050 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts") pod "novaapi48b7-account-delete-25wjk" (UID: "bcb08da7-0b5d-437e-8882-4d352f3c2d47") : configmap "openstack-scripts" not found Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.698536 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement8da4-account-delete-8nv8m" podStartSLOduration=5.698517659 podStartE2EDuration="5.698517659s" podCreationTimestamp="2025-12-05 21:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:05:05.294783869 +0000 UTC m=+1375.762091357" watchObservedRunningTime="2025-12-05 21:05:05.698517659 +0000 UTC m=+1376.165825147" Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.699452 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement8da4-account-delete-8nv8m"] Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.706147 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="684a964a-1f18-4648-b8c7-6c8c818e16bd" containerName="galera" containerID="cri-o://49c9b5afbbe1d53dba130d5c1b62256f8de83335c25de43c8361be8a01f7d421" gracePeriod=30 Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.753651 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-mcpnx"] Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.765553 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-mcpnx"] Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.785033 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.785303 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-e01d-account-create-update-jw9bz"] Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.785276 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.785351 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts podName:51241ef0-cffe-40ba-bea9-1c01b7cc4184 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:06.785335885 +0000 UTC m=+1377.252643373 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts") pod "novacell033f5-account-delete-dk9n6" (UID: "51241ef0-cffe-40ba-bea9-1c01b7cc4184") : configmap "openstack-scripts" not found Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.785391 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts podName:6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a nodeName:}" failed. No retries permitted until 2025-12-05 21:05:06.785373006 +0000 UTC m=+1377.252680484 (durationBeforeRetry 1s). 
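The durationBeforeRetry values in the surrounding mount failures step through 500ms, then 1s, and (for rabbitmq-config-data earlier) 4s, which is consistent with an exponential backoff that roughly doubles the delay on each consecutive failure of the same volume operation. A sketch of such a schedule, assuming a 500ms initial delay, a factor of 2, and an illustrative cap that is not shown in this log:

    from datetime import timedelta

    def backoff_schedule(failures: int,
                         initial: timedelta = timedelta(milliseconds=500),
                         factor: float = 2.0,
                         cap: timedelta = timedelta(minutes=2)) -> timedelta:
        """Delay before retry N (1-based): 500ms, 1s, 2s, 4s, ... capped."""
        delay = initial * (factor ** (failures - 1))
        return min(delay, cap)

    # Matches the progression visible above: 500ms, 1s, and 4s by the 4th failure.
    assert backoff_schedule(1) == timedelta(milliseconds=500)
    assert backoff_schedule(2) == timedelta(seconds=1)
    assert backoff_schedule(4) == timedelta(seconds=4)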
Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.785391 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts podName:6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a nodeName:}" failed. No retries permitted until 2025-12-05 21:05:06.785373006 +0000 UTC m=+1377.252680484 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts") pod "neutron7680-account-delete-f82hm" (UID: "6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a") : configmap "openstack-scripts" not found
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.796092 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-e01d-account-create-update-jw9bz"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.810906 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancee01d-account-delete-6jnvw"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.819780 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-sx5d9"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.827831 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-sx5d9"]
Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.889898 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 05 21:05:05 crc kubenswrapper[4747]: E1205 21:05:05.889957 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts podName:67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:06.889943911 +0000 UTC m=+1377.357251399 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts") pod "placement8da4-account-delete-8nv8m" (UID: "67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0") : configmap "openstack-scripts" not found
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.895206 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="281b36e4-63e9-473f-a0bf-0ad62a307577" path="/var/lib/kubelet/pods/281b36e4-63e9-473f-a0bf-0ad62a307577/volumes"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.901386 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2" path="/var/lib/kubelet/pods/46c230fe-1ac3-4ed2-8582-d94a5d0bf9c2/volumes"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.910744 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cb214ab-734f-4671-9959-6f424a9f6ade" path="/var/lib/kubelet/pods/4cb214ab-734f-4671-9959-6f424a9f6ade/volumes"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.911265 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dbdb950-0501-41c3-a040-f8275f8c8d29" path="/var/lib/kubelet/pods/4dbdb950-0501-41c3-a040-f8275f8c8d29/volumes"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.916103 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d835978-9804-4427-9dd4-48c40ad829c5" path="/var/lib/kubelet/pods/6d835978-9804-4427-9dd4-48c40ad829c5/volumes"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.918047 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7528af7d-8d58-4f20-ac03-b9b62e14c9e2" path="/var/lib/kubelet/pods/7528af7d-8d58-4f20-ac03-b9b62e14c9e2/volumes"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.918642 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f2a59e8-2466-4c41-93c5-d88075dfbddf" path="/var/lib/kubelet/pods/7f2a59e8-2466-4c41-93c5-d88075dfbddf/volumes"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.919132 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84eeb352-63cc-4fbf-b328-0911c3f67abf" path="/var/lib/kubelet/pods/84eeb352-63cc-4fbf-b328-0911c3f67abf/volumes"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.922818 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bc768ee-26e3-43fb-b64d-7fc3f3c627c5" path="/var/lib/kubelet/pods/8bc768ee-26e3-43fb-b64d-7fc3f3c627c5/volumes"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.923496 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9363606d-f1e1-4ba6-aca3-155bb6b57473" path="/var/lib/kubelet/pods/9363606d-f1e1-4ba6-aca3-155bb6b57473/volumes"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.927543 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="996767c6-2a35-4b93-a779-8803b52eff5c" path="/var/lib/kubelet/pods/996767c6-2a35-4b93-a779-8803b52eff5c/volumes"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.928096 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b68d9420-28ed-4cf0-a8c7-774600d5436e" path="/var/lib/kubelet/pods/b68d9420-28ed-4cf0-a8c7-774600d5436e/volumes"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.928557 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf40d0cc-0d14-4c55-986d-2809df27c4fd" path="/var/lib/kubelet/pods/bf40d0cc-0d14-4c55-986d-2809df27c4fd/volumes"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.937159 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0ac9306-6af1-4745-ba77-046356cdd7fd" path="/var/lib/kubelet/pods/c0ac9306-6af1-4745-ba77-046356cdd7fd/volumes"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.937865 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5f0bf6d-ce1c-48c5-8866-a65a30022ca3" path="/var/lib/kubelet/pods/f5f0bf6d-ce1c-48c5-8866-a65a30022ca3/volumes"
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.938405 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-77d8776c8f-zkk2b"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.938436 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-77d8776c8f-zkk2b"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.938451 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicancd71-account-delete-gjncb"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.938463 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-cd71-account-create-update-f4fch"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.938474 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-cd71-account-create-update-f4fch"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.942284 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.956536 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.959937 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="62399e6a-577a-4d10-b057-49c4bae7a172" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.163:8776/healthcheck\": read tcp 10.217.0.2:43936->10.217.0.163:8776: read: connection reset by peer"
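The kubelet_volumes.go:163 records above are the orphan-cleanup pass: directories under /var/lib/kubelet/pods/<uid>/volumes are removed once the pod UID is no longer known to the API server. A rough Python sketch of that reconciliation; known_pod_uids and the function name are illustrative, and the real kubelet performs additional safety checks first (for example, that no volume is still mounted):

    import shutil
    from pathlib import Path

    PODS_DIR = Path("/var/lib/kubelet/pods")

    def cleanup_orphaned_volumes(known_pod_uids: set) -> None:
        """Remove volume dirs for pod UIDs that no longer exist (sketch only)."""
        for pod_dir in PODS_DIR.iterdir():
            if pod_dir.name in known_pod_uids:
                continue                      # pod still exists; leave it alone
            volumes = pod_dir / "volumes"
            if volumes.is_dir():
                shutil.rmtree(volumes)        # orphaned: reclaim the directory
                print(f'Cleaned up orphaned pod volumes dir path={volumes}')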
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.965506 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-rhvjl"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.971025 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-rhvjl"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.983419 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7680-account-create-update-2vfwq"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.990289 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron7680-account-delete-f82hm"]
Dec 05 21:05:05 crc kubenswrapper[4747]: I1205 21:05:05.995529 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7680-account-create-update-2vfwq"]
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.002302 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.008756 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.012211 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-6gvhk"]
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.018204 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-6gvhk"]
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.030263 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-48b7-account-create-update-cf96q"]
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.036634 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi48b7-account-delete-25wjk"]
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.048878 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-48b7-account-create-update-cf96q"]
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.054980 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-5kkvt"]
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.064381 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-5kkvt"]
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.067336 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell033f5-account-delete-dk9n6"]
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.073787 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-33f5-account-create-update-dhmm4"]
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.080058 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-33f5-account-create-update-dhmm4"]
Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.114182 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.116637 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.117566 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.117625 4747 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-4d2dp" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovsdb-server"
Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.129490 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.140548 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.142227 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.182943 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.183004 4747 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-4d2dp" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovs-vswitchd"
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.204157 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder067d-account-delete-tcx7x"
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.223090 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.223137 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.223186 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw"
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.233861 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27a8861ae720657d5c1aabed46d192906d0631fb9e46de40cae1199d706d1642"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.233960 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://27a8861ae720657d5c1aabed46d192906d0631fb9e46de40cae1199d706d1642" gracePeriod=600
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.240442 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ns2k6" podUID="c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" containerName="ovn-controller" probeResult="failure" output=<
Dec 05 21:05:06 crc kubenswrapper[4747]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0
Dec 05 21:05:06 crc kubenswrapper[4747]: >
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.300696 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-combined-ca-bundle\") pod \"278246ea-04b9-4694-bb8c-5c67503c17e4\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") "
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.301508 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60728349-9aa2-474c-b935-6fc915822d4e-operator-scripts\") pod \"60728349-9aa2-474c-b935-6fc915822d4e\" (UID: \"60728349-9aa2-474c-b935-6fc915822d4e\") "
Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.301538 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsghz\" (UniqueName: \"kubernetes.io/projected/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-api-access-tsghz\") pod \"278246ea-04b9-4694-bb8c-5c67503c17e4\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") "
(UniqueName: \"kubernetes.io/projected/60728349-9aa2-474c-b935-6fc915822d4e-kube-api-access-4lr48\") pod \"60728349-9aa2-474c-b935-6fc915822d4e\" (UID: \"60728349-9aa2-474c-b935-6fc915822d4e\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.301604 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-state-metrics-tls-config\") pod \"278246ea-04b9-4694-bb8c-5c67503c17e4\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.301689 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-state-metrics-tls-certs\") pod \"278246ea-04b9-4694-bb8c-5c67503c17e4\" (UID: \"278246ea-04b9-4694-bb8c-5c67503c17e4\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.305161 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60728349-9aa2-474c-b935-6fc915822d4e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60728349-9aa2-474c-b935-6fc915822d4e" (UID: "60728349-9aa2-474c-b935-6fc915822d4e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.308093 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-api-access-tsghz" (OuterVolumeSpecName: "kube-api-access-tsghz") pod "278246ea-04b9-4694-bb8c-5c67503c17e4" (UID: "278246ea-04b9-4694-bb8c-5c67503c17e4"). InnerVolumeSpecName "kube-api-access-tsghz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.308452 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60728349-9aa2-474c-b935-6fc915822d4e-kube-api-access-4lr48" (OuterVolumeSpecName: "kube-api-access-4lr48") pod "60728349-9aa2-474c-b935-6fc915822d4e" (UID: "60728349-9aa2-474c-b935-6fc915822d4e"). InnerVolumeSpecName "kube-api-access-4lr48". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.326923 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"278246ea-04b9-4694-bb8c-5c67503c17e4","Type":"ContainerDied","Data":"e2e4cecdbccdb96c0495fdf5142054930924e66fad5706bed33bed13173f721b"} Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.326979 4747 scope.go:117] "RemoveContainer" containerID="f2c7f412fda78eb2b77cfb8231215251635941d62951bcccf4806b1cdf12e6fe" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.327070 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.332355 4747 generic.go:334] "Generic (PLEG): container finished" podID="03e0ca3b-083d-477b-a227-dc70e5204444" containerID="2cb80132fac67dd10f92c4e06ec0d58fbd16b1a14243d5a1da9696abdd3d83b7" exitCode=0 Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.332415 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"03e0ca3b-083d-477b-a227-dc70e5204444","Type":"ContainerDied","Data":"2cb80132fac67dd10f92c4e06ec0d58fbd16b1a14243d5a1da9696abdd3d83b7"} Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.334436 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "278246ea-04b9-4694-bb8c-5c67503c17e4" (UID: "278246ea-04b9-4694-bb8c-5c67503c17e4"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.334866 4747 generic.go:334] "Generic (PLEG): container finished" podID="7d3c569a-33c9-46a0-8461-e69315fbd20b" containerID="da5fc8501fba7bc707db385bce6aba964f09ddb107d30488526ab95f96cb2e21" exitCode=0 Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.334946 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c7dd97d4-lfdpd" event={"ID":"7d3c569a-33c9-46a0-8461-e69315fbd20b","Type":"ContainerDied","Data":"da5fc8501fba7bc707db385bce6aba964f09ddb107d30488526ab95f96cb2e21"} Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.339221 4747 generic.go:334] "Generic (PLEG): container finished" podID="2b545cc7-5bb8-499c-83da-93fe7cbbd6f9" containerID="1ca555e9c34cfc96cf0c94575486e1a9582744067ce3a91bdb99945c6f7243b5" exitCode=0 Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.339290 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9","Type":"ContainerDied","Data":"1ca555e9c34cfc96cf0c94575486e1a9582744067ce3a91bdb99945c6f7243b5"} Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.341397 4747 generic.go:334] "Generic (PLEG): container finished" podID="62399e6a-577a-4d10-b057-49c4bae7a172" containerID="7ad7169da0407cc9fa1fb2412f29f1fddcb02d5a765a42193125782a41cbb169" exitCode=0 Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.341451 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"62399e6a-577a-4d10-b057-49c4bae7a172","Type":"ContainerDied","Data":"7ad7169da0407cc9fa1fb2412f29f1fddcb02d5a765a42193125782a41cbb169"} Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.344836 4747 generic.go:334] "Generic (PLEG): container finished" podID="60c19006-e7b7-4c36-847a-a52358ae6a99" containerID="b4ea3b3b35eb186a6415e9ce4ffe9c01f8217ca9dfbb6942810043a66549038a" exitCode=0 Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.344926 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60c19006-e7b7-4c36-847a-a52358ae6a99","Type":"ContainerDied","Data":"b4ea3b3b35eb186a6415e9ce4ffe9c01f8217ca9dfbb6942810043a66549038a"} Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.345434 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "278246ea-04b9-4694-bb8c-5c67503c17e4" (UID: "278246ea-04b9-4694-bb8c-5c67503c17e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.348020 4747 generic.go:334] "Generic (PLEG): container finished" podID="70527d7e-feb4-4821-b20d-74d9634ab124" containerID="a0b910eeeb5d4311cefe6de469772978a62f3bc013456ed000b893110cb13617" exitCode=0 Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.348097 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70527d7e-feb4-4821-b20d-74d9634ab124","Type":"ContainerDied","Data":"a0b910eeeb5d4311cefe6de469772978a62f3bc013456ed000b893110cb13617"} Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.354707 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder067d-account-delete-tcx7x" event={"ID":"60728349-9aa2-474c-b935-6fc915822d4e","Type":"ContainerDied","Data":"3ed26b9f21173682f330768516bbaab79a529d29a49fdb0d3221bf16c57a0d4c"} Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.354741 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ed26b9f21173682f330768516bbaab79a529d29a49fdb0d3221bf16c57a0d4c" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.354795 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder067d-account-delete-tcx7x" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.359875 4747 generic.go:334] "Generic (PLEG): container finished" podID="ac080b76-32d3-498c-9832-d31494c1d21f" containerID="1bc4fa29702116af6fcd59bcbbca74418678e340a4d1a0b7cd112308e8bc0b4e" exitCode=0 Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.359938 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac080b76-32d3-498c-9832-d31494c1d21f","Type":"ContainerDied","Data":"1bc4fa29702116af6fcd59bcbbca74418678e340a4d1a0b7cd112308e8bc0b4e"} Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.362187 4747 generic.go:334] "Generic (PLEG): container finished" podID="33e4dab2-e2a7-4fe3-949a-6a31460e11ba" containerID="90486f36ff75a740556644fedc7ad1ada74d9386b6373ae4001a5f28f3f94e44" exitCode=0 Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.362238 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"33e4dab2-e2a7-4fe3-949a-6a31460e11ba","Type":"ContainerDied","Data":"90486f36ff75a740556644fedc7ad1ada74d9386b6373ae4001a5f28f3f94e44"} Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.369419 4747 generic.go:334] "Generic (PLEG): container finished" podID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerID="4b45c5494c9cb7801c132de4820fe650677d60312b4084726a8d83cfd974bcba" exitCode=0 Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.369452 4747 generic.go:334] "Generic (PLEG): container finished" podID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerID="443c78cd3406277fd83faca12b46a2e8a254fa280e61add86bbe8dd8b77f93bc" exitCode=2 Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.369462 4747 generic.go:334] "Generic (PLEG): container finished" podID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerID="fc429219dd2c874c7566f484cc0af9643eb08ad7677e4bfb2ed48f5441db885b" exitCode=0 Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.369507 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"9da10f9e-c9d3-4d5e-888a-774080f417f8","Type":"ContainerDied","Data":"4b45c5494c9cb7801c132de4820fe650677d60312b4084726a8d83cfd974bcba"} Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.369532 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9da10f9e-c9d3-4d5e-888a-774080f417f8","Type":"ContainerDied","Data":"443c78cd3406277fd83faca12b46a2e8a254fa280e61add86bbe8dd8b77f93bc"} Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.369546 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9da10f9e-c9d3-4d5e-888a-774080f417f8","Type":"ContainerDied","Data":"fc429219dd2c874c7566f484cc0af9643eb08ad7677e4bfb2ed48f5441db885b"} Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.373829 4747 generic.go:334] "Generic (PLEG): container finished" podID="466afd16-6e7d-42fd-bd82-cabab660b344" containerID="450bad02541ca9f2045559c849365b5204f10379fb98f7f20ee63d82eaedc639" exitCode=0 Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.373905 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-646c94b678-5975b" event={"ID":"466afd16-6e7d-42fd-bd82-cabab660b344","Type":"ContainerDied","Data":"450bad02541ca9f2045559c849365b5204f10379fb98f7f20ee63d82eaedc639"} Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.376273 4747 generic.go:334] "Generic (PLEG): container finished" podID="ee758f70-0c00-471e-85bf-2d4a96646d15" containerID="bf9fa6b0cf24ff77f4a354a1cad76271ec89c26ec39baabf45982f119b448d4e" exitCode=0 Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.376336 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ee758f70-0c00-471e-85bf-2d4a96646d15","Type":"ContainerDied","Data":"bf9fa6b0cf24ff77f4a354a1cad76271ec89c26ec39baabf45982f119b448d4e"} Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.377126 4747 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/neutron7680-account-delete-f82hm" secret="" err="secret \"galera-openstack-dockercfg-kf8mz\" not found" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.378417 4747 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/placement8da4-account-delete-8nv8m" secret="" err="secret \"galera-openstack-dockercfg-kf8mz\" not found" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.378880 4747 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapi48b7-account-delete-25wjk" secret="" err="secret \"galera-openstack-dockercfg-kf8mz\" not found" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.379544 4747 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell033f5-account-delete-dk9n6" secret="" err="secret \"galera-openstack-dockercfg-kf8mz\" not found" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.391649 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "278246ea-04b9-4694-bb8c-5c67503c17e4" (UID: "278246ea-04b9-4694-bb8c-5c67503c17e4"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.404166 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1ca555e9c34cfc96cf0c94575486e1a9582744067ce3a91bdb99945c6f7243b5 is running failed: container process not found" containerID="1ca555e9c34cfc96cf0c94575486e1a9582744067ce3a91bdb99945c6f7243b5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.404447 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.404470 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60728349-9aa2-474c-b935-6fc915822d4e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.404482 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsghz\" (UniqueName: \"kubernetes.io/projected/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-api-access-tsghz\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.404496 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lr48\" (UniqueName: \"kubernetes.io/projected/60728349-9aa2-474c-b935-6fc915822d4e-kube-api-access-4lr48\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.404508 4747 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.404523 4747 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/278246ea-04b9-4694-bb8c-5c67503c17e4-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.418752 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1ca555e9c34cfc96cf0c94575486e1a9582744067ce3a91bdb99945c6f7243b5 is running failed: container process not found" containerID="1ca555e9c34cfc96cf0c94575486e1a9582744067ce3a91bdb99945c6f7243b5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.448544 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1ca555e9c34cfc96cf0c94575486e1a9582744067ce3a91bdb99945c6f7243b5 is running failed: container process not found" containerID="1ca555e9c34cfc96cf0c94575486e1a9582744067ce3a91bdb99945c6f7243b5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.448623 4747 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1ca555e9c34cfc96cf0c94575486e1a9582744067ce3a91bdb99945c6f7243b5 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2b545cc7-5bb8-499c-83da-93fe7cbbd6f9" 
containerName="nova-scheduler-scheduler" Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.464002 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bf9fa6b0cf24ff77f4a354a1cad76271ec89c26ec39baabf45982f119b448d4e is running failed: container process not found" containerID="bf9fa6b0cf24ff77f4a354a1cad76271ec89c26ec39baabf45982f119b448d4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.466135 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bf9fa6b0cf24ff77f4a354a1cad76271ec89c26ec39baabf45982f119b448d4e is running failed: container process not found" containerID="bf9fa6b0cf24ff77f4a354a1cad76271ec89c26ec39baabf45982f119b448d4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.466833 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bf9fa6b0cf24ff77f4a354a1cad76271ec89c26ec39baabf45982f119b448d4e is running failed: container process not found" containerID="bf9fa6b0cf24ff77f4a354a1cad76271ec89c26ec39baabf45982f119b448d4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.467249 4747 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bf9fa6b0cf24ff77f4a354a1cad76271ec89c26ec39baabf45982f119b448d4e is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="ee758f70-0c00-471e-85bf-2d4a96646d15" containerName="nova-cell1-conductor-conductor" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.473087 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glancee01d-account-delete-6jnvw" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.480703 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.503708 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder067d-account-delete-tcx7x"] Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.511918 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder067d-account-delete-tcx7x"] Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.613524 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e0ca3b-083d-477b-a227-dc70e5204444-combined-ca-bundle\") pod \"03e0ca3b-083d-477b-a227-dc70e5204444\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.635069 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7x9q\" (UniqueName: \"kubernetes.io/projected/03e0ca3b-083d-477b-a227-dc70e5204444-kube-api-access-p7x9q\") pod \"03e0ca3b-083d-477b-a227-dc70e5204444\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.635120 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42hwk\" (UniqueName: \"kubernetes.io/projected/722451d6-f405-4bdc-b418-bfdaa2000197-kube-api-access-42hwk\") pod \"722451d6-f405-4bdc-b418-bfdaa2000197\" (UID: \"722451d6-f405-4bdc-b418-bfdaa2000197\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.635176 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/03e0ca3b-083d-477b-a227-dc70e5204444-memcached-tls-certs\") pod \"03e0ca3b-083d-477b-a227-dc70e5204444\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.635372 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03e0ca3b-083d-477b-a227-dc70e5204444-config-data\") pod \"03e0ca3b-083d-477b-a227-dc70e5204444\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.635459 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03e0ca3b-083d-477b-a227-dc70e5204444-kolla-config\") pod \"03e0ca3b-083d-477b-a227-dc70e5204444\" (UID: \"03e0ca3b-083d-477b-a227-dc70e5204444\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.635492 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/722451d6-f405-4bdc-b418-bfdaa2000197-operator-scripts\") pod \"722451d6-f405-4bdc-b418-bfdaa2000197\" (UID: \"722451d6-f405-4bdc-b418-bfdaa2000197\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.636991 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03e0ca3b-083d-477b-a227-dc70e5204444-config-data" (OuterVolumeSpecName: "config-data") pod "03e0ca3b-083d-477b-a227-dc70e5204444" (UID: "03e0ca3b-083d-477b-a227-dc70e5204444"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.637983 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03e0ca3b-083d-477b-a227-dc70e5204444-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "03e0ca3b-083d-477b-a227-dc70e5204444" (UID: "03e0ca3b-083d-477b-a227-dc70e5204444"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.638000 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/722451d6-f405-4bdc-b418-bfdaa2000197-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "722451d6-f405-4bdc-b418-bfdaa2000197" (UID: "722451d6-f405-4bdc-b418-bfdaa2000197"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.639379 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03e0ca3b-083d-477b-a227-dc70e5204444-kube-api-access-p7x9q" (OuterVolumeSpecName: "kube-api-access-p7x9q") pod "03e0ca3b-083d-477b-a227-dc70e5204444" (UID: "03e0ca3b-083d-477b-a227-dc70e5204444"). InnerVolumeSpecName "kube-api-access-p7x9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.639761 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03e0ca3b-083d-477b-a227-dc70e5204444-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.639783 4747 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03e0ca3b-083d-477b-a227-dc70e5204444-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.639795 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/722451d6-f405-4bdc-b418-bfdaa2000197-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.639805 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7x9q\" (UniqueName: \"kubernetes.io/projected/03e0ca3b-083d-477b-a227-dc70e5204444-kube-api-access-p7x9q\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.641322 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/722451d6-f405-4bdc-b418-bfdaa2000197-kube-api-access-42hwk" (OuterVolumeSpecName: "kube-api-access-42hwk") pod "722451d6-f405-4bdc-b418-bfdaa2000197" (UID: "722451d6-f405-4bdc-b418-bfdaa2000197"). InnerVolumeSpecName "kube-api-access-42hwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.672361 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e0ca3b-083d-477b-a227-dc70e5204444-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03e0ca3b-083d-477b-a227-dc70e5204444" (UID: "03e0ca3b-083d-477b-a227-dc70e5204444"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.723488 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e0ca3b-083d-477b-a227-dc70e5204444-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "03e0ca3b-083d-477b-a227-dc70e5204444" (UID: "03e0ca3b-083d-477b-a227-dc70e5204444"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.744847 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e0ca3b-083d-477b-a227-dc70e5204444-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.744876 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42hwk\" (UniqueName: \"kubernetes.io/projected/722451d6-f405-4bdc-b418-bfdaa2000197-kube-api-access-42hwk\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.744886 4747 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/03e0ca3b-083d-477b-a227-dc70e5204444-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.744946 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.744990 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts podName:bcb08da7-0b5d-437e-8882-4d352f3c2d47 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:08.744976308 +0000 UTC m=+1379.212283796 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts") pod "novaapi48b7-account-delete-25wjk" (UID: "bcb08da7-0b5d-437e-8882-4d352f3c2d47") : configmap "openstack-scripts" not found Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.821758 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.846862 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.847110 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts podName:51241ef0-cffe-40ba-bea9-1c01b7cc4184 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:08.847096972 +0000 UTC m=+1379.314404460 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts") pod "novacell033f5-account-delete-dk9n6" (UID: "51241ef0-cffe-40ba-bea9-1c01b7cc4184") : configmap "openstack-scripts" not found Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.847728 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.847882 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts podName:6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a nodeName:}" failed. No retries permitted until 2025-12-05 21:05:08.847865461 +0000 UTC m=+1379.315172949 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts") pod "neutron7680-account-delete-f82hm" (UID: "6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a") : configmap "openstack-scripts" not found Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.853503 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-646c94b678-5975b" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.881661 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.884615 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.903801 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.910089 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.911289 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.924391 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.933265 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.949076 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ac080b76-32d3-498c-9832-d31494c1d21f\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.949917 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-combined-ca-bundle\") pod \"ac080b76-32d3-498c-9832-d31494c1d21f\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.949962 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-public-tls-certs\") pod \"ac080b76-32d3-498c-9832-d31494c1d21f\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.949990 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-config-data\") pod \"466afd16-6e7d-42fd-bd82-cabab660b344\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.950025 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-public-tls-certs\") pod \"466afd16-6e7d-42fd-bd82-cabab660b344\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.950039 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-scripts\") pod \"466afd16-6e7d-42fd-bd82-cabab660b344\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.950086 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac080b76-32d3-498c-9832-d31494c1d21f-logs\") pod \"ac080b76-32d3-498c-9832-d31494c1d21f\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.950103 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac080b76-32d3-498c-9832-d31494c1d21f-httpd-run\") pod \"ac080b76-32d3-498c-9832-d31494c1d21f\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.950204 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjnzm\" (UniqueName: \"kubernetes.io/projected/466afd16-6e7d-42fd-bd82-cabab660b344-kube-api-access-vjnzm\") pod \"466afd16-6e7d-42fd-bd82-cabab660b344\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.952512 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac080b76-32d3-498c-9832-d31494c1d21f-logs" (OuterVolumeSpecName: "logs") pod "ac080b76-32d3-498c-9832-d31494c1d21f" (UID: "ac080b76-32d3-498c-9832-d31494c1d21f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.953067 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac080b76-32d3-498c-9832-d31494c1d21f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ac080b76-32d3-498c-9832-d31494c1d21f" (UID: "ac080b76-32d3-498c-9832-d31494c1d21f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.953679 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6c8w\" (UniqueName: \"kubernetes.io/projected/ac080b76-32d3-498c-9832-d31494c1d21f-kube-api-access-f6c8w\") pod \"ac080b76-32d3-498c-9832-d31494c1d21f\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.953726 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-scripts\") pod \"ac080b76-32d3-498c-9832-d31494c1d21f\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.953749 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-config-data\") pod \"ac080b76-32d3-498c-9832-d31494c1d21f\" (UID: \"ac080b76-32d3-498c-9832-d31494c1d21f\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.953782 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/466afd16-6e7d-42fd-bd82-cabab660b344-logs\") pod \"466afd16-6e7d-42fd-bd82-cabab660b344\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.953835 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-combined-ca-bundle\") pod \"466afd16-6e7d-42fd-bd82-cabab660b344\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.953891 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-internal-tls-certs\") pod \"466afd16-6e7d-42fd-bd82-cabab660b344\" (UID: \"466afd16-6e7d-42fd-bd82-cabab660b344\") " Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.955055 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac080b76-32d3-498c-9832-d31494c1d21f-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.955075 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac080b76-32d3-498c-9832-d31494c1d21f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.956631 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:06 crc kubenswrapper[4747]: E1205 21:05:06.956704 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts podName:67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0 nodeName:}" failed. 
No retries permitted until 2025-12-05 21:05:08.956683601 +0000 UTC m=+1379.423991149 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts") pod "placement8da4-account-delete-8nv8m" (UID: "67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0") : configmap "openstack-scripts" not found Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.957103 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "ac080b76-32d3-498c-9832-d31494c1d21f" (UID: "ac080b76-32d3-498c-9832-d31494c1d21f"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.957459 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/466afd16-6e7d-42fd-bd82-cabab660b344-kube-api-access-vjnzm" (OuterVolumeSpecName: "kube-api-access-vjnzm") pod "466afd16-6e7d-42fd-bd82-cabab660b344" (UID: "466afd16-6e7d-42fd-bd82-cabab660b344"). InnerVolumeSpecName "kube-api-access-vjnzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.957834 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/466afd16-6e7d-42fd-bd82-cabab660b344-logs" (OuterVolumeSpecName: "logs") pod "466afd16-6e7d-42fd-bd82-cabab660b344" (UID: "466afd16-6e7d-42fd-bd82-cabab660b344"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.981076 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac080b76-32d3-498c-9832-d31494c1d21f-kube-api-access-f6c8w" (OuterVolumeSpecName: "kube-api-access-f6c8w") pod "ac080b76-32d3-498c-9832-d31494c1d21f" (UID: "ac080b76-32d3-498c-9832-d31494c1d21f"). InnerVolumeSpecName "kube-api-access-f6c8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.981223 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-scripts" (OuterVolumeSpecName: "scripts") pod "ac080b76-32d3-498c-9832-d31494c1d21f" (UID: "ac080b76-32d3-498c-9832-d31494c1d21f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:06 crc kubenswrapper[4747]: I1205 21:05:06.983375 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-scripts" (OuterVolumeSpecName: "scripts") pod "466afd16-6e7d-42fd-bd82-cabab660b344" (UID: "466afd16-6e7d-42fd-bd82-cabab660b344"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.042028 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac080b76-32d3-498c-9832-d31494c1d21f" (UID: "ac080b76-32d3-498c-9832-d31494c1d21f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.043469 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-config-data" (OuterVolumeSpecName: "config-data") pod "ac080b76-32d3-498c-9832-d31494c1d21f" (UID: "ac080b76-32d3-498c-9832-d31494c1d21f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056065 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-combined-ca-bundle\") pod \"60c19006-e7b7-4c36-847a-a52358ae6a99\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056124 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-config-data-custom\") pod \"62399e6a-577a-4d10-b057-49c4bae7a172\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056144 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62399e6a-577a-4d10-b057-49c4bae7a172-etc-machine-id\") pod \"62399e6a-577a-4d10-b057-49c4bae7a172\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056165 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-config-data\") pod \"70527d7e-feb4-4821-b20d-74d9634ab124\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056205 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgff4\" (UniqueName: \"kubernetes.io/projected/ee758f70-0c00-471e-85bf-2d4a96646d15-kube-api-access-dgff4\") pod \"ee758f70-0c00-471e-85bf-2d4a96646d15\" (UID: \"ee758f70-0c00-471e-85bf-2d4a96646d15\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056232 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnsdv\" (UniqueName: \"kubernetes.io/projected/60c19006-e7b7-4c36-847a-a52358ae6a99-kube-api-access-bnsdv\") pod \"60c19006-e7b7-4c36-847a-a52358ae6a99\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056265 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-combined-ca-bundle\") pod \"70527d7e-feb4-4821-b20d-74d9634ab124\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056280 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-internal-tls-certs\") pod \"7d3c569a-33c9-46a0-8461-e69315fbd20b\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056306 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-nova-metadata-tls-certs\") pod \"70527d7e-feb4-4821-b20d-74d9634ab124\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056327 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-internal-tls-certs\") pod \"60c19006-e7b7-4c36-847a-a52358ae6a99\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056342 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v67vp\" (UniqueName: \"kubernetes.io/projected/62399e6a-577a-4d10-b057-49c4bae7a172-kube-api-access-v67vp\") pod \"62399e6a-577a-4d10-b057-49c4bae7a172\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056357 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8g9p\" (UniqueName: \"kubernetes.io/projected/7d3c569a-33c9-46a0-8461-e69315fbd20b-kube-api-access-k8g9p\") pod \"7d3c569a-33c9-46a0-8461-e69315fbd20b\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056378 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-public-tls-certs\") pod \"7d3c569a-33c9-46a0-8461-e69315fbd20b\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056404 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-scripts\") pod \"62399e6a-577a-4d10-b057-49c4bae7a172\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056422 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62399e6a-577a-4d10-b057-49c4bae7a172-logs\") pod \"62399e6a-577a-4d10-b057-49c4bae7a172\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056442 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-public-tls-certs\") pod \"62399e6a-577a-4d10-b057-49c4bae7a172\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056458 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-internal-tls-certs\") pod \"62399e6a-577a-4d10-b057-49c4bae7a172\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056500 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-combined-ca-bundle\") pod \"62399e6a-577a-4d10-b057-49c4bae7a172\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056516 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-config-data-custom\") pod \"7d3c569a-33c9-46a0-8461-e69315fbd20b\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056536 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-config-data\") pod \"7d3c569a-33c9-46a0-8461-e69315fbd20b\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056556 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq882\" (UniqueName: \"kubernetes.io/projected/70527d7e-feb4-4821-b20d-74d9634ab124-kube-api-access-nq882\") pod \"70527d7e-feb4-4821-b20d-74d9634ab124\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056570 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60c19006-e7b7-4c36-847a-a52358ae6a99-logs\") pod \"60c19006-e7b7-4c36-847a-a52358ae6a99\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056596 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-combined-ca-bundle\") pod \"7d3c569a-33c9-46a0-8461-e69315fbd20b\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056621 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70527d7e-feb4-4821-b20d-74d9634ab124-logs\") pod \"70527d7e-feb4-4821-b20d-74d9634ab124\" (UID: \"70527d7e-feb4-4821-b20d-74d9634ab124\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056655 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee758f70-0c00-471e-85bf-2d4a96646d15-combined-ca-bundle\") pod \"ee758f70-0c00-471e-85bf-2d4a96646d15\" (UID: \"ee758f70-0c00-471e-85bf-2d4a96646d15\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056675 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-config-data\") pod \"62399e6a-577a-4d10-b057-49c4bae7a172\" (UID: \"62399e6a-577a-4d10-b057-49c4bae7a172\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056698 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-config-data\") pod \"60c19006-e7b7-4c36-847a-a52358ae6a99\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056714 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee758f70-0c00-471e-85bf-2d4a96646d15-config-data\") pod \"ee758f70-0c00-471e-85bf-2d4a96646d15\" (UID: \"ee758f70-0c00-471e-85bf-2d4a96646d15\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056730 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7d3c569a-33c9-46a0-8461-e69315fbd20b-logs\") pod \"7d3c569a-33c9-46a0-8461-e69315fbd20b\" (UID: \"7d3c569a-33c9-46a0-8461-e69315fbd20b\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.056756 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-public-tls-certs\") pod \"60c19006-e7b7-4c36-847a-a52358ae6a99\" (UID: \"60c19006-e7b7-4c36-847a-a52358ae6a99\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.057084 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.057095 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjnzm\" (UniqueName: \"kubernetes.io/projected/466afd16-6e7d-42fd-bd82-cabab660b344-kube-api-access-vjnzm\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.057105 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6c8w\" (UniqueName: \"kubernetes.io/projected/ac080b76-32d3-498c-9832-d31494c1d21f-kube-api-access-f6c8w\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.057115 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.057123 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.057132 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/466afd16-6e7d-42fd-bd82-cabab660b344-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.057149 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.057157 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.060171 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d3c569a-33c9-46a0-8461-e69315fbd20b-logs" (OuterVolumeSpecName: "logs") pod "7d3c569a-33c9-46a0-8461-e69315fbd20b" (UID: "7d3c569a-33c9-46a0-8461-e69315fbd20b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.060748 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70527d7e-feb4-4821-b20d-74d9634ab124-logs" (OuterVolumeSpecName: "logs") pod "70527d7e-feb4-4821-b20d-74d9634ab124" (UID: "70527d7e-feb4-4821-b20d-74d9634ab124"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.061196 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60c19006-e7b7-4c36-847a-a52358ae6a99-logs" (OuterVolumeSpecName: "logs") pod "60c19006-e7b7-4c36-847a-a52358ae6a99" (UID: "60c19006-e7b7-4c36-847a-a52358ae6a99"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.061754 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62399e6a-577a-4d10-b057-49c4bae7a172-logs" (OuterVolumeSpecName: "logs") pod "62399e6a-577a-4d10-b057-49c4bae7a172" (UID: "62399e6a-577a-4d10-b057-49c4bae7a172"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.072152 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-config-data" (OuterVolumeSpecName: "config-data") pod "466afd16-6e7d-42fd-bd82-cabab660b344" (UID: "466afd16-6e7d-42fd-bd82-cabab660b344"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.073907 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70527d7e-feb4-4821-b20d-74d9634ab124-kube-api-access-nq882" (OuterVolumeSpecName: "kube-api-access-nq882") pod "70527d7e-feb4-4821-b20d-74d9634ab124" (UID: "70527d7e-feb4-4821-b20d-74d9634ab124"). InnerVolumeSpecName "kube-api-access-nq882". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.076410 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.076473 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/62399e6a-577a-4d10-b057-49c4bae7a172-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "62399e6a-577a-4d10-b057-49c4bae7a172" (UID: "62399e6a-577a-4d10-b057-49c4bae7a172"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.088055 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee758f70-0c00-471e-85bf-2d4a96646d15-kube-api-access-dgff4" (OuterVolumeSpecName: "kube-api-access-dgff4") pod "ee758f70-0c00-471e-85bf-2d4a96646d15" (UID: "ee758f70-0c00-471e-85bf-2d4a96646d15"). InnerVolumeSpecName "kube-api-access-dgff4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.088143 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "62399e6a-577a-4d10-b057-49c4bae7a172" (UID: "62399e6a-577a-4d10-b057-49c4bae7a172"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.125930 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62399e6a-577a-4d10-b057-49c4bae7a172-kube-api-access-v67vp" (OuterVolumeSpecName: "kube-api-access-v67vp") pod "62399e6a-577a-4d10-b057-49c4bae7a172" (UID: "62399e6a-577a-4d10-b057-49c4bae7a172"). InnerVolumeSpecName "kube-api-access-v67vp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.128855 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60c19006-e7b7-4c36-847a-a52358ae6a99-kube-api-access-bnsdv" (OuterVolumeSpecName: "kube-api-access-bnsdv") pod "60c19006-e7b7-4c36-847a-a52358ae6a99" (UID: "60c19006-e7b7-4c36-847a-a52358ae6a99"). InnerVolumeSpecName "kube-api-access-bnsdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.135754 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7d3c569a-33c9-46a0-8461-e69315fbd20b" (UID: "7d3c569a-33c9-46a0-8461-e69315fbd20b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.136091 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ac080b76-32d3-498c-9832-d31494c1d21f" (UID: "ac080b76-32d3-498c-9832-d31494c1d21f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.136244 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d3c569a-33c9-46a0-8461-e69315fbd20b-kube-api-access-k8g9p" (OuterVolumeSpecName: "kube-api-access-k8g9p") pod "7d3c569a-33c9-46a0-8461-e69315fbd20b" (UID: "7d3c569a-33c9-46a0-8461-e69315fbd20b"). InnerVolumeSpecName "kube-api-access-k8g9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.140440 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-scripts" (OuterVolumeSpecName: "scripts") pod "62399e6a-577a-4d10-b057-49c4bae7a172" (UID: "62399e6a-577a-4d10-b057-49c4bae7a172"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.154195 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.158698 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v67vp\" (UniqueName: \"kubernetes.io/projected/62399e6a-577a-4d10-b057-49c4bae7a172-kube-api-access-v67vp\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.158794 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8g9p\" (UniqueName: \"kubernetes.io/projected/7d3c569a-33c9-46a0-8461-e69315fbd20b-kube-api-access-k8g9p\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.158865 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.158923 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62399e6a-577a-4d10-b057-49c4bae7a172-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.158976 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.159035 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.159097 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac080b76-32d3-498c-9832-d31494c1d21f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.159152 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq882\" (UniqueName: \"kubernetes.io/projected/70527d7e-feb4-4821-b20d-74d9634ab124-kube-api-access-nq882\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.159294 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60c19006-e7b7-4c36-847a-a52358ae6a99-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.159747 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70527d7e-feb4-4821-b20d-74d9634ab124-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.159814 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.159896 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d3c569a-33c9-46a0-8461-e69315fbd20b-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.159957 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.160020 4747 
reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62399e6a-577a-4d10-b057-49c4bae7a172-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.160076 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgff4\" (UniqueName: \"kubernetes.io/projected/ee758f70-0c00-471e-85bf-2d4a96646d15-kube-api-access-dgff4\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.160185 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnsdv\" (UniqueName: \"kubernetes.io/projected/60c19006-e7b7-4c36-847a-a52358ae6a99-kube-api-access-bnsdv\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.177738 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.213223 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicancd71-account-delete-gjncb" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.219997 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d3c569a-33c9-46a0-8461-e69315fbd20b" (UID: "7d3c569a-33c9-46a0-8461-e69315fbd20b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.231140 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62399e6a-577a-4d10-b057-49c4bae7a172" (UID: "62399e6a-577a-4d10-b057-49c4bae7a172"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.246989 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "466afd16-6e7d-42fd-bd82-cabab660b344" (UID: "466afd16-6e7d-42fd-bd82-cabab660b344"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.267458 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-httpd-run\") pod \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.267534 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgwzn\" (UniqueName: \"kubernetes.io/projected/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-kube-api-access-mgwzn\") pod \"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9\" (UID: \"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.267638 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-scripts\") pod \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.267744 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-combined-ca-bundle\") pod \"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9\" (UID: \"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.267806 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-logs\") pod \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.267848 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-combined-ca-bundle\") pod \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.267941 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-internal-tls-certs\") pod \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.267982 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqz5t\" (UniqueName: \"kubernetes.io/projected/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-kube-api-access-xqz5t\") pod \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.268004 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-config-data\") pod \"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9\" (UID: \"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.268054 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " Dec 05 
21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.268078 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-config-data\") pod \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\" (UID: \"33e4dab2-e2a7-4fe3-949a-6a31460e11ba\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.268574 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.268616 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.268629 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.270211 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-logs" (OuterVolumeSpecName: "logs") pod "33e4dab2-e2a7-4fe3-949a-6a31460e11ba" (UID: "33e4dab2-e2a7-4fe3-949a-6a31460e11ba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.270474 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "33e4dab2-e2a7-4fe3-949a-6a31460e11ba" (UID: "33e4dab2-e2a7-4fe3-949a-6a31460e11ba"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.271764 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70527d7e-feb4-4821-b20d-74d9634ab124" (UID: "70527d7e-feb4-4821-b20d-74d9634ab124"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.277945 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-scripts" (OuterVolumeSpecName: "scripts") pod "33e4dab2-e2a7-4fe3-949a-6a31460e11ba" (UID: "33e4dab2-e2a7-4fe3-949a-6a31460e11ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.279711 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-kube-api-access-xqz5t" (OuterVolumeSpecName: "kube-api-access-xqz5t") pod "33e4dab2-e2a7-4fe3-949a-6a31460e11ba" (UID: "33e4dab2-e2a7-4fe3-949a-6a31460e11ba"). InnerVolumeSpecName "kube-api-access-xqz5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.285436 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7d3c569a-33c9-46a0-8461-e69315fbd20b" (UID: "7d3c569a-33c9-46a0-8461-e69315fbd20b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.287495 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "33e4dab2-e2a7-4fe3-949a-6a31460e11ba" (UID: "33e4dab2-e2a7-4fe3-949a-6a31460e11ba"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.301917 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-kube-api-access-mgwzn" (OuterVolumeSpecName: "kube-api-access-mgwzn") pod "2b545cc7-5bb8-499c-83da-93fe7cbbd6f9" (UID: "2b545cc7-5bb8-499c-83da-93fe7cbbd6f9"). InnerVolumeSpecName "kube-api-access-mgwzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.342008 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee758f70-0c00-471e-85bf-2d4a96646d15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee758f70-0c00-471e-85bf-2d4a96646d15" (UID: "ee758f70-0c00-471e-85bf-2d4a96646d15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.349937 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60c19006-e7b7-4c36-847a-a52358ae6a99" (UID: "60c19006-e7b7-4c36-847a-a52358ae6a99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.370529 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2f34a29-aa43-46a7-9fa8-8eead0c7ba07-operator-scripts\") pod \"b2f34a29-aa43-46a7-9fa8-8eead0c7ba07\" (UID: \"b2f34a29-aa43-46a7-9fa8-8eead0c7ba07\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.370596 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gkkr\" (UniqueName: \"kubernetes.io/projected/b2f34a29-aa43-46a7-9fa8-8eead0c7ba07-kube-api-access-9gkkr\") pod \"b2f34a29-aa43-46a7-9fa8-8eead0c7ba07\" (UID: \"b2f34a29-aa43-46a7-9fa8-8eead0c7ba07\") " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.371211 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.371234 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.371244 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.371256 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.371269 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.371280 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqz5t\" (UniqueName: \"kubernetes.io/projected/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-kube-api-access-xqz5t\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.371302 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.371311 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee758f70-0c00-471e-85bf-2d4a96646d15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.371319 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.371329 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgwzn\" (UniqueName: \"kubernetes.io/projected/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-kube-api-access-mgwzn\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.372356 4747 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2f34a29-aa43-46a7-9fa8-8eead0c7ba07-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2f34a29-aa43-46a7-9fa8-8eead0c7ba07" (UID: "b2f34a29-aa43-46a7-9fa8-8eead0c7ba07"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.395526 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c7dd97d4-lfdpd" event={"ID":"7d3c569a-33c9-46a0-8461-e69315fbd20b","Type":"ContainerDied","Data":"421b7283cd0cebcf9d4eddc8c84a78ffccc619d963196165ccc2c347799a9b4c"} Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.395635 4747 scope.go:117] "RemoveContainer" containerID="da5fc8501fba7bc707db385bce6aba964f09ddb107d30488526ab95f96cb2e21" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.395740 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76c7dd97d4-lfdpd" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.396120 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2f34a29-aa43-46a7-9fa8-8eead0c7ba07-kube-api-access-9gkkr" (OuterVolumeSpecName: "kube-api-access-9gkkr") pod "b2f34a29-aa43-46a7-9fa8-8eead0c7ba07" (UID: "b2f34a29-aa43-46a7-9fa8-8eead0c7ba07"). InnerVolumeSpecName "kube-api-access-9gkkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.400981 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60c19006-e7b7-4c36-847a-a52358ae6a99","Type":"ContainerDied","Data":"2c84aa6a831fb3f2d3ce2ddc737b5bfc3edfce5147112056eced45bfd85b4b4d"} Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.402760 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.408768 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-config-data" (OuterVolumeSpecName: "config-data") pod "60c19006-e7b7-4c36-847a-a52358ae6a99" (UID: "60c19006-e7b7-4c36-847a-a52358ae6a99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.409675 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee758f70-0c00-471e-85bf-2d4a96646d15-config-data" (OuterVolumeSpecName: "config-data") pod "ee758f70-0c00-471e-85bf-2d4a96646d15" (UID: "ee758f70-0c00-471e-85bf-2d4a96646d15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.411428 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"33e4dab2-e2a7-4fe3-949a-6a31460e11ba","Type":"ContainerDied","Data":"1d8195deffeb63525fad9efc5602c4d8f4f7af6ef7d284a70972270a3ff3071c"} Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.411927 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.417692 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70527d7e-feb4-4821-b20d-74d9634ab124","Type":"ContainerDied","Data":"15b3ca332b9ebb3bd1e9a69ad76916c072b50d1a01360ce6e7c84bdb064b0fb6"} Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.417706 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.418327 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-config-data" (OuterVolumeSpecName: "config-data") pod "62399e6a-577a-4d10-b057-49c4bae7a172" (UID: "62399e6a-577a-4d10-b057-49c4bae7a172"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.419368 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicancd71-account-delete-gjncb" event={"ID":"b2f34a29-aa43-46a7-9fa8-8eead0c7ba07","Type":"ContainerDied","Data":"2932a232c6b2afc535d6d963d2f9fc066b0f730421b5d8318741cd8f7fdac370"} Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.419394 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2932a232c6b2afc535d6d963d2f9fc066b0f730421b5d8318741cd8f7fdac370" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.419446 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicancd71-account-delete-gjncb" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.428715 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="27a8861ae720657d5c1aabed46d192906d0631fb9e46de40cae1199d706d1642" exitCode=0 Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.428774 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"27a8861ae720657d5c1aabed46d192906d0631fb9e46de40cae1199d706d1642"} Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.428799 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"} Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.430117 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.430183 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ee758f70-0c00-471e-85bf-2d4a96646d15","Type":"ContainerDied","Data":"1ab9cd12be84961b02a69bd0db8327f1366119f08032f65fa294552f7ceb9d94"} Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.432924 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glancee01d-account-delete-6jnvw" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.433196 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancee01d-account-delete-6jnvw" event={"ID":"722451d6-f405-4bdc-b418-bfdaa2000197","Type":"ContainerDied","Data":"186eb09751cd1680db4e433988b982d71f3ecb87d804d1c30e7d0bc5de14615b"} Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.433215 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="186eb09751cd1680db4e433988b982d71f3ecb87d804d1c30e7d0bc5de14615b" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.447998 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac080b76-32d3-498c-9832-d31494c1d21f","Type":"ContainerDied","Data":"a854187f0d7db09bf0941f51b4103aa49e8faef8fa46e11ee0262bf1e1319006"} Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.448634 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.449627 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.459083 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"62399e6a-577a-4d10-b057-49c4bae7a172","Type":"ContainerDied","Data":"d0780bb0583af844986dba554f2c952c6bf113147c39f365faca7c60be6fb605"} Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.459130 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.462804 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-config-data" (OuterVolumeSpecName: "config-data") pod "2b545cc7-5bb8-499c-83da-93fe7cbbd6f9" (UID: "2b545cc7-5bb8-499c-83da-93fe7cbbd6f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.477283 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.477315 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2f34a29-aa43-46a7-9fa8-8eead0c7ba07-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.477324 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gkkr\" (UniqueName: \"kubernetes.io/projected/b2f34a29-aa43-46a7-9fa8-8eead0c7ba07-kube-api-access-9gkkr\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.477333 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.477341 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.477350 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee758f70-0c00-471e-85bf-2d4a96646d15-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.477360 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.491163 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "62399e6a-577a-4d10-b057-49c4bae7a172" (UID: "62399e6a-577a-4d10-b057-49c4bae7a172"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.492785 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-646c94b678-5975b" event={"ID":"466afd16-6e7d-42fd-bd82-cabab660b344","Type":"ContainerDied","Data":"2007e611b3a8556fb4e9c90464ab69dd6ee17db9a61483225035a082c5539bb7"} Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.492857 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-646c94b678-5975b" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.501104 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b545cc7-5bb8-499c-83da-93fe7cbbd6f9","Type":"ContainerDied","Data":"ff16ee1bcbefc5b718ca75caa07bd7eb87765d146569669e8a795a2be1beefd2"} Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.501317 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.504998 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "466afd16-6e7d-42fd-bd82-cabab660b344" (UID: "466afd16-6e7d-42fd-bd82-cabab660b344"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.506569 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapi48b7-account-delete-25wjk" podUID="bcb08da7-0b5d-437e-8882-4d352f3c2d47" containerName="mariadb-account-delete" containerID="cri-o://b0466ca59fc3ca16b0b944d473918bba58affb8a8c5a9027752be5d97b5355b9" gracePeriod=30 Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.507640 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.513077 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"03e0ca3b-083d-477b-a227-dc70e5204444","Type":"ContainerDied","Data":"2b2ea54234df653259ae3f1cf08da1e167803b15d41d767e0a0595025d6d609c"} Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.513269 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron7680-account-delete-f82hm" podUID="6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a" containerName="mariadb-account-delete" containerID="cri-o://cafef2d2e736d3f1b68d75130f00554b04edc034c17ce240b5d3cb8bd345520d" gracePeriod=30 Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.513442 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement8da4-account-delete-8nv8m" podUID="67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0" containerName="mariadb-account-delete" containerID="cri-o://bc88be3138ad806a1fad25c0703151fdaa867a84929b19340a4803a430b8c8b6" gracePeriod=30 Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.513545 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell033f5-account-delete-dk9n6" podUID="51241ef0-cffe-40ba-bea9-1c01b7cc4184" containerName="mariadb-account-delete" containerID="cri-o://d2565391e687a35ba3735f19333fba1cc605e1fa9021ef94582bdc7581618a29" gracePeriod=30 Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.541488 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "466afd16-6e7d-42fd-bd82-cabab660b344" (UID: "466afd16-6e7d-42fd-bd82-cabab660b344"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.566495 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-config-data" (OuterVolumeSpecName: "config-data") pod "70527d7e-feb4-4821-b20d-74d9634ab124" (UID: "70527d7e-feb4-4821-b20d-74d9634ab124"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.579112 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.579152 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.579165 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.579178 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/466afd16-6e7d-42fd-bd82-cabab660b344-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.581639 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-config-data" (OuterVolumeSpecName: "config-data") pod "33e4dab2-e2a7-4fe3-949a-6a31460e11ba" (UID: "33e4dab2-e2a7-4fe3-949a-6a31460e11ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.581784 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33e4dab2-e2a7-4fe3-949a-6a31460e11ba" (UID: "33e4dab2-e2a7-4fe3-949a-6a31460e11ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.603788 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7d3c569a-33c9-46a0-8461-e69315fbd20b" (UID: "7d3c569a-33c9-46a0-8461-e69315fbd20b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.611723 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-config-data" (OuterVolumeSpecName: "config-data") pod "7d3c569a-33c9-46a0-8461-e69315fbd20b" (UID: "7d3c569a-33c9-46a0-8461-e69315fbd20b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.618083 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "60c19006-e7b7-4c36-847a-a52358ae6a99" (UID: "60c19006-e7b7-4c36-847a-a52358ae6a99"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.625971 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b545cc7-5bb8-499c-83da-93fe7cbbd6f9" (UID: "2b545cc7-5bb8-499c-83da-93fe7cbbd6f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.627740 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "62399e6a-577a-4d10-b057-49c4bae7a172" (UID: "62399e6a-577a-4d10-b057-49c4bae7a172"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.650949 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "70527d7e-feb4-4821-b20d-74d9634ab124" (UID: "70527d7e-feb4-4821-b20d-74d9634ab124"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.652180 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "60c19006-e7b7-4c36-847a-a52358ae6a99" (UID: "60c19006-e7b7-4c36-847a-a52358ae6a99"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.676817 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "33e4dab2-e2a7-4fe3-949a-6a31460e11ba" (UID: "33e4dab2-e2a7-4fe3-949a-6a31460e11ba"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.680488 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.680519 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.680534 4747 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70527d7e-feb4-4821-b20d-74d9634ab124-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.680546 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.680557 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c19006-e7b7-4c36-847a-a52358ae6a99-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.680568 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.680592 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.680602 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62399e6a-577a-4d10-b057-49c4bae7a172-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.680610 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e4dab2-e2a7-4fe3-949a-6a31460e11ba-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.680618 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3c569a-33c9-46a0-8461-e69315fbd20b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.733511 4747 scope.go:117] "RemoveContainer" containerID="e81f5c9fbda1eb9ab9778805ee52bf5015c30d3532093b0e7bc22a3a44f23a55" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.808174 4747 scope.go:117] "RemoveContainer" containerID="b4ea3b3b35eb186a6415e9ce4ffe9c01f8217ca9dfbb6942810043a66549038a" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.810551 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicancd71-account-delete-gjncb"] Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.829224 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbicancd71-account-delete-gjncb"] Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.859390 4747 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/dnsmasq-dns-5c9cbcb645-8wqh8" podUID="bf40d0cc-0d14-4c55-986d-2809df27c4fd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.194:5353: i/o timeout" Dec 05 21:05:07 crc kubenswrapper[4747]: E1205 21:05:07.988243 4747 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 05 21:05:07 crc kubenswrapper[4747]: E1205 21:05:07.988315 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data podName:70db507e-cc84-4722-8ac8-fd659c2803b8 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:15.988297682 +0000 UTC m=+1386.455605170 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data") pod "rabbitmq-cell1-server-0" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8") : configmap "rabbitmq-cell1-config-data" not found Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.992997 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40" path="/var/lib/kubelet/pods/0119fc0d-6cd2-4aa8-aef8-4ab979e9ca40/volumes" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.993871 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="278246ea-04b9-4694-bb8c-5c67503c17e4" path="/var/lib/kubelet/pods/278246ea-04b9-4694-bb8c-5c67503c17e4/volumes" Dec 05 21:05:07 crc kubenswrapper[4747]: I1205 21:05:07.994627 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="362c3d2b-5405-46e3-aa99-90f7b010dfd3" path="/var/lib/kubelet/pods/362c3d2b-5405-46e3-aa99-90f7b010dfd3/volumes" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.003537 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53f0706f-507e-4360-b83e-9dbfacee3144" path="/var/lib/kubelet/pods/53f0706f-507e-4360-b83e-9dbfacee3144/volumes" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.003805 4747 scope.go:117] "RemoveContainer" containerID="830bd4d5341560a621378cd94f40e4c4b0401787664fe8a6b8244fa2e35b90f2" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.004872 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5da23b71-6d79-4345-87e9-3528332fe50d" path="/var/lib/kubelet/pods/5da23b71-6d79-4345-87e9-3528332fe50d/volumes" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.005851 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60728349-9aa2-474c-b935-6fc915822d4e" path="/var/lib/kubelet/pods/60728349-9aa2-474c-b935-6fc915822d4e/volumes" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.006889 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ee21b51-8d3a-4942-b004-53d5337da918" path="/var/lib/kubelet/pods/7ee21b51-8d3a-4942-b004-53d5337da918/volumes" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.008929 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82c51dab-4948-4ca3-94ba-c25cb3a4e280" path="/var/lib/kubelet/pods/82c51dab-4948-4ca3-94ba-c25cb3a4e280/volumes" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.009985 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="843a1ea2-9e25-4649-9375-9b41f492055c" path="/var/lib/kubelet/pods/843a1ea2-9e25-4649-9375-9b41f492055c/volumes" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.011100 4747 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2f34a29-aa43-46a7-9fa8-8eead0c7ba07" path="/var/lib/kubelet/pods/b2f34a29-aa43-46a7-9fa8-8eead0c7ba07/volumes" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.014683 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bafb6071-80cb-4a67-bb33-bc45f069937c" path="/var/lib/kubelet/pods/bafb6071-80cb-4a67-bb33-bc45f069937c/volumes" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.015473 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c14b974a-62ea-4e9d-b0e2-9ab4031c35fa" path="/var/lib/kubelet/pods/c14b974a-62ea-4e9d-b0e2-9ab4031c35fa/volumes" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.016248 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5bfe363-0d37-4380-93a8-dc7ea1ad3392" path="/var/lib/kubelet/pods/f5bfe363-0d37-4380-93a8-dc7ea1ad3392/volumes" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.018230 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.018266 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.018286 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.018299 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.018312 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancee01d-account-delete-6jnvw"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.018324 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glancee01d-account-delete-6jnvw"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.018335 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.018349 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.018360 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.018372 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.018383 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.027497 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.035606 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.044516 4747 scope.go:117] "RemoveContainer" containerID="90486f36ff75a740556644fedc7ad1ada74d9386b6373ae4001a5f28f3f94e44" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.046899 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.055175 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.059103 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.067348 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.075831 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-76c7dd97d4-lfdpd"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.083289 4747 scope.go:117] "RemoveContainer" containerID="d7464171649f3f66dfa83f9f9dffbb9397e5a9c9592b2d221f844436a374209d" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.085111 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-76c7dd97d4-lfdpd"] Dec 05 21:05:08 crc kubenswrapper[4747]: E1205 21:05:08.090658 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.094648 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-646c94b678-5975b"] Dec 05 21:05:08 crc kubenswrapper[4747]: E1205 21:05:08.095881 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.099286 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-646c94b678-5975b"] Dec 05 21:05:08 crc kubenswrapper[4747]: E1205 21:05:08.100885 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 21:05:08 crc kubenswrapper[4747]: E1205 21:05:08.100952 4747 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="0ae4d823-7941-4e9e-ae9d-cce0297e278d" containerName="nova-cell0-conductor-conductor" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.107573 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.110167 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.112272 4747 scope.go:117] "RemoveContainer" containerID="a0b910eeeb5d4311cefe6de469772978a62f3bc013456ed000b893110cb13617" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.148428 4747 scope.go:117] "RemoveContainer" containerID="5609f29b50c7c1c36727d3f9b37fe61125617bad5be7ef04dd273321702420ba" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.170284 4747 scope.go:117] "RemoveContainer" containerID="7110e5080e3d96f1039156c86380e9d5d08ce0c06b6df7c2aa14c96f4b79a9a1" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.193257 4747 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-config-data-default\") pod \"684a964a-1f18-4648-b8c7-6c8c818e16bd\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.193310 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvl2s\" (UniqueName: \"kubernetes.io/projected/684a964a-1f18-4648-b8c7-6c8c818e16bd-kube-api-access-wvl2s\") pod \"684a964a-1f18-4648-b8c7-6c8c818e16bd\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.193330 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684a964a-1f18-4648-b8c7-6c8c818e16bd-combined-ca-bundle\") pod \"684a964a-1f18-4648-b8c7-6c8c818e16bd\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.193406 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-operator-scripts\") pod \"684a964a-1f18-4648-b8c7-6c8c818e16bd\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.193444 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/684a964a-1f18-4648-b8c7-6c8c818e16bd-galera-tls-certs\") pod \"684a964a-1f18-4648-b8c7-6c8c818e16bd\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.193479 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"684a964a-1f18-4648-b8c7-6c8c818e16bd\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.193498 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-kolla-config\") pod \"684a964a-1f18-4648-b8c7-6c8c818e16bd\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.193647 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/684a964a-1f18-4648-b8c7-6c8c818e16bd-config-data-generated\") pod \"684a964a-1f18-4648-b8c7-6c8c818e16bd\" (UID: \"684a964a-1f18-4648-b8c7-6c8c818e16bd\") " Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.194370 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/684a964a-1f18-4648-b8c7-6c8c818e16bd-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "684a964a-1f18-4648-b8c7-6c8c818e16bd" (UID: "684a964a-1f18-4648-b8c7-6c8c818e16bd"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.195143 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "684a964a-1f18-4648-b8c7-6c8c818e16bd" (UID: "684a964a-1f18-4648-b8c7-6c8c818e16bd"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.196203 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "684a964a-1f18-4648-b8c7-6c8c818e16bd" (UID: "684a964a-1f18-4648-b8c7-6c8c818e16bd"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.196532 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "684a964a-1f18-4648-b8c7-6c8c818e16bd" (UID: "684a964a-1f18-4648-b8c7-6c8c818e16bd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.199285 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/684a964a-1f18-4648-b8c7-6c8c818e16bd-kube-api-access-wvl2s" (OuterVolumeSpecName: "kube-api-access-wvl2s") pod "684a964a-1f18-4648-b8c7-6c8c818e16bd" (UID: "684a964a-1f18-4648-b8c7-6c8c818e16bd"). InnerVolumeSpecName "kube-api-access-wvl2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.203611 4747 scope.go:117] "RemoveContainer" containerID="bf9fa6b0cf24ff77f4a354a1cad76271ec89c26ec39baabf45982f119b448d4e" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.213349 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "684a964a-1f18-4648-b8c7-6c8c818e16bd" (UID: "684a964a-1f18-4648-b8c7-6c8c818e16bd"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.222352 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684a964a-1f18-4648-b8c7-6c8c818e16bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "684a964a-1f18-4648-b8c7-6c8c818e16bd" (UID: "684a964a-1f18-4648-b8c7-6c8c818e16bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.231293 4747 scope.go:117] "RemoveContainer" containerID="1bc4fa29702116af6fcd59bcbbca74418678e340a4d1a0b7cd112308e8bc0b4e" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.242763 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684a964a-1f18-4648-b8c7-6c8c818e16bd-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "684a964a-1f18-4648-b8c7-6c8c818e16bd" (UID: "684a964a-1f18-4648-b8c7-6c8c818e16bd"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.256450 4747 scope.go:117] "RemoveContainer" containerID="541e9e17ac501f698eab7f861a294ddf49e04de6ec4df244e43f29b858e30226" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.280721 4747 scope.go:117] "RemoveContainer" containerID="7ad7169da0407cc9fa1fb2412f29f1fddcb02d5a765a42193125782a41cbb169" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.300808 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.300842 4747 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/684a964a-1f18-4648-b8c7-6c8c818e16bd-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.300868 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.300877 4747 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.300886 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/684a964a-1f18-4648-b8c7-6c8c818e16bd-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.300895 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/684a964a-1f18-4648-b8c7-6c8c818e16bd-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.300903 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvl2s\" (UniqueName: \"kubernetes.io/projected/684a964a-1f18-4648-b8c7-6c8c818e16bd-kube-api-access-wvl2s\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.300912 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684a964a-1f18-4648-b8c7-6c8c818e16bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.302062 4747 scope.go:117] "RemoveContainer" containerID="454cb64b4de071bc5408ffd45978d6472ff63dcb85495e7423c6ead0953090f8" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.325383 4747 scope.go:117] "RemoveContainer" containerID="450bad02541ca9f2045559c849365b5204f10379fb98f7f20ee63d82eaedc639" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.327385 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.372442 4747 scope.go:117] "RemoveContainer" containerID="44bc0bd3c6f0a47cd14c827e342132010b773d5ce7c6430fcd671f4d1d9ea58a" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.397378 4747 scope.go:117] "RemoveContainer" containerID="1ca555e9c34cfc96cf0c94575486e1a9582744067ce3a91bdb99945c6f7243b5" Dec 05 21:05:08 crc 
kubenswrapper[4747]: I1205 21:05:08.402363 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:08 crc kubenswrapper[4747]: E1205 21:05:08.402453 4747 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 05 21:05:08 crc kubenswrapper[4747]: E1205 21:05:08.402518 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data podName:d49bd09d-90af-4f00-a333-0e292c581525 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:16.402502471 +0000 UTC m=+1386.869809959 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data") pod "rabbitmq-server-0" (UID: "d49bd09d-90af-4f00-a333-0e292c581525") : configmap "rabbitmq-config-data" not found Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.421684 4747 scope.go:117] "RemoveContainer" containerID="2cb80132fac67dd10f92c4e06ec0d58fbd16b1a14243d5a1da9696abdd3d83b7" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.532554 4747 generic.go:334] "Generic (PLEG): container finished" podID="684a964a-1f18-4648-b8c7-6c8c818e16bd" containerID="49c9b5afbbe1d53dba130d5c1b62256f8de83335c25de43c8361be8a01f7d421" exitCode=0 Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.532642 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"684a964a-1f18-4648-b8c7-6c8c818e16bd","Type":"ContainerDied","Data":"49c9b5afbbe1d53dba130d5c1b62256f8de83335c25de43c8361be8a01f7d421"} Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.532686 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"684a964a-1f18-4648-b8c7-6c8c818e16bd","Type":"ContainerDied","Data":"ab962482f76935b4502ee2790ca852a89f44724a1702720ed7d26e2bddab3487"} Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.532704 4747 scope.go:117] "RemoveContainer" containerID="49c9b5afbbe1d53dba130d5c1b62256f8de83335c25de43c8361be8a01f7d421" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.532840 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.741269 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.745219 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 21:05:08 crc kubenswrapper[4747]: E1205 21:05:08.829555 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:08 crc kubenswrapper[4747]: E1205 21:05:08.829634 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts podName:bcb08da7-0b5d-437e-8882-4d352f3c2d47 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:12.829618679 +0000 UTC m=+1383.296926157 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts") pod "novaapi48b7-account-delete-25wjk" (UID: "bcb08da7-0b5d-437e-8882-4d352f3c2d47") : configmap "openstack-scripts" not found Dec 05 21:05:08 crc kubenswrapper[4747]: E1205 21:05:08.932043 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:08 crc kubenswrapper[4747]: E1205 21:05:08.932081 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:08 crc kubenswrapper[4747]: E1205 21:05:08.932096 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts podName:51241ef0-cffe-40ba-bea9-1c01b7cc4184 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:12.932080881 +0000 UTC m=+1383.399388369 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts") pod "novacell033f5-account-delete-dk9n6" (UID: "51241ef0-cffe-40ba-bea9-1c01b7cc4184") : configmap "openstack-scripts" not found Dec 05 21:05:08 crc kubenswrapper[4747]: E1205 21:05:08.932133 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts podName:6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a nodeName:}" failed. No retries permitted until 2025-12-05 21:05:12.932118082 +0000 UTC m=+1383.399425570 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts") pod "neutron7680-account-delete-f82hm" (UID: "6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a") : configmap "openstack-scripts" not found Dec 05 21:05:08 crc kubenswrapper[4747]: I1205 21:05:08.972857 4747 scope.go:117] "RemoveContainer" containerID="cee35f79d0be8b610559ca6a26cbab40df575f856346a5dfaf22f8145550fdc3" Dec 05 21:05:09 crc kubenswrapper[4747]: E1205 21:05:09.032419 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:09 crc kubenswrapper[4747]: E1205 21:05:09.032514 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts podName:67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:13.032493944 +0000 UTC m=+1383.499801442 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts") pod "placement8da4-account-delete-8nv8m" (UID: "67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0") : configmap "openstack-scripts" not found Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.104523 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.119169 4747 scope.go:117] "RemoveContainer" containerID="49c9b5afbbe1d53dba130d5c1b62256f8de83335c25de43c8361be8a01f7d421" Dec 05 21:05:09 crc kubenswrapper[4747]: E1205 21:05:09.119634 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49c9b5afbbe1d53dba130d5c1b62256f8de83335c25de43c8361be8a01f7d421\": container with ID starting with 49c9b5afbbe1d53dba130d5c1b62256f8de83335c25de43c8361be8a01f7d421 not found: ID does not exist" containerID="49c9b5afbbe1d53dba130d5c1b62256f8de83335c25de43c8361be8a01f7d421" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.119662 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c9b5afbbe1d53dba130d5c1b62256f8de83335c25de43c8361be8a01f7d421"} err="failed to get container status \"49c9b5afbbe1d53dba130d5c1b62256f8de83335c25de43c8361be8a01f7d421\": rpc error: code = NotFound desc = could not find container \"49c9b5afbbe1d53dba130d5c1b62256f8de83335c25de43c8361be8a01f7d421\": container with ID starting with 49c9b5afbbe1d53dba130d5c1b62256f8de83335c25de43c8361be8a01f7d421 not found: ID does not exist" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.119685 4747 scope.go:117] "RemoveContainer" containerID="cee35f79d0be8b610559ca6a26cbab40df575f856346a5dfaf22f8145550fdc3" Dec 05 21:05:09 crc kubenswrapper[4747]: E1205 21:05:09.132943 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cee35f79d0be8b610559ca6a26cbab40df575f856346a5dfaf22f8145550fdc3\": container with ID starting with cee35f79d0be8b610559ca6a26cbab40df575f856346a5dfaf22f8145550fdc3 not found: ID does not exist" containerID="cee35f79d0be8b610559ca6a26cbab40df575f856346a5dfaf22f8145550fdc3" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.132992 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cee35f79d0be8b610559ca6a26cbab40df575f856346a5dfaf22f8145550fdc3"} err="failed to get container status \"cee35f79d0be8b610559ca6a26cbab40df575f856346a5dfaf22f8145550fdc3\": rpc error: code = NotFound desc = could not find container \"cee35f79d0be8b610559ca6a26cbab40df575f856346a5dfaf22f8145550fdc3\": container with ID starting with cee35f79d0be8b610559ca6a26cbab40df575f856346a5dfaf22f8145550fdc3 not found: ID does not exist" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.238647 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-plugins\") pod \"d49bd09d-90af-4f00-a333-0e292c581525\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.238686 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"d49bd09d-90af-4f00-a333-0e292c581525\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.238721 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62dbq\" (UniqueName: \"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-kube-api-access-62dbq\") pod \"d49bd09d-90af-4f00-a333-0e292c581525\" (UID: 
\"d49bd09d-90af-4f00-a333-0e292c581525\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.238756 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-tls\") pod \"d49bd09d-90af-4f00-a333-0e292c581525\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.238831 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data\") pod \"d49bd09d-90af-4f00-a333-0e292c581525\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.238902 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-confd\") pod \"d49bd09d-90af-4f00-a333-0e292c581525\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.238932 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-erlang-cookie\") pod \"d49bd09d-90af-4f00-a333-0e292c581525\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.238960 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-server-conf\") pod \"d49bd09d-90af-4f00-a333-0e292c581525\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.238997 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d49bd09d-90af-4f00-a333-0e292c581525-pod-info\") pod \"d49bd09d-90af-4f00-a333-0e292c581525\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.239035 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d49bd09d-90af-4f00-a333-0e292c581525-erlang-cookie-secret\") pod \"d49bd09d-90af-4f00-a333-0e292c581525\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.239061 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-plugins-conf\") pod \"d49bd09d-90af-4f00-a333-0e292c581525\" (UID: \"d49bd09d-90af-4f00-a333-0e292c581525\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.239053 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d49bd09d-90af-4f00-a333-0e292c581525" (UID: "d49bd09d-90af-4f00-a333-0e292c581525"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.248045 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d49bd09d-90af-4f00-a333-0e292c581525" (UID: "d49bd09d-90af-4f00-a333-0e292c581525"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.250933 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d49bd09d-90af-4f00-a333-0e292c581525" (UID: "d49bd09d-90af-4f00-a333-0e292c581525"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.260040 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d49bd09d-90af-4f00-a333-0e292c581525-pod-info" (OuterVolumeSpecName: "pod-info") pod "d49bd09d-90af-4f00-a333-0e292c581525" (UID: "d49bd09d-90af-4f00-a333-0e292c581525"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.267440 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d49bd09d-90af-4f00-a333-0e292c581525-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d49bd09d-90af-4f00-a333-0e292c581525" (UID: "d49bd09d-90af-4f00-a333-0e292c581525"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.269647 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "d49bd09d-90af-4f00-a333-0e292c581525" (UID: "d49bd09d-90af-4f00-a333-0e292c581525"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.271137 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d49bd09d-90af-4f00-a333-0e292c581525" (UID: "d49bd09d-90af-4f00-a333-0e292c581525"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.304928 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data" (OuterVolumeSpecName: "config-data") pod "d49bd09d-90af-4f00-a333-0e292c581525" (UID: "d49bd09d-90af-4f00-a333-0e292c581525"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.306783 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-kube-api-access-62dbq" (OuterVolumeSpecName: "kube-api-access-62dbq") pod "d49bd09d-90af-4f00-a333-0e292c581525" (UID: "d49bd09d-90af-4f00-a333-0e292c581525"). InnerVolumeSpecName "kube-api-access-62dbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.313462 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-server-conf" (OuterVolumeSpecName: "server-conf") pod "d49bd09d-90af-4f00-a333-0e292c581525" (UID: "d49bd09d-90af-4f00-a333-0e292c581525"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.340870 4747 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.340904 4747 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d49bd09d-90af-4f00-a333-0e292c581525-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.340914 4747 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d49bd09d-90af-4f00-a333-0e292c581525-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.340923 4747 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.340932 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.340957 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.340968 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62dbq\" (UniqueName: \"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-kube-api-access-62dbq\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.340976 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.340985 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d49bd09d-90af-4f00-a333-0e292c581525-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.340993 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.362698 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.411758 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d49bd09d-90af-4f00-a333-0e292c581525" (UID: "d49bd09d-90af-4f00-a333-0e292c581525"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.442500 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.442532 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d49bd09d-90af-4f00-a333-0e292c581525-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.443866 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.579058 4747 generic.go:334] "Generic (PLEG): container finished" podID="70db507e-cc84-4722-8ac8-fd659c2803b8" containerID="591c966cbd67e65bbd7f9df16799bddb9e1c1aeba658fa805c2349bf30d4dd79" exitCode=0 Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.579135 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.579548 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70db507e-cc84-4722-8ac8-fd659c2803b8","Type":"ContainerDied","Data":"591c966cbd67e65bbd7f9df16799bddb9e1c1aeba658fa805c2349bf30d4dd79"} Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.579809 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70db507e-cc84-4722-8ac8-fd659c2803b8","Type":"ContainerDied","Data":"204d13c5b120dd8f33840db73ad85ea7769396cfb30a30e69cdda4aab0ccb346"} Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.579847 4747 scope.go:117] "RemoveContainer" containerID="591c966cbd67e65bbd7f9df16799bddb9e1c1aeba658fa805c2349bf30d4dd79" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.585148 4747 generic.go:334] "Generic (PLEG): container finished" podID="d49bd09d-90af-4f00-a333-0e292c581525" containerID="a1384543807536a88f9af6a29ae8e6d52ea9ff61702daf36b45ab2099a6bce9c" exitCode=0 Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.585244 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d49bd09d-90af-4f00-a333-0e292c581525","Type":"ContainerDied","Data":"a1384543807536a88f9af6a29ae8e6d52ea9ff61702daf36b45ab2099a6bce9c"} Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.585287 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d49bd09d-90af-4f00-a333-0e292c581525","Type":"ContainerDied","Data":"b9fd05f438e03f7aacb5322db9f9e8ef18159d42646955d6467db61ef333dc34"} Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.585306 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.654836 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-plugins\") pod \"70db507e-cc84-4722-8ac8-fd659c2803b8\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.654912 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnhth\" (UniqueName: \"kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-kube-api-access-nnhth\") pod \"70db507e-cc84-4722-8ac8-fd659c2803b8\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.654947 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data\") pod \"70db507e-cc84-4722-8ac8-fd659c2803b8\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.654981 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-plugins-conf\") pod \"70db507e-cc84-4722-8ac8-fd659c2803b8\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.655057 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-server-conf\") pod \"70db507e-cc84-4722-8ac8-fd659c2803b8\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.655079 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"70db507e-cc84-4722-8ac8-fd659c2803b8\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.655110 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70db507e-cc84-4722-8ac8-fd659c2803b8-pod-info\") pod \"70db507e-cc84-4722-8ac8-fd659c2803b8\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.655157 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-tls\") pod \"70db507e-cc84-4722-8ac8-fd659c2803b8\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.655188 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-erlang-cookie\") pod \"70db507e-cc84-4722-8ac8-fd659c2803b8\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.655245 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-confd\") pod \"70db507e-cc84-4722-8ac8-fd659c2803b8\" (UID: 
\"70db507e-cc84-4722-8ac8-fd659c2803b8\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.655272 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70db507e-cc84-4722-8ac8-fd659c2803b8-erlang-cookie-secret\") pod \"70db507e-cc84-4722-8ac8-fd659c2803b8\" (UID: \"70db507e-cc84-4722-8ac8-fd659c2803b8\") " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.655570 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "70db507e-cc84-4722-8ac8-fd659c2803b8" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.655774 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.656700 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "70db507e-cc84-4722-8ac8-fd659c2803b8" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.656763 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "70db507e-cc84-4722-8ac8-fd659c2803b8" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.663401 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/70db507e-cc84-4722-8ac8-fd659c2803b8-pod-info" (OuterVolumeSpecName: "pod-info") pod "70db507e-cc84-4722-8ac8-fd659c2803b8" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.664003 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70db507e-cc84-4722-8ac8-fd659c2803b8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "70db507e-cc84-4722-8ac8-fd659c2803b8" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.664414 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-kube-api-access-nnhth" (OuterVolumeSpecName: "kube-api-access-nnhth") pod "70db507e-cc84-4722-8ac8-fd659c2803b8" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8"). InnerVolumeSpecName "kube-api-access-nnhth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.669531 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "70db507e-cc84-4722-8ac8-fd659c2803b8" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.683843 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "70db507e-cc84-4722-8ac8-fd659c2803b8" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.691056 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data" (OuterVolumeSpecName: "config-data") pod "70db507e-cc84-4722-8ac8-fd659c2803b8" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.706135 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.709966 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.718440 4747 scope.go:117] "RemoveContainer" containerID="c547e1ccdcbdea64d31b3fa0d8e66a77f1b2d50aee1f27b3167cf9f22a5e018f" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.720182 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-server-conf" (OuterVolumeSpecName: "server-conf") pod "70db507e-cc84-4722-8ac8-fd659c2803b8" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.763414 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnhth\" (UniqueName: \"kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-kube-api-access-nnhth\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.763570 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.763682 4747 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.763755 4747 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70db507e-cc84-4722-8ac8-fd659c2803b8-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.763854 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.763946 4747 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70db507e-cc84-4722-8ac8-fd659c2803b8-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.764024 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.764103 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.764181 4747 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70db507e-cc84-4722-8ac8-fd659c2803b8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.785559 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.795749 4747 scope.go:117] "RemoveContainer" containerID="591c966cbd67e65bbd7f9df16799bddb9e1c1aeba658fa805c2349bf30d4dd79" Dec 05 21:05:09 crc kubenswrapper[4747]: E1205 21:05:09.801929 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"591c966cbd67e65bbd7f9df16799bddb9e1c1aeba658fa805c2349bf30d4dd79\": container with ID starting with 591c966cbd67e65bbd7f9df16799bddb9e1c1aeba658fa805c2349bf30d4dd79 not found: ID does not exist" containerID="591c966cbd67e65bbd7f9df16799bddb9e1c1aeba658fa805c2349bf30d4dd79" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.801979 4747 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"591c966cbd67e65bbd7f9df16799bddb9e1c1aeba658fa805c2349bf30d4dd79"} err="failed to get container status \"591c966cbd67e65bbd7f9df16799bddb9e1c1aeba658fa805c2349bf30d4dd79\": rpc error: code = NotFound desc = could not find container \"591c966cbd67e65bbd7f9df16799bddb9e1c1aeba658fa805c2349bf30d4dd79\": container with ID starting with 591c966cbd67e65bbd7f9df16799bddb9e1c1aeba658fa805c2349bf30d4dd79 not found: ID does not exist" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.802003 4747 scope.go:117] "RemoveContainer" containerID="c547e1ccdcbdea64d31b3fa0d8e66a77f1b2d50aee1f27b3167cf9f22a5e018f" Dec 05 21:05:09 crc kubenswrapper[4747]: E1205 21:05:09.805323 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c547e1ccdcbdea64d31b3fa0d8e66a77f1b2d50aee1f27b3167cf9f22a5e018f\": container with ID starting with c547e1ccdcbdea64d31b3fa0d8e66a77f1b2d50aee1f27b3167cf9f22a5e018f not found: ID does not exist" containerID="c547e1ccdcbdea64d31b3fa0d8e66a77f1b2d50aee1f27b3167cf9f22a5e018f" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.805391 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c547e1ccdcbdea64d31b3fa0d8e66a77f1b2d50aee1f27b3167cf9f22a5e018f"} err="failed to get container status \"c547e1ccdcbdea64d31b3fa0d8e66a77f1b2d50aee1f27b3167cf9f22a5e018f\": rpc error: code = NotFound desc = could not find container \"c547e1ccdcbdea64d31b3fa0d8e66a77f1b2d50aee1f27b3167cf9f22a5e018f\": container with ID starting with c547e1ccdcbdea64d31b3fa0d8e66a77f1b2d50aee1f27b3167cf9f22a5e018f not found: ID does not exist" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.805417 4747 scope.go:117] "RemoveContainer" containerID="a1384543807536a88f9af6a29ae8e6d52ea9ff61702daf36b45ab2099a6bce9c" Dec 05 21:05:09 crc kubenswrapper[4747]: E1205 21:05:09.807993 4747 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 05 21:05:09 crc kubenswrapper[4747]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-05T21:05:02Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 05 21:05:09 crc kubenswrapper[4747]: /etc/init.d/functions: line 589: 407 Alarm clock "$@" Dec 05 21:05:09 crc kubenswrapper[4747]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-ns2k6" message=< Dec 05 21:05:09 crc kubenswrapper[4747]: Exiting ovn-controller (1) [FAILED] Dec 05 21:05:09 crc kubenswrapper[4747]: Killing ovn-controller (1) [ OK ] Dec 05 21:05:09 crc kubenswrapper[4747]: 2025-12-05T21:05:02Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 05 21:05:09 crc kubenswrapper[4747]: /etc/init.d/functions: line 589: 407 Alarm clock "$@" Dec 05 21:05:09 crc kubenswrapper[4747]: > Dec 05 21:05:09 crc kubenswrapper[4747]: E1205 21:05:09.808032 4747 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 05 21:05:09 crc kubenswrapper[4747]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-05T21:05:02Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 05 21:05:09 crc kubenswrapper[4747]: /etc/init.d/functions: line 589: 407 Alarm clock "$@" Dec 05 21:05:09 crc kubenswrapper[4747]: > pod="openstack/ovn-controller-ns2k6" podUID="c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" containerName="ovn-controller" 
containerID="cri-o://0e6d2d436da5eb58728480f027b6f9382e5a4e2d75ab50a311d5115475bf0827" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.808077 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ns2k6" podUID="c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" containerName="ovn-controller" containerID="cri-o://0e6d2d436da5eb58728480f027b6f9382e5a4e2d75ab50a311d5115475bf0827" gracePeriod=22 Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.851047 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03e0ca3b-083d-477b-a227-dc70e5204444" path="/var/lib/kubelet/pods/03e0ca3b-083d-477b-a227-dc70e5204444/volumes" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.851561 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b545cc7-5bb8-499c-83da-93fe7cbbd6f9" path="/var/lib/kubelet/pods/2b545cc7-5bb8-499c-83da-93fe7cbbd6f9/volumes" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.852216 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33e4dab2-e2a7-4fe3-949a-6a31460e11ba" path="/var/lib/kubelet/pods/33e4dab2-e2a7-4fe3-949a-6a31460e11ba/volumes" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.853984 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="466afd16-6e7d-42fd-bd82-cabab660b344" path="/var/lib/kubelet/pods/466afd16-6e7d-42fd-bd82-cabab660b344/volumes" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.854644 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60c19006-e7b7-4c36-847a-a52358ae6a99" path="/var/lib/kubelet/pods/60c19006-e7b7-4c36-847a-a52358ae6a99/volumes" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.855277 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62399e6a-577a-4d10-b057-49c4bae7a172" path="/var/lib/kubelet/pods/62399e6a-577a-4d10-b057-49c4bae7a172/volumes" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.856565 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="684a964a-1f18-4648-b8c7-6c8c818e16bd" path="/var/lib/kubelet/pods/684a964a-1f18-4648-b8c7-6c8c818e16bd/volumes" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.857808 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70527d7e-feb4-4821-b20d-74d9634ab124" path="/var/lib/kubelet/pods/70527d7e-feb4-4821-b20d-74d9634ab124/volumes" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.858961 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="722451d6-f405-4bdc-b418-bfdaa2000197" path="/var/lib/kubelet/pods/722451d6-f405-4bdc-b418-bfdaa2000197/volumes" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.859511 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d3c569a-33c9-46a0-8461-e69315fbd20b" path="/var/lib/kubelet/pods/7d3c569a-33c9-46a0-8461-e69315fbd20b/volumes" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.860602 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac080b76-32d3-498c-9832-d31494c1d21f" path="/var/lib/kubelet/pods/ac080b76-32d3-498c-9832-d31494c1d21f/volumes" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.862236 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d49bd09d-90af-4f00-a333-0e292c581525" path="/var/lib/kubelet/pods/d49bd09d-90af-4f00-a333-0e292c581525/volumes" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.862903 4747 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="ee758f70-0c00-471e-85bf-2d4a96646d15" path="/var/lib/kubelet/pods/ee758f70-0c00-471e-85bf-2d4a96646d15/volumes" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.865001 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.904353 4747 scope.go:117] "RemoveContainer" containerID="2a95ab7afe7b5d3fb966c635f9f0ead6c5e7cacb8f8ffa5421f5a4bb80ec9ec3" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.962759 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "70db507e-cc84-4722-8ac8-fd659c2803b8" (UID: "70db507e-cc84-4722-8ac8-fd659c2803b8"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:09 crc kubenswrapper[4747]: I1205 21:05:09.973382 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70db507e-cc84-4722-8ac8-fd659c2803b8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.024434 4747 scope.go:117] "RemoveContainer" containerID="a1384543807536a88f9af6a29ae8e6d52ea9ff61702daf36b45ab2099a6bce9c" Dec 05 21:05:10 crc kubenswrapper[4747]: E1205 21:05:10.024826 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1384543807536a88f9af6a29ae8e6d52ea9ff61702daf36b45ab2099a6bce9c\": container with ID starting with a1384543807536a88f9af6a29ae8e6d52ea9ff61702daf36b45ab2099a6bce9c not found: ID does not exist" containerID="a1384543807536a88f9af6a29ae8e6d52ea9ff61702daf36b45ab2099a6bce9c" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.024857 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1384543807536a88f9af6a29ae8e6d52ea9ff61702daf36b45ab2099a6bce9c"} err="failed to get container status \"a1384543807536a88f9af6a29ae8e6d52ea9ff61702daf36b45ab2099a6bce9c\": rpc error: code = NotFound desc = could not find container \"a1384543807536a88f9af6a29ae8e6d52ea9ff61702daf36b45ab2099a6bce9c\": container with ID starting with a1384543807536a88f9af6a29ae8e6d52ea9ff61702daf36b45ab2099a6bce9c not found: ID does not exist" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.024876 4747 scope.go:117] "RemoveContainer" containerID="2a95ab7afe7b5d3fb966c635f9f0ead6c5e7cacb8f8ffa5421f5a4bb80ec9ec3" Dec 05 21:05:10 crc kubenswrapper[4747]: E1205 21:05:10.025047 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a95ab7afe7b5d3fb966c635f9f0ead6c5e7cacb8f8ffa5421f5a4bb80ec9ec3\": container with ID starting with 2a95ab7afe7b5d3fb966c635f9f0ead6c5e7cacb8f8ffa5421f5a4bb80ec9ec3 not found: ID does not exist" containerID="2a95ab7afe7b5d3fb966c635f9f0ead6c5e7cacb8f8ffa5421f5a4bb80ec9ec3" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.025064 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a95ab7afe7b5d3fb966c635f9f0ead6c5e7cacb8f8ffa5421f5a4bb80ec9ec3"} err="failed to get container status \"2a95ab7afe7b5d3fb966c635f9f0ead6c5e7cacb8f8ffa5421f5a4bb80ec9ec3\": rpc error: code = NotFound desc = could not find container 
\"2a95ab7afe7b5d3fb966c635f9f0ead6c5e7cacb8f8ffa5421f5a4bb80ec9ec3\": container with ID starting with 2a95ab7afe7b5d3fb966c635f9f0ead6c5e7cacb8f8ffa5421f5a4bb80ec9ec3 not found: ID does not exist" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.079166 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.234720 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.277820 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-combined-ca-bundle\") pod \"8be50198-f9c5-4e90-bfa0-d33b502278b7\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.277868 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkgnx\" (UniqueName: \"kubernetes.io/projected/8be50198-f9c5-4e90-bfa0-d33b502278b7-kube-api-access-xkgnx\") pod \"8be50198-f9c5-4e90-bfa0-d33b502278b7\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.277934 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-credential-keys\") pod \"8be50198-f9c5-4e90-bfa0-d33b502278b7\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.277965 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-internal-tls-certs\") pod \"8be50198-f9c5-4e90-bfa0-d33b502278b7\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.277991 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-scripts\") pod \"8be50198-f9c5-4e90-bfa0-d33b502278b7\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.278011 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-fernet-keys\") pod \"8be50198-f9c5-4e90-bfa0-d33b502278b7\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.278088 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-config-data\") pod \"8be50198-f9c5-4e90-bfa0-d33b502278b7\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.278120 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-public-tls-certs\") pod \"8be50198-f9c5-4e90-bfa0-d33b502278b7\" (UID: \"8be50198-f9c5-4e90-bfa0-d33b502278b7\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.290321 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 21:05:10 crc 
kubenswrapper[4747]: I1205 21:05:10.290370 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8be50198-f9c5-4e90-bfa0-d33b502278b7" (UID: "8be50198-f9c5-4e90-bfa0-d33b502278b7"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.292466 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.293920 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-scripts" (OuterVolumeSpecName: "scripts") pod "8be50198-f9c5-4e90-bfa0-d33b502278b7" (UID: "8be50198-f9c5-4e90-bfa0-d33b502278b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.294379 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8be50198-f9c5-4e90-bfa0-d33b502278b7" (UID: "8be50198-f9c5-4e90-bfa0-d33b502278b7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.298915 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be50198-f9c5-4e90-bfa0-d33b502278b7-kube-api-access-xkgnx" (OuterVolumeSpecName: "kube-api-access-xkgnx") pod "8be50198-f9c5-4e90-bfa0-d33b502278b7" (UID: "8be50198-f9c5-4e90-bfa0-d33b502278b7"). InnerVolumeSpecName "kube-api-access-xkgnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.326141 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8be50198-f9c5-4e90-bfa0-d33b502278b7" (UID: "8be50198-f9c5-4e90-bfa0-d33b502278b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.356407 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-config-data" (OuterVolumeSpecName: "config-data") pod "8be50198-f9c5-4e90-bfa0-d33b502278b7" (UID: "8be50198-f9c5-4e90-bfa0-d33b502278b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.387247 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-config-data\") pod \"1d350f3b-2497-4941-a006-84a503604020\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.387283 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-combined-ca-bundle\") pod \"1d350f3b-2497-4941-a006-84a503604020\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.387922 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d350f3b-2497-4941-a006-84a503604020-logs\") pod \"1d350f3b-2497-4941-a006-84a503604020\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.387947 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-config-data-custom\") pod \"1d350f3b-2497-4941-a006-84a503604020\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.388035 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bblx8\" (UniqueName: \"kubernetes.io/projected/1d350f3b-2497-4941-a006-84a503604020-kube-api-access-bblx8\") pod \"1d350f3b-2497-4941-a006-84a503604020\" (UID: \"1d350f3b-2497-4941-a006-84a503604020\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.388242 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.388257 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.388266 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkgnx\" (UniqueName: \"kubernetes.io/projected/8be50198-f9c5-4e90-bfa0-d33b502278b7-kube-api-access-xkgnx\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.388274 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.388282 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.388289 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.392101 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1d350f3b-2497-4941-a006-84a503604020-logs" (OuterVolumeSpecName: "logs") pod "1d350f3b-2497-4941-a006-84a503604020" (UID: "1d350f3b-2497-4941-a006-84a503604020"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.395442 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d350f3b-2497-4941-a006-84a503604020-kube-api-access-bblx8" (OuterVolumeSpecName: "kube-api-access-bblx8") pod "1d350f3b-2497-4941-a006-84a503604020" (UID: "1d350f3b-2497-4941-a006-84a503604020"). InnerVolumeSpecName "kube-api-access-bblx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.396526 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1d350f3b-2497-4941-a006-84a503604020" (UID: "1d350f3b-2497-4941-a006-84a503604020"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.403192 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8be50198-f9c5-4e90-bfa0-d33b502278b7" (UID: "8be50198-f9c5-4e90-bfa0-d33b502278b7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.421726 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d350f3b-2497-4941-a006-84a503604020" (UID: "1d350f3b-2497-4941-a006-84a503604020"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.439777 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8be50198-f9c5-4e90-bfa0-d33b502278b7" (UID: "8be50198-f9c5-4e90-bfa0-d33b502278b7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.463467 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-config-data" (OuterVolumeSpecName: "config-data") pod "1d350f3b-2497-4941-a006-84a503604020" (UID: "1d350f3b-2497-4941-a006-84a503604020"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.491599 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.491629 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bblx8\" (UniqueName: \"kubernetes.io/projected/1d350f3b-2497-4941-a006-84a503604020-kube-api-access-bblx8\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.491645 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.491656 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.491667 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d350f3b-2497-4941-a006-84a503604020-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.491677 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d350f3b-2497-4941-a006-84a503604020-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.491688 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be50198-f9c5-4e90-bfa0-d33b502278b7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.550366 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.554416 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.559648 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.592822 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/075d1135-1337-43d3-83e1-97b942a03786-etc-machine-id\") pod \"075d1135-1337-43d3-83e1-97b942a03786\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.593013 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/075d1135-1337-43d3-83e1-97b942a03786-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "075d1135-1337-43d3-83e1-97b942a03786" (UID: "075d1135-1337-43d3-83e1-97b942a03786"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.593657 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-combined-ca-bundle\") pod \"075d1135-1337-43d3-83e1-97b942a03786\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.593698 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlwsl\" (UniqueName: \"kubernetes.io/projected/0ae4d823-7941-4e9e-ae9d-cce0297e278d-kube-api-access-dlwsl\") pod \"0ae4d823-7941-4e9e-ae9d-cce0297e278d\" (UID: \"0ae4d823-7941-4e9e-ae9d-cce0297e278d\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.593732 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae4d823-7941-4e9e-ae9d-cce0297e278d-combined-ca-bundle\") pod \"0ae4d823-7941-4e9e-ae9d-cce0297e278d\" (UID: \"0ae4d823-7941-4e9e-ae9d-cce0297e278d\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.593760 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt2pr\" (UniqueName: \"kubernetes.io/projected/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-kube-api-access-nt2pr\") pod \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.593805 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-config-data\") pod \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.593832 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-config-data-custom\") pod \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.593860 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlgwj\" (UniqueName: \"kubernetes.io/projected/075d1135-1337-43d3-83e1-97b942a03786-kube-api-access-jlgwj\") pod \"075d1135-1337-43d3-83e1-97b942a03786\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.593884 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-logs\") pod \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.593935 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-config-data\") pod \"075d1135-1337-43d3-83e1-97b942a03786\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.593959 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-combined-ca-bundle\") pod 
\"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\" (UID: \"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.593986 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-config-data-custom\") pod \"075d1135-1337-43d3-83e1-97b942a03786\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.594007 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-scripts\") pod \"075d1135-1337-43d3-83e1-97b942a03786\" (UID: \"075d1135-1337-43d3-83e1-97b942a03786\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.594032 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae4d823-7941-4e9e-ae9d-cce0297e278d-config-data\") pod \"0ae4d823-7941-4e9e-ae9d-cce0297e278d\" (UID: \"0ae4d823-7941-4e9e-ae9d-cce0297e278d\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.596599 4747 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/075d1135-1337-43d3-83e1-97b942a03786-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.600228 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" (UID: "d6b073c9-c8ff-4de9-b2ee-605d37c94ffb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.603170 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-kube-api-access-nt2pr" (OuterVolumeSpecName: "kube-api-access-nt2pr") pod "d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" (UID: "d6b073c9-c8ff-4de9-b2ee-605d37c94ffb"). InnerVolumeSpecName "kube-api-access-nt2pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.603673 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-logs" (OuterVolumeSpecName: "logs") pod "d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" (UID: "d6b073c9-c8ff-4de9-b2ee-605d37c94ffb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.613271 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "075d1135-1337-43d3-83e1-97b942a03786" (UID: "075d1135-1337-43d3-83e1-97b942a03786"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.616073 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/075d1135-1337-43d3-83e1-97b942a03786-kube-api-access-jlgwj" (OuterVolumeSpecName: "kube-api-access-jlgwj") pod "075d1135-1337-43d3-83e1-97b942a03786" (UID: "075d1135-1337-43d3-83e1-97b942a03786"). 
InnerVolumeSpecName "kube-api-access-jlgwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.619320 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-scripts" (OuterVolumeSpecName: "scripts") pod "075d1135-1337-43d3-83e1-97b942a03786" (UID: "075d1135-1337-43d3-83e1-97b942a03786"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.619404 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ae4d823-7941-4e9e-ae9d-cce0297e278d-kube-api-access-dlwsl" (OuterVolumeSpecName: "kube-api-access-dlwsl") pod "0ae4d823-7941-4e9e-ae9d-cce0297e278d" (UID: "0ae4d823-7941-4e9e-ae9d-cce0297e278d"). InnerVolumeSpecName "kube-api-access-dlwsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.628709 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae4d823-7941-4e9e-ae9d-cce0297e278d-config-data" (OuterVolumeSpecName: "config-data") pod "0ae4d823-7941-4e9e-ae9d-cce0297e278d" (UID: "0ae4d823-7941-4e9e-ae9d-cce0297e278d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.629775 4747 generic.go:334] "Generic (PLEG): container finished" podID="1d350f3b-2497-4941-a006-84a503604020" containerID="0aaabbe10c0f61cef65c493fabec256d8c68d50a96544ca2c165bc1b095b1cad" exitCode=0 Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.629855 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" event={"ID":"1d350f3b-2497-4941-a006-84a503604020","Type":"ContainerDied","Data":"0aaabbe10c0f61cef65c493fabec256d8c68d50a96544ca2c165bc1b095b1cad"} Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.629892 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" event={"ID":"1d350f3b-2497-4941-a006-84a503604020","Type":"ContainerDied","Data":"562083f425c5155fb9a9e6c69bd422cff4b04b2901e08f1511753317903e550f"} Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.629935 4747 scope.go:117] "RemoveContainer" containerID="0aaabbe10c0f61cef65c493fabec256d8c68d50a96544ca2c165bc1b095b1cad" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.630066 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5c486969dd-92bj4" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.636051 4747 generic.go:334] "Generic (PLEG): container finished" podID="d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" containerID="0ee3d3f658fc75a010904f01ce363f8a9cb5878b73525714b1b5443773182ede" exitCode=0 Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.636123 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-c8ddf7b47-hbwzt" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.636474 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c8ddf7b47-hbwzt" event={"ID":"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb","Type":"ContainerDied","Data":"0ee3d3f658fc75a010904f01ce363f8a9cb5878b73525714b1b5443773182ede"} Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.636721 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c8ddf7b47-hbwzt" event={"ID":"d6b073c9-c8ff-4de9-b2ee-605d37c94ffb","Type":"ContainerDied","Data":"82a38a3e532801720fed70d38f5f7c1838153783bf5f0683b4cd4f511cebaabf"} Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.641063 4747 generic.go:334] "Generic (PLEG): container finished" podID="075d1135-1337-43d3-83e1-97b942a03786" containerID="f8bc67466c94b3c50d9c65f0be95a52aaf42b7cbc0b7be9125d7111ca8f375c9" exitCode=0 Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.641134 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"075d1135-1337-43d3-83e1-97b942a03786","Type":"ContainerDied","Data":"f8bc67466c94b3c50d9c65f0be95a52aaf42b7cbc0b7be9125d7111ca8f375c9"} Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.641162 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"075d1135-1337-43d3-83e1-97b942a03786","Type":"ContainerDied","Data":"a9582eb062301399d5db3aba360512c046df4a6048b500b273acf878fa66b80a"} Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.641215 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.645309 4747 generic.go:334] "Generic (PLEG): container finished" podID="8be50198-f9c5-4e90-bfa0-d33b502278b7" containerID="2331b1b6a8ba25cc12686ca0340fe77d732e59d87aa1e0ab064d2f3d4b62be8f" exitCode=0 Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.645410 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-79977d44cb-v5vtt" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.645425 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79977d44cb-v5vtt" event={"ID":"8be50198-f9c5-4e90-bfa0-d33b502278b7","Type":"ContainerDied","Data":"2331b1b6a8ba25cc12686ca0340fe77d732e59d87aa1e0ab064d2f3d4b62be8f"} Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.646014 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79977d44cb-v5vtt" event={"ID":"8be50198-f9c5-4e90-bfa0-d33b502278b7","Type":"ContainerDied","Data":"0a1d8bfdb33b3e004235e179e6fdb6dced1c5448f7a86096b352ca7094155c6a"} Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.663689 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ns2k6_c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4/ovn-controller/0.log" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.663731 4747 generic.go:334] "Generic (PLEG): container finished" podID="c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" containerID="0e6d2d436da5eb58728480f027b6f9382e5a4e2d75ab50a311d5115475bf0827" exitCode=139 Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.663811 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ns2k6" event={"ID":"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4","Type":"ContainerDied","Data":"0e6d2d436da5eb58728480f027b6f9382e5a4e2d75ab50a311d5115475bf0827"} Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.668378 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-config-data" (OuterVolumeSpecName: "config-data") pod "d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" (UID: "d6b073c9-c8ff-4de9-b2ee-605d37c94ffb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.669255 4747 generic.go:334] "Generic (PLEG): container finished" podID="0ae4d823-7941-4e9e-ae9d-cce0297e278d" containerID="558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2" exitCode=0 Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.669317 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.669353 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0ae4d823-7941-4e9e-ae9d-cce0297e278d","Type":"ContainerDied","Data":"558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2"} Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.669527 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0ae4d823-7941-4e9e-ae9d-cce0297e278d","Type":"ContainerDied","Data":"0febb20caca0ece73074d303d607f4bfb22e88ee8aac27d88d6f27b67f55a0c9"} Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.672351 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5c486969dd-92bj4"] Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.684567 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "075d1135-1337-43d3-83e1-97b942a03786" (UID: "075d1135-1337-43d3-83e1-97b942a03786"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.685004 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5c486969dd-92bj4"] Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.685943 4747 scope.go:117] "RemoveContainer" containerID="0bf651a4e254735c050f3d0a785a6647dbfeab955170d1824e08ea2a7638c8eb" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.701811 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.702131 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlwsl\" (UniqueName: \"kubernetes.io/projected/0ae4d823-7941-4e9e-ae9d-cce0297e278d-kube-api-access-dlwsl\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.702144 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt2pr\" (UniqueName: \"kubernetes.io/projected/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-kube-api-access-nt2pr\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.702153 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.702162 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.702170 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlgwj\" (UniqueName: \"kubernetes.io/projected/075d1135-1337-43d3-83e1-97b942a03786-kube-api-access-jlgwj\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.702178 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-logs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.702186 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.702194 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.702202 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae4d823-7941-4e9e-ae9d-cce0297e278d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.704414 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ns2k6_c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4/ovn-controller/0.log" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.704485 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ns2k6" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.712156 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae4d823-7941-4e9e-ae9d-cce0297e278d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ae4d823-7941-4e9e-ae9d-cce0297e278d" (UID: "0ae4d823-7941-4e9e-ae9d-cce0297e278d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.712268 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-79977d44cb-v5vtt"] Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.712908 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" (UID: "d6b073c9-c8ff-4de9-b2ee-605d37c94ffb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.720783 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-79977d44cb-v5vtt"] Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.730672 4747 scope.go:117] "RemoveContainer" containerID="0aaabbe10c0f61cef65c493fabec256d8c68d50a96544ca2c165bc1b095b1cad" Dec 05 21:05:10 crc kubenswrapper[4747]: E1205 21:05:10.731917 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aaabbe10c0f61cef65c493fabec256d8c68d50a96544ca2c165bc1b095b1cad\": container with ID starting with 0aaabbe10c0f61cef65c493fabec256d8c68d50a96544ca2c165bc1b095b1cad not found: ID does not exist" containerID="0aaabbe10c0f61cef65c493fabec256d8c68d50a96544ca2c165bc1b095b1cad" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.731951 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aaabbe10c0f61cef65c493fabec256d8c68d50a96544ca2c165bc1b095b1cad"} err="failed to get container status \"0aaabbe10c0f61cef65c493fabec256d8c68d50a96544ca2c165bc1b095b1cad\": rpc error: code = NotFound desc = could not find container \"0aaabbe10c0f61cef65c493fabec256d8c68d50a96544ca2c165bc1b095b1cad\": container with ID starting with 0aaabbe10c0f61cef65c493fabec256d8c68d50a96544ca2c165bc1b095b1cad not found: ID does not exist" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.731973 4747 scope.go:117] "RemoveContainer" containerID="0bf651a4e254735c050f3d0a785a6647dbfeab955170d1824e08ea2a7638c8eb" Dec 05 21:05:10 crc kubenswrapper[4747]: E1205 21:05:10.732489 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bf651a4e254735c050f3d0a785a6647dbfeab955170d1824e08ea2a7638c8eb\": container with ID starting with 0bf651a4e254735c050f3d0a785a6647dbfeab955170d1824e08ea2a7638c8eb not found: ID does not exist" containerID="0bf651a4e254735c050f3d0a785a6647dbfeab955170d1824e08ea2a7638c8eb" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.732510 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf651a4e254735c050f3d0a785a6647dbfeab955170d1824e08ea2a7638c8eb"} err="failed to get container status \"0bf651a4e254735c050f3d0a785a6647dbfeab955170d1824e08ea2a7638c8eb\": rpc error: code = NotFound desc = could not find 
container \"0bf651a4e254735c050f3d0a785a6647dbfeab955170d1824e08ea2a7638c8eb\": container with ID starting with 0bf651a4e254735c050f3d0a785a6647dbfeab955170d1824e08ea2a7638c8eb not found: ID does not exist" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.732547 4747 scope.go:117] "RemoveContainer" containerID="0ee3d3f658fc75a010904f01ce363f8a9cb5878b73525714b1b5443773182ede" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.745289 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-config-data" (OuterVolumeSpecName: "config-data") pod "075d1135-1337-43d3-83e1-97b942a03786" (UID: "075d1135-1337-43d3-83e1-97b942a03786"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.785046 4747 scope.go:117] "RemoveContainer" containerID="fc1b59248e10c987bf66294f5fd98a70e8177f8a073667376335d7e63c392cde" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.806208 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/075d1135-1337-43d3-83e1-97b942a03786-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.806254 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.806269 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae4d823-7941-4e9e-ae9d-cce0297e278d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.841698 4747 scope.go:117] "RemoveContainer" containerID="0ee3d3f658fc75a010904f01ce363f8a9cb5878b73525714b1b5443773182ede" Dec 05 21:05:10 crc kubenswrapper[4747]: E1205 21:05:10.842298 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ee3d3f658fc75a010904f01ce363f8a9cb5878b73525714b1b5443773182ede\": container with ID starting with 0ee3d3f658fc75a010904f01ce363f8a9cb5878b73525714b1b5443773182ede not found: ID does not exist" containerID="0ee3d3f658fc75a010904f01ce363f8a9cb5878b73525714b1b5443773182ede" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.842329 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ee3d3f658fc75a010904f01ce363f8a9cb5878b73525714b1b5443773182ede"} err="failed to get container status \"0ee3d3f658fc75a010904f01ce363f8a9cb5878b73525714b1b5443773182ede\": rpc error: code = NotFound desc = could not find container \"0ee3d3f658fc75a010904f01ce363f8a9cb5878b73525714b1b5443773182ede\": container with ID starting with 0ee3d3f658fc75a010904f01ce363f8a9cb5878b73525714b1b5443773182ede not found: ID does not exist" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.842349 4747 scope.go:117] "RemoveContainer" containerID="fc1b59248e10c987bf66294f5fd98a70e8177f8a073667376335d7e63c392cde" Dec 05 21:05:10 crc kubenswrapper[4747]: E1205 21:05:10.842756 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc1b59248e10c987bf66294f5fd98a70e8177f8a073667376335d7e63c392cde\": container with ID starting with 
fc1b59248e10c987bf66294f5fd98a70e8177f8a073667376335d7e63c392cde not found: ID does not exist" containerID="fc1b59248e10c987bf66294f5fd98a70e8177f8a073667376335d7e63c392cde" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.842782 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc1b59248e10c987bf66294f5fd98a70e8177f8a073667376335d7e63c392cde"} err="failed to get container status \"fc1b59248e10c987bf66294f5fd98a70e8177f8a073667376335d7e63c392cde\": rpc error: code = NotFound desc = could not find container \"fc1b59248e10c987bf66294f5fd98a70e8177f8a073667376335d7e63c392cde\": container with ID starting with fc1b59248e10c987bf66294f5fd98a70e8177f8a073667376335d7e63c392cde not found: ID does not exist" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.842798 4747 scope.go:117] "RemoveContainer" containerID="b7d08e884c92fb4e03551bbd9a3a91bac164d801a2b9b5ac7e804cc1a663aa21" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.907120 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-ovn-controller-tls-certs\") pod \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.907251 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-run-ovn\") pod \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.907305 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-combined-ca-bundle\") pod \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.907414 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-log-ovn\") pod \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.907459 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlhm8\" (UniqueName: \"kubernetes.io/projected/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-kube-api-access-nlhm8\") pod \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.907517 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-run\") pod \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.907682 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-scripts\") pod \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\" (UID: \"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4\") " Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.907935 4747 scope.go:117] "RemoveContainer" 
containerID="f8bc67466c94b3c50d9c65f0be95a52aaf42b7cbc0b7be9125d7111ca8f375c9" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.907969 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" (UID: "c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.907923 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-run" (OuterVolumeSpecName: "var-run") pod "c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" (UID: "c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.908786 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" (UID: "c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.908985 4747 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.909094 4747 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.910144 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-scripts" (OuterVolumeSpecName: "scripts") pod "c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" (UID: "c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.912271 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-kube-api-access-nlhm8" (OuterVolumeSpecName: "kube-api-access-nlhm8") pod "c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" (UID: "c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4"). InnerVolumeSpecName "kube-api-access-nlhm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.937486 4747 scope.go:117] "RemoveContainer" containerID="b7d08e884c92fb4e03551bbd9a3a91bac164d801a2b9b5ac7e804cc1a663aa21" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.938299 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" (UID: "c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:10 crc kubenswrapper[4747]: E1205 21:05:10.939045 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d08e884c92fb4e03551bbd9a3a91bac164d801a2b9b5ac7e804cc1a663aa21\": container with ID starting with b7d08e884c92fb4e03551bbd9a3a91bac164d801a2b9b5ac7e804cc1a663aa21 not found: ID does not exist" containerID="b7d08e884c92fb4e03551bbd9a3a91bac164d801a2b9b5ac7e804cc1a663aa21" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.939081 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d08e884c92fb4e03551bbd9a3a91bac164d801a2b9b5ac7e804cc1a663aa21"} err="failed to get container status \"b7d08e884c92fb4e03551bbd9a3a91bac164d801a2b9b5ac7e804cc1a663aa21\": rpc error: code = NotFound desc = could not find container \"b7d08e884c92fb4e03551bbd9a3a91bac164d801a2b9b5ac7e804cc1a663aa21\": container with ID starting with b7d08e884c92fb4e03551bbd9a3a91bac164d801a2b9b5ac7e804cc1a663aa21 not found: ID does not exist" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.939108 4747 scope.go:117] "RemoveContainer" containerID="f8bc67466c94b3c50d9c65f0be95a52aaf42b7cbc0b7be9125d7111ca8f375c9" Dec 05 21:05:10 crc kubenswrapper[4747]: E1205 21:05:10.939390 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8bc67466c94b3c50d9c65f0be95a52aaf42b7cbc0b7be9125d7111ca8f375c9\": container with ID starting with f8bc67466c94b3c50d9c65f0be95a52aaf42b7cbc0b7be9125d7111ca8f375c9 not found: ID does not exist" containerID="f8bc67466c94b3c50d9c65f0be95a52aaf42b7cbc0b7be9125d7111ca8f375c9" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.939421 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8bc67466c94b3c50d9c65f0be95a52aaf42b7cbc0b7be9125d7111ca8f375c9"} err="failed to get container status \"f8bc67466c94b3c50d9c65f0be95a52aaf42b7cbc0b7be9125d7111ca8f375c9\": rpc error: code = NotFound desc = could not find container \"f8bc67466c94b3c50d9c65f0be95a52aaf42b7cbc0b7be9125d7111ca8f375c9\": container with ID starting with f8bc67466c94b3c50d9c65f0be95a52aaf42b7cbc0b7be9125d7111ca8f375c9 not found: ID does not exist" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.939438 4747 scope.go:117] "RemoveContainer" containerID="2331b1b6a8ba25cc12686ca0340fe77d732e59d87aa1e0ab064d2f3d4b62be8f" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.975827 4747 scope.go:117] "RemoveContainer" containerID="2331b1b6a8ba25cc12686ca0340fe77d732e59d87aa1e0ab064d2f3d4b62be8f" Dec 05 21:05:10 crc kubenswrapper[4747]: E1205 21:05:10.978129 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2331b1b6a8ba25cc12686ca0340fe77d732e59d87aa1e0ab064d2f3d4b62be8f\": container with ID starting with 2331b1b6a8ba25cc12686ca0340fe77d732e59d87aa1e0ab064d2f3d4b62be8f not found: ID does not exist" containerID="2331b1b6a8ba25cc12686ca0340fe77d732e59d87aa1e0ab064d2f3d4b62be8f" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.978154 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2331b1b6a8ba25cc12686ca0340fe77d732e59d87aa1e0ab064d2f3d4b62be8f"} err="failed to get container status \"2331b1b6a8ba25cc12686ca0340fe77d732e59d87aa1e0ab064d2f3d4b62be8f\": rpc error: code = NotFound desc = could not 
find container \"2331b1b6a8ba25cc12686ca0340fe77d732e59d87aa1e0ab064d2f3d4b62be8f\": container with ID starting with 2331b1b6a8ba25cc12686ca0340fe77d732e59d87aa1e0ab064d2f3d4b62be8f not found: ID does not exist" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.978175 4747 scope.go:117] "RemoveContainer" containerID="558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2" Dec 05 21:05:10 crc kubenswrapper[4747]: I1205 21:05:10.981398 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="03e0ca3b-083d-477b-a227-dc70e5204444" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.105:11211: i/o timeout" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.002931 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-c8ddf7b47-hbwzt"] Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.010115 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.010149 4747 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.010159 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.010169 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlhm8\" (UniqueName: \"kubernetes.io/projected/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-kube-api-access-nlhm8\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.010622 4747 scope.go:117] "RemoveContainer" containerID="558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2" Dec 05 21:05:11 crc kubenswrapper[4747]: E1205 21:05:11.011111 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2\": container with ID starting with 558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2 not found: ID does not exist" containerID="558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.011142 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2"} err="failed to get container status \"558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2\": rpc error: code = NotFound desc = could not find container \"558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2\": container with ID starting with 558fcb091aed896686511ccd6d8571d39116782f4936587df2ab0278677ab6b2 not found: ID does not exist" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.017189 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-c8ddf7b47-hbwzt"] Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.040964 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.047116 4747 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" (UID: "c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.047172 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.053825 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.059484 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 21:05:11 crc kubenswrapper[4747]: E1205 21:05:11.111287 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.111818 4747 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:11 crc kubenswrapper[4747]: E1205 21:05:11.112183 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 21:05:11 crc kubenswrapper[4747]: E1205 21:05:11.112462 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 21:05:11 crc kubenswrapper[4747]: E1205 21:05:11.112500 4747 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-4d2dp" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovsdb-server" Dec 05 21:05:11 crc kubenswrapper[4747]: E1205 21:05:11.118787 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 21:05:11 crc kubenswrapper[4747]: E1205 21:05:11.120239 4747 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 21:05:11 crc kubenswrapper[4747]: E1205 21:05:11.123020 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 21:05:11 crc kubenswrapper[4747]: E1205 21:05:11.123091 4747 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-4d2dp" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovs-vswitchd" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.165304 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.313910 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9da10f9e-c9d3-4d5e-888a-774080f417f8-run-httpd\") pod \"9da10f9e-c9d3-4d5e-888a-774080f417f8\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.314039 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-combined-ca-bundle\") pod \"9da10f9e-c9d3-4d5e-888a-774080f417f8\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.314067 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-sg-core-conf-yaml\") pod \"9da10f9e-c9d3-4d5e-888a-774080f417f8\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.314132 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9da10f9e-c9d3-4d5e-888a-774080f417f8-log-httpd\") pod \"9da10f9e-c9d3-4d5e-888a-774080f417f8\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.314164 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-ceilometer-tls-certs\") pod \"9da10f9e-c9d3-4d5e-888a-774080f417f8\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.314189 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-scripts\") pod \"9da10f9e-c9d3-4d5e-888a-774080f417f8\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.314209 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-config-data\") pod \"9da10f9e-c9d3-4d5e-888a-774080f417f8\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.314235 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qklfk\" (UniqueName: \"kubernetes.io/projected/9da10f9e-c9d3-4d5e-888a-774080f417f8-kube-api-access-qklfk\") pod \"9da10f9e-c9d3-4d5e-888a-774080f417f8\" (UID: \"9da10f9e-c9d3-4d5e-888a-774080f417f8\") " Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.314319 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9da10f9e-c9d3-4d5e-888a-774080f417f8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9da10f9e-c9d3-4d5e-888a-774080f417f8" (UID: "9da10f9e-c9d3-4d5e-888a-774080f417f8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.314752 4747 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9da10f9e-c9d3-4d5e-888a-774080f417f8-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.314849 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9da10f9e-c9d3-4d5e-888a-774080f417f8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9da10f9e-c9d3-4d5e-888a-774080f417f8" (UID: "9da10f9e-c9d3-4d5e-888a-774080f417f8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.317554 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da10f9e-c9d3-4d5e-888a-774080f417f8-kube-api-access-qklfk" (OuterVolumeSpecName: "kube-api-access-qklfk") pod "9da10f9e-c9d3-4d5e-888a-774080f417f8" (UID: "9da10f9e-c9d3-4d5e-888a-774080f417f8"). InnerVolumeSpecName "kube-api-access-qklfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.317661 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-scripts" (OuterVolumeSpecName: "scripts") pod "9da10f9e-c9d3-4d5e-888a-774080f417f8" (UID: "9da10f9e-c9d3-4d5e-888a-774080f417f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.344802 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9da10f9e-c9d3-4d5e-888a-774080f417f8" (UID: "9da10f9e-c9d3-4d5e-888a-774080f417f8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.359298 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9da10f9e-c9d3-4d5e-888a-774080f417f8" (UID: "9da10f9e-c9d3-4d5e-888a-774080f417f8"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.379121 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9da10f9e-c9d3-4d5e-888a-774080f417f8" (UID: "9da10f9e-c9d3-4d5e-888a-774080f417f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.407599 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-config-data" (OuterVolumeSpecName: "config-data") pod "9da10f9e-c9d3-4d5e-888a-774080f417f8" (UID: "9da10f9e-c9d3-4d5e-888a-774080f417f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.418987 4747 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.419465 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.419481 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.419493 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qklfk\" (UniqueName: \"kubernetes.io/projected/9da10f9e-c9d3-4d5e-888a-774080f417f8-kube-api-access-qklfk\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.419502 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.419511 4747 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9da10f9e-c9d3-4d5e-888a-774080f417f8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.419562 4747 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9da10f9e-c9d3-4d5e-888a-774080f417f8-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.697041 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ns2k6_c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4/ovn-controller/0.log" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.697170 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ns2k6" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.697366 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ns2k6" event={"ID":"c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4","Type":"ContainerDied","Data":"d54e16885314b407a47de012ca616caef1edcd1f5b8b60a9f6e5eb197deb1d7a"} Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.697472 4747 scope.go:117] "RemoveContainer" containerID="0e6d2d436da5eb58728480f027b6f9382e5a4e2d75ab50a311d5115475bf0827" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.707893 4747 generic.go:334] "Generic (PLEG): container finished" podID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerID="39298752a8b51d1caae7019505b33e7f0b9b3bf5239de952b9fe17241b3c89ac" exitCode=0 Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.707960 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9da10f9e-c9d3-4d5e-888a-774080f417f8","Type":"ContainerDied","Data":"39298752a8b51d1caae7019505b33e7f0b9b3bf5239de952b9fe17241b3c89ac"} Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.707973 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.707985 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9da10f9e-c9d3-4d5e-888a-774080f417f8","Type":"ContainerDied","Data":"53a6a28aa82b52a7c750d642ef9a0bfb6b383e264098f48634cc0205322e678f"} Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.725068 4747 scope.go:117] "RemoveContainer" containerID="4b45c5494c9cb7801c132de4820fe650677d60312b4084726a8d83cfd974bcba" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.741705 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ns2k6"] Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.755830 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ns2k6"] Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.758897 4747 scope.go:117] "RemoveContainer" containerID="443c78cd3406277fd83faca12b46a2e8a254fa280e61add86bbe8dd8b77f93bc" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.761020 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.765880 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.773765 4747 scope.go:117] "RemoveContainer" containerID="39298752a8b51d1caae7019505b33e7f0b9b3bf5239de952b9fe17241b3c89ac" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.788716 4747 scope.go:117] "RemoveContainer" containerID="fc429219dd2c874c7566f484cc0af9643eb08ad7677e4bfb2ed48f5441db885b" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.806210 4747 scope.go:117] "RemoveContainer" containerID="4b45c5494c9cb7801c132de4820fe650677d60312b4084726a8d83cfd974bcba" Dec 05 21:05:11 crc kubenswrapper[4747]: E1205 21:05:11.806748 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b45c5494c9cb7801c132de4820fe650677d60312b4084726a8d83cfd974bcba\": container with ID starting with 4b45c5494c9cb7801c132de4820fe650677d60312b4084726a8d83cfd974bcba not found: ID does not exist" containerID="4b45c5494c9cb7801c132de4820fe650677d60312b4084726a8d83cfd974bcba" Dec 05 21:05:11 crc 
kubenswrapper[4747]: I1205 21:05:11.806791 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b45c5494c9cb7801c132de4820fe650677d60312b4084726a8d83cfd974bcba"} err="failed to get container status \"4b45c5494c9cb7801c132de4820fe650677d60312b4084726a8d83cfd974bcba\": rpc error: code = NotFound desc = could not find container \"4b45c5494c9cb7801c132de4820fe650677d60312b4084726a8d83cfd974bcba\": container with ID starting with 4b45c5494c9cb7801c132de4820fe650677d60312b4084726a8d83cfd974bcba not found: ID does not exist" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.806815 4747 scope.go:117] "RemoveContainer" containerID="443c78cd3406277fd83faca12b46a2e8a254fa280e61add86bbe8dd8b77f93bc" Dec 05 21:05:11 crc kubenswrapper[4747]: E1205 21:05:11.807061 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"443c78cd3406277fd83faca12b46a2e8a254fa280e61add86bbe8dd8b77f93bc\": container with ID starting with 443c78cd3406277fd83faca12b46a2e8a254fa280e61add86bbe8dd8b77f93bc not found: ID does not exist" containerID="443c78cd3406277fd83faca12b46a2e8a254fa280e61add86bbe8dd8b77f93bc" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.807095 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"443c78cd3406277fd83faca12b46a2e8a254fa280e61add86bbe8dd8b77f93bc"} err="failed to get container status \"443c78cd3406277fd83faca12b46a2e8a254fa280e61add86bbe8dd8b77f93bc\": rpc error: code = NotFound desc = could not find container \"443c78cd3406277fd83faca12b46a2e8a254fa280e61add86bbe8dd8b77f93bc\": container with ID starting with 443c78cd3406277fd83faca12b46a2e8a254fa280e61add86bbe8dd8b77f93bc not found: ID does not exist" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.807115 4747 scope.go:117] "RemoveContainer" containerID="39298752a8b51d1caae7019505b33e7f0b9b3bf5239de952b9fe17241b3c89ac" Dec 05 21:05:11 crc kubenswrapper[4747]: E1205 21:05:11.807574 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39298752a8b51d1caae7019505b33e7f0b9b3bf5239de952b9fe17241b3c89ac\": container with ID starting with 39298752a8b51d1caae7019505b33e7f0b9b3bf5239de952b9fe17241b3c89ac not found: ID does not exist" containerID="39298752a8b51d1caae7019505b33e7f0b9b3bf5239de952b9fe17241b3c89ac" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.807625 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39298752a8b51d1caae7019505b33e7f0b9b3bf5239de952b9fe17241b3c89ac"} err="failed to get container status \"39298752a8b51d1caae7019505b33e7f0b9b3bf5239de952b9fe17241b3c89ac\": rpc error: code = NotFound desc = could not find container \"39298752a8b51d1caae7019505b33e7f0b9b3bf5239de952b9fe17241b3c89ac\": container with ID starting with 39298752a8b51d1caae7019505b33e7f0b9b3bf5239de952b9fe17241b3c89ac not found: ID does not exist" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.807644 4747 scope.go:117] "RemoveContainer" containerID="fc429219dd2c874c7566f484cc0af9643eb08ad7677e4bfb2ed48f5441db885b" Dec 05 21:05:11 crc kubenswrapper[4747]: E1205 21:05:11.807898 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc429219dd2c874c7566f484cc0af9643eb08ad7677e4bfb2ed48f5441db885b\": container with ID starting with 
fc429219dd2c874c7566f484cc0af9643eb08ad7677e4bfb2ed48f5441db885b not found: ID does not exist" containerID="fc429219dd2c874c7566f484cc0af9643eb08ad7677e4bfb2ed48f5441db885b" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.807927 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc429219dd2c874c7566f484cc0af9643eb08ad7677e4bfb2ed48f5441db885b"} err="failed to get container status \"fc429219dd2c874c7566f484cc0af9643eb08ad7677e4bfb2ed48f5441db885b\": rpc error: code = NotFound desc = could not find container \"fc429219dd2c874c7566f484cc0af9643eb08ad7677e4bfb2ed48f5441db885b\": container with ID starting with fc429219dd2c874c7566f484cc0af9643eb08ad7677e4bfb2ed48f5441db885b not found: ID does not exist" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.859100 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="075d1135-1337-43d3-83e1-97b942a03786" path="/var/lib/kubelet/pods/075d1135-1337-43d3-83e1-97b942a03786/volumes" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.859705 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ae4d823-7941-4e9e-ae9d-cce0297e278d" path="/var/lib/kubelet/pods/0ae4d823-7941-4e9e-ae9d-cce0297e278d/volumes" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.860222 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d350f3b-2497-4941-a006-84a503604020" path="/var/lib/kubelet/pods/1d350f3b-2497-4941-a006-84a503604020/volumes" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.861484 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70db507e-cc84-4722-8ac8-fd659c2803b8" path="/var/lib/kubelet/pods/70db507e-cc84-4722-8ac8-fd659c2803b8/volumes" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.862028 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8be50198-f9c5-4e90-bfa0-d33b502278b7" path="/var/lib/kubelet/pods/8be50198-f9c5-4e90-bfa0-d33b502278b7/volumes" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.862485 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" path="/var/lib/kubelet/pods/9da10f9e-c9d3-4d5e-888a-774080f417f8/volumes" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.863528 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" path="/var/lib/kubelet/pods/c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4/volumes" Dec 05 21:05:11 crc kubenswrapper[4747]: I1205 21:05:11.864047 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" path="/var/lib/kubelet/pods/d6b073c9-c8ff-4de9-b2ee-605d37c94ffb/volumes" Dec 05 21:05:12 crc kubenswrapper[4747]: E1205 21:05:12.843510 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:12 crc kubenswrapper[4747]: E1205 21:05:12.843642 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts podName:bcb08da7-0b5d-437e-8882-4d352f3c2d47 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:20.843616853 +0000 UTC m=+1391.310924381 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts") pod "novaapi48b7-account-delete-25wjk" (UID: "bcb08da7-0b5d-437e-8882-4d352f3c2d47") : configmap "openstack-scripts" not found Dec 05 21:05:12 crc kubenswrapper[4747]: E1205 21:05:12.944787 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:12 crc kubenswrapper[4747]: E1205 21:05:12.944878 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts podName:6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a nodeName:}" failed. No retries permitted until 2025-12-05 21:05:20.944858206 +0000 UTC m=+1391.412165694 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts") pod "neutron7680-account-delete-f82hm" (UID: "6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a") : configmap "openstack-scripts" not found Dec 05 21:05:12 crc kubenswrapper[4747]: E1205 21:05:12.944934 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:12 crc kubenswrapper[4747]: E1205 21:05:12.945006 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts podName:51241ef0-cffe-40ba-bea9-1c01b7cc4184 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:20.944984219 +0000 UTC m=+1391.412291807 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts") pod "novacell033f5-account-delete-dk9n6" (UID: "51241ef0-cffe-40ba-bea9-1c01b7cc4184") : configmap "openstack-scripts" not found Dec 05 21:05:13 crc kubenswrapper[4747]: E1205 21:05:13.046921 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:13 crc kubenswrapper[4747]: E1205 21:05:13.047005 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts podName:67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:21.04698389 +0000 UTC m=+1391.514291378 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts") pod "placement8da4-account-delete-8nv8m" (UID: "67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0") : configmap "openstack-scripts" not found Dec 05 21:05:13 crc kubenswrapper[4747]: E1205 21:05:13.645712 4747 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/2f396b1afd35a1fd6063c6636593d3635148050293d5486f88446615c7cde329/diff" to get inode usage: stat /var/lib/containers/storage/overlay/2f396b1afd35a1fd6063c6636593d3635148050293d5486f88446615c7cde329/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_nova-cell0-conductor-0_0ae4d823-7941-4e9e-ae9d-cce0297e278d/nova-cell0-conductor-conductor/0.log" to get inode usage: stat /var/log/pods/openstack_nova-cell0-conductor-0_0ae4d823-7941-4e9e-ae9d-cce0297e278d/nova-cell0-conductor-conductor/0.log: no such file or directory Dec 05 21:05:16 crc kubenswrapper[4747]: E1205 21:05:16.110981 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 21:05:16 crc kubenswrapper[4747]: E1205 21:05:16.111813 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 21:05:16 crc kubenswrapper[4747]: E1205 21:05:16.112175 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 21:05:16 crc kubenswrapper[4747]: E1205 21:05:16.112206 4747 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-4d2dp" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovsdb-server" Dec 05 21:05:16 crc kubenswrapper[4747]: E1205 21:05:16.125078 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 21:05:16 crc kubenswrapper[4747]: E1205 21:05:16.127690 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , 
exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 21:05:16 crc kubenswrapper[4747]: E1205 21:05:16.137681 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 21:05:16 crc kubenswrapper[4747]: E1205 21:05:16.137771 4747 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-4d2dp" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovs-vswitchd" Dec 05 21:05:19 crc kubenswrapper[4747]: I1205 21:05:19.803943 4747 generic.go:334] "Generic (PLEG): container finished" podID="8ded45d9-c7e1-4429-982b-4f7c10e43473" containerID="5ff576908ae6828159afb6725bcd8f5262dbf45a7c1d981f6bf781737d654133" exitCode=0 Dec 05 21:05:19 crc kubenswrapper[4747]: I1205 21:05:19.803981 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-855575d477-mmmm7" event={"ID":"8ded45d9-c7e1-4429-982b-4f7c10e43473","Type":"ContainerDied","Data":"5ff576908ae6828159afb6725bcd8f5262dbf45a7c1d981f6bf781737d654133"} Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.105110 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.258183 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-public-tls-certs\") pod \"8ded45d9-c7e1-4429-982b-4f7c10e43473\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.258279 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbvq5\" (UniqueName: \"kubernetes.io/projected/8ded45d9-c7e1-4429-982b-4f7c10e43473-kube-api-access-gbvq5\") pod \"8ded45d9-c7e1-4429-982b-4f7c10e43473\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.258357 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-combined-ca-bundle\") pod \"8ded45d9-c7e1-4429-982b-4f7c10e43473\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.258391 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-ovndb-tls-certs\") pod \"8ded45d9-c7e1-4429-982b-4f7c10e43473\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.258408 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-internal-tls-certs\") pod \"8ded45d9-c7e1-4429-982b-4f7c10e43473\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.258455 4747 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-config\") pod \"8ded45d9-c7e1-4429-982b-4f7c10e43473\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.258479 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-httpd-config\") pod \"8ded45d9-c7e1-4429-982b-4f7c10e43473\" (UID: \"8ded45d9-c7e1-4429-982b-4f7c10e43473\") " Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.270720 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8ded45d9-c7e1-4429-982b-4f7c10e43473" (UID: "8ded45d9-c7e1-4429-982b-4f7c10e43473"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.271238 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ded45d9-c7e1-4429-982b-4f7c10e43473-kube-api-access-gbvq5" (OuterVolumeSpecName: "kube-api-access-gbvq5") pod "8ded45d9-c7e1-4429-982b-4f7c10e43473" (UID: "8ded45d9-c7e1-4429-982b-4f7c10e43473"). InnerVolumeSpecName "kube-api-access-gbvq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.320038 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8ded45d9-c7e1-4429-982b-4f7c10e43473" (UID: "8ded45d9-c7e1-4429-982b-4f7c10e43473"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.336067 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ded45d9-c7e1-4429-982b-4f7c10e43473" (UID: "8ded45d9-c7e1-4429-982b-4f7c10e43473"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.340564 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-config" (OuterVolumeSpecName: "config") pod "8ded45d9-c7e1-4429-982b-4f7c10e43473" (UID: "8ded45d9-c7e1-4429-982b-4f7c10e43473"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.344263 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8ded45d9-c7e1-4429-982b-4f7c10e43473" (UID: "8ded45d9-c7e1-4429-982b-4f7c10e43473"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.349219 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8ded45d9-c7e1-4429-982b-4f7c10e43473" (UID: "8ded45d9-c7e1-4429-982b-4f7c10e43473"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.360426 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbvq5\" (UniqueName: \"kubernetes.io/projected/8ded45d9-c7e1-4429-982b-4f7c10e43473-kube-api-access-gbvq5\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.360453 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.360462 4747 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.360471 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.360480 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.360489 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.360499 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ded45d9-c7e1-4429-982b-4f7c10e43473-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.833101 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-855575d477-mmmm7" event={"ID":"8ded45d9-c7e1-4429-982b-4f7c10e43473","Type":"ContainerDied","Data":"b290fe5e98fd92428554809605fd233802c8cb8813e0e46ca4e34212ca87b615"} Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.834297 4747 scope.go:117] "RemoveContainer" containerID="0828c77a89d68fa7ed5a9d98a5ec301c313e92a36be991d033d8a0ce476a2939" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.833286 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-855575d477-mmmm7" Dec 05 21:05:20 crc kubenswrapper[4747]: E1205 21:05:20.867929 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:20 crc kubenswrapper[4747]: E1205 21:05:20.868805 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts podName:bcb08da7-0b5d-437e-8882-4d352f3c2d47 nodeName:}" failed. 
No retries permitted until 2025-12-05 21:05:36.868783504 +0000 UTC m=+1407.336091012 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts") pod "novaapi48b7-account-delete-25wjk" (UID: "bcb08da7-0b5d-437e-8882-4d352f3c2d47") : configmap "openstack-scripts" not found Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.871190 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-855575d477-mmmm7"] Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.876042 4747 scope.go:117] "RemoveContainer" containerID="5ff576908ae6828159afb6725bcd8f5262dbf45a7c1d981f6bf781737d654133" Dec 05 21:05:20 crc kubenswrapper[4747]: I1205 21:05:20.876408 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-855575d477-mmmm7"] Dec 05 21:05:20 crc kubenswrapper[4747]: E1205 21:05:20.968814 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:20 crc kubenswrapper[4747]: E1205 21:05:20.968899 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts podName:6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a nodeName:}" failed. No retries permitted until 2025-12-05 21:05:36.968882238 +0000 UTC m=+1407.436189726 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts") pod "neutron7680-account-delete-f82hm" (UID: "6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a") : configmap "openstack-scripts" not found Dec 05 21:05:20 crc kubenswrapper[4747]: E1205 21:05:20.968825 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:20 crc kubenswrapper[4747]: E1205 21:05:20.968979 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts podName:51241ef0-cffe-40ba-bea9-1c01b7cc4184 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:36.96896818 +0000 UTC m=+1407.436275668 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts") pod "novacell033f5-account-delete-dk9n6" (UID: "51241ef0-cffe-40ba-bea9-1c01b7cc4184") : configmap "openstack-scripts" not found Dec 05 21:05:21 crc kubenswrapper[4747]: E1205 21:05:21.070447 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:21 crc kubenswrapper[4747]: E1205 21:05:21.070518 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts podName:67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0 nodeName:}" failed. No retries permitted until 2025-12-05 21:05:37.070502081 +0000 UTC m=+1407.537809579 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts") pod "placement8da4-account-delete-8nv8m" (UID: "67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0") : configmap "openstack-scripts" not found Dec 05 21:05:21 crc kubenswrapper[4747]: E1205 21:05:21.111013 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 21:05:21 crc kubenswrapper[4747]: E1205 21:05:21.112018 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 21:05:21 crc kubenswrapper[4747]: E1205 21:05:21.112426 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 21:05:21 crc kubenswrapper[4747]: E1205 21:05:21.112508 4747 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-4d2dp" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovsdb-server" Dec 05 21:05:21 crc kubenswrapper[4747]: E1205 21:05:21.113336 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 21:05:21 crc kubenswrapper[4747]: E1205 21:05:21.115210 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 21:05:21 crc kubenswrapper[4747]: E1205 21:05:21.117310 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 21:05:21 crc kubenswrapper[4747]: E1205 21:05:21.117523 4747 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-4d2dp" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovs-vswitchd" Dec 05 21:05:21 crc kubenswrapper[4747]: I1205 21:05:21.849993 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ded45d9-c7e1-4429-982b-4f7c10e43473" path="/var/lib/kubelet/pods/8ded45d9-c7e1-4429-982b-4f7c10e43473/volumes" Dec 05 21:05:26 crc kubenswrapper[4747]: E1205 21:05:26.111011 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 21:05:26 crc kubenswrapper[4747]: E1205 21:05:26.115418 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 21:05:26 crc kubenswrapper[4747]: E1205 21:05:26.115504 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 21:05:26 crc kubenswrapper[4747]: E1205 21:05:26.118088 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 21:05:26 crc kubenswrapper[4747]: E1205 21:05:26.118163 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 21:05:26 crc kubenswrapper[4747]: E1205 21:05:26.118204 4747 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-4d2dp" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovsdb-server" Dec 05 21:05:26 crc kubenswrapper[4747]: E1205 21:05:26.120660 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 21:05:26 crc 
kubenswrapper[4747]: E1205 21:05:26.120722 4747 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-4d2dp" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovs-vswitchd" Dec 05 21:05:31 crc kubenswrapper[4747]: E1205 21:05:31.113944 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 21:05:31 crc kubenswrapper[4747]: E1205 21:05:31.115218 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 21:05:31 crc kubenswrapper[4747]: E1205 21:05:31.115765 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 21:05:31 crc kubenswrapper[4747]: E1205 21:05:31.115836 4747 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-4d2dp" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovsdb-server" Dec 05 21:05:31 crc kubenswrapper[4747]: E1205 21:05:31.115917 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 21:05:31 crc kubenswrapper[4747]: E1205 21:05:31.121790 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 21:05:31 crc kubenswrapper[4747]: E1205 21:05:31.124313 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 21:05:31 crc kubenswrapper[4747]: E1205 21:05:31.124376 4747 prober.go:104] "Probe errored" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-4d2dp" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovs-vswitchd" Dec 05 21:05:31 crc kubenswrapper[4747]: E1205 21:05:31.419120 4747 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/a1e298eaf1917bd0645cb66236a223bba64729c7713e2398702d5649d1553ebe/diff" to get inode usage: stat /var/lib/containers/storage/overlay/a1e298eaf1917bd0645cb66236a223bba64729c7713e2398702d5649d1553ebe/diff: no such file or directory, extraDiskErr: Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.814417 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.944169 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift\") pod \"5f06375c-a008-4d1f-b2ae-516549bcd438\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.944348 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsrqr\" (UniqueName: \"kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-kube-api-access-bsrqr\") pod \"5f06375c-a008-4d1f-b2ae-516549bcd438\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.944437 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5f06375c-a008-4d1f-b2ae-516549bcd438-cache\") pod \"5f06375c-a008-4d1f-b2ae-516549bcd438\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.944496 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5f06375c-a008-4d1f-b2ae-516549bcd438-lock\") pod \"5f06375c-a008-4d1f-b2ae-516549bcd438\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.944547 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"5f06375c-a008-4d1f-b2ae-516549bcd438\" (UID: \"5f06375c-a008-4d1f-b2ae-516549bcd438\") " Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.945103 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f06375c-a008-4d1f-b2ae-516549bcd438-lock" (OuterVolumeSpecName: "lock") pod "5f06375c-a008-4d1f-b2ae-516549bcd438" (UID: "5f06375c-a008-4d1f-b2ae-516549bcd438"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.945156 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f06375c-a008-4d1f-b2ae-516549bcd438-cache" (OuterVolumeSpecName: "cache") pod "5f06375c-a008-4d1f-b2ae-516549bcd438" (UID: "5f06375c-a008-4d1f-b2ae-516549bcd438"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.950955 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5f06375c-a008-4d1f-b2ae-516549bcd438" (UID: "5f06375c-a008-4d1f-b2ae-516549bcd438"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.951069 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "swift") pod "5f06375c-a008-4d1f-b2ae-516549bcd438" (UID: "5f06375c-a008-4d1f-b2ae-516549bcd438"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.951778 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-kube-api-access-bsrqr" (OuterVolumeSpecName: "kube-api-access-bsrqr") pod "5f06375c-a008-4d1f-b2ae-516549bcd438" (UID: "5f06375c-a008-4d1f-b2ae-516549bcd438"). InnerVolumeSpecName "kube-api-access-bsrqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.952101 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4d2dp_b37900c0-058e-4be6-9b81-c67d5f05b7a5/ovs-vswitchd/0.log" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.952971 4747 generic.go:334] "Generic (PLEG): container finished" podID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" exitCode=137 Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.953037 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4d2dp" event={"ID":"b37900c0-058e-4be6-9b81-c67d5f05b7a5","Type":"ContainerDied","Data":"ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8"} Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.961529 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerID="486c4e14776dcac9c407e97103f33e69775d09dadf5c06010aefbcfb88eedabf" exitCode=137 Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.961611 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"486c4e14776dcac9c407e97103f33e69775d09dadf5c06010aefbcfb88eedabf"} Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.961657 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5f06375c-a008-4d1f-b2ae-516549bcd438","Type":"ContainerDied","Data":"d5868a55bac4c9ebdd99869e4b36d69926980b1cb161056b6e40c4973fc1bb3e"} Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.961680 4747 scope.go:117] "RemoveContainer" containerID="486c4e14776dcac9c407e97103f33e69775d09dadf5c06010aefbcfb88eedabf" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:32.961740 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.050220 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.050245 4747 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.050255 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsrqr\" (UniqueName: \"kubernetes.io/projected/5f06375c-a008-4d1f-b2ae-516549bcd438-kube-api-access-bsrqr\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.050264 4747 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5f06375c-a008-4d1f-b2ae-516549bcd438-cache\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.050273 4747 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5f06375c-a008-4d1f-b2ae-516549bcd438-lock\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.061658 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.066032 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.066394 4747 scope.go:117] "RemoveContainer" containerID="b38b92392e194449e12d772885bfbe0631d28151913d40ee252a46ebfcac23a9" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.069916 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.092214 4747 scope.go:117] "RemoveContainer" containerID="0da635b7389cbe492ef8068ecb92ce39082021af205f7581dc37d0416b62d81d" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.112123 4747 scope.go:117] "RemoveContainer" containerID="2ed7bcbfbf04d5a9964de2fcbd02ac674b695f20ec08f744e393a02dbb2319a1" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.133654 4747 scope.go:117] "RemoveContainer" containerID="c58946c3b64e9f0075c91ec12003bcf8230437fdc679cb562ed1bcfde6656f17" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.151870 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.157142 4747 scope.go:117] "RemoveContainer" containerID="8df9c3dc8ff772526cc2df3f4cf0b95ab3c61db6ed0714a109c7c21fc522f7ca" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.177833 4747 scope.go:117] "RemoveContainer" containerID="5e1fecd6cc1598b9e147bbd014d5d212dbafa891089d5772bdb6d3d1d8de6c53" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.195670 4747 scope.go:117] "RemoveContainer" containerID="9d583a1e65e3cc8bb6a32ad2fd1c4bf96b0bd47ed6506d82064a31bc2655c7b0" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.215377 4747 scope.go:117] "RemoveContainer" 
containerID="50395557a436afbba90e0822ef360ef212729acd9b1d1016e656ed537b583273" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.233183 4747 scope.go:117] "RemoveContainer" containerID="d7cf0106124fcf9a17a54c633a63891b2f79f3e809e2de080af793585eed32c2" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.249109 4747 scope.go:117] "RemoveContainer" containerID="a9b9bd53a229a7fb0fd49713a723ca5f20b761cdb3553b0052642a04bcd05a2b" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.266194 4747 scope.go:117] "RemoveContainer" containerID="e06576b6cbbf857bd731def8848ccf981cbbccf87721760f51553e2b820943bf" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.280856 4747 scope.go:117] "RemoveContainer" containerID="b1cda5f48990643299a63d3ae2da74d21f7ef5b4082ffa29af733c5ee782a4e4" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.306601 4747 scope.go:117] "RemoveContainer" containerID="a606768279b9932799614e2c0267efce661411007308c7fc421cef56fc7ab98d" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.325312 4747 scope.go:117] "RemoveContainer" containerID="dc4ee1b50dd1745ed998bec26be98d427e85f0f14c52da0a5a910ef4a7470be0" Dec 05 21:05:33 crc kubenswrapper[4747]: E1205 21:05:33.332605 4747 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/387d33bb77115e465ca6043e75a8288fc91549b0e9b4d17b873941d66a217eed/diff" to get inode usage: stat /var/lib/containers/storage/overlay/387d33bb77115e465ca6043e75a8288fc91549b0e9b4d17b873941d66a217eed/diff: no such file or directory, extraDiskErr: Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.347654 4747 scope.go:117] "RemoveContainer" containerID="486c4e14776dcac9c407e97103f33e69775d09dadf5c06010aefbcfb88eedabf" Dec 05 21:05:33 crc kubenswrapper[4747]: E1205 21:05:33.348044 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"486c4e14776dcac9c407e97103f33e69775d09dadf5c06010aefbcfb88eedabf\": container with ID starting with 486c4e14776dcac9c407e97103f33e69775d09dadf5c06010aefbcfb88eedabf not found: ID does not exist" containerID="486c4e14776dcac9c407e97103f33e69775d09dadf5c06010aefbcfb88eedabf" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.348094 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"486c4e14776dcac9c407e97103f33e69775d09dadf5c06010aefbcfb88eedabf"} err="failed to get container status \"486c4e14776dcac9c407e97103f33e69775d09dadf5c06010aefbcfb88eedabf\": rpc error: code = NotFound desc = could not find container \"486c4e14776dcac9c407e97103f33e69775d09dadf5c06010aefbcfb88eedabf\": container with ID starting with 486c4e14776dcac9c407e97103f33e69775d09dadf5c06010aefbcfb88eedabf not found: ID does not exist" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.348129 4747 scope.go:117] "RemoveContainer" containerID="b38b92392e194449e12d772885bfbe0631d28151913d40ee252a46ebfcac23a9" Dec 05 21:05:33 crc kubenswrapper[4747]: E1205 21:05:33.348476 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b38b92392e194449e12d772885bfbe0631d28151913d40ee252a46ebfcac23a9\": container with ID starting with b38b92392e194449e12d772885bfbe0631d28151913d40ee252a46ebfcac23a9 not found: ID does not exist" containerID="b38b92392e194449e12d772885bfbe0631d28151913d40ee252a46ebfcac23a9" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.348508 4747 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b38b92392e194449e12d772885bfbe0631d28151913d40ee252a46ebfcac23a9"} err="failed to get container status \"b38b92392e194449e12d772885bfbe0631d28151913d40ee252a46ebfcac23a9\": rpc error: code = NotFound desc = could not find container \"b38b92392e194449e12d772885bfbe0631d28151913d40ee252a46ebfcac23a9\": container with ID starting with b38b92392e194449e12d772885bfbe0631d28151913d40ee252a46ebfcac23a9 not found: ID does not exist" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.348527 4747 scope.go:117] "RemoveContainer" containerID="0da635b7389cbe492ef8068ecb92ce39082021af205f7581dc37d0416b62d81d" Dec 05 21:05:33 crc kubenswrapper[4747]: E1205 21:05:33.348905 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da635b7389cbe492ef8068ecb92ce39082021af205f7581dc37d0416b62d81d\": container with ID starting with 0da635b7389cbe492ef8068ecb92ce39082021af205f7581dc37d0416b62d81d not found: ID does not exist" containerID="0da635b7389cbe492ef8068ecb92ce39082021af205f7581dc37d0416b62d81d" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.348935 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da635b7389cbe492ef8068ecb92ce39082021af205f7581dc37d0416b62d81d"} err="failed to get container status \"0da635b7389cbe492ef8068ecb92ce39082021af205f7581dc37d0416b62d81d\": rpc error: code = NotFound desc = could not find container \"0da635b7389cbe492ef8068ecb92ce39082021af205f7581dc37d0416b62d81d\": container with ID starting with 0da635b7389cbe492ef8068ecb92ce39082021af205f7581dc37d0416b62d81d not found: ID does not exist" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.348952 4747 scope.go:117] "RemoveContainer" containerID="2ed7bcbfbf04d5a9964de2fcbd02ac674b695f20ec08f744e393a02dbb2319a1" Dec 05 21:05:33 crc kubenswrapper[4747]: E1205 21:05:33.349311 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ed7bcbfbf04d5a9964de2fcbd02ac674b695f20ec08f744e393a02dbb2319a1\": container with ID starting with 2ed7bcbfbf04d5a9964de2fcbd02ac674b695f20ec08f744e393a02dbb2319a1 not found: ID does not exist" containerID="2ed7bcbfbf04d5a9964de2fcbd02ac674b695f20ec08f744e393a02dbb2319a1" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.349343 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ed7bcbfbf04d5a9964de2fcbd02ac674b695f20ec08f744e393a02dbb2319a1"} err="failed to get container status \"2ed7bcbfbf04d5a9964de2fcbd02ac674b695f20ec08f744e393a02dbb2319a1\": rpc error: code = NotFound desc = could not find container \"2ed7bcbfbf04d5a9964de2fcbd02ac674b695f20ec08f744e393a02dbb2319a1\": container with ID starting with 2ed7bcbfbf04d5a9964de2fcbd02ac674b695f20ec08f744e393a02dbb2319a1 not found: ID does not exist" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.349360 4747 scope.go:117] "RemoveContainer" containerID="c58946c3b64e9f0075c91ec12003bcf8230437fdc679cb562ed1bcfde6656f17" Dec 05 21:05:33 crc kubenswrapper[4747]: E1205 21:05:33.349633 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c58946c3b64e9f0075c91ec12003bcf8230437fdc679cb562ed1bcfde6656f17\": container with ID starting with c58946c3b64e9f0075c91ec12003bcf8230437fdc679cb562ed1bcfde6656f17 not found: ID does 
not exist" containerID="c58946c3b64e9f0075c91ec12003bcf8230437fdc679cb562ed1bcfde6656f17" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.349663 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c58946c3b64e9f0075c91ec12003bcf8230437fdc679cb562ed1bcfde6656f17"} err="failed to get container status \"c58946c3b64e9f0075c91ec12003bcf8230437fdc679cb562ed1bcfde6656f17\": rpc error: code = NotFound desc = could not find container \"c58946c3b64e9f0075c91ec12003bcf8230437fdc679cb562ed1bcfde6656f17\": container with ID starting with c58946c3b64e9f0075c91ec12003bcf8230437fdc679cb562ed1bcfde6656f17 not found: ID does not exist" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.349680 4747 scope.go:117] "RemoveContainer" containerID="8df9c3dc8ff772526cc2df3f4cf0b95ab3c61db6ed0714a109c7c21fc522f7ca" Dec 05 21:05:33 crc kubenswrapper[4747]: E1205 21:05:33.350014 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8df9c3dc8ff772526cc2df3f4cf0b95ab3c61db6ed0714a109c7c21fc522f7ca\": container with ID starting with 8df9c3dc8ff772526cc2df3f4cf0b95ab3c61db6ed0714a109c7c21fc522f7ca not found: ID does not exist" containerID="8df9c3dc8ff772526cc2df3f4cf0b95ab3c61db6ed0714a109c7c21fc522f7ca" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.350045 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8df9c3dc8ff772526cc2df3f4cf0b95ab3c61db6ed0714a109c7c21fc522f7ca"} err="failed to get container status \"8df9c3dc8ff772526cc2df3f4cf0b95ab3c61db6ed0714a109c7c21fc522f7ca\": rpc error: code = NotFound desc = could not find container \"8df9c3dc8ff772526cc2df3f4cf0b95ab3c61db6ed0714a109c7c21fc522f7ca\": container with ID starting with 8df9c3dc8ff772526cc2df3f4cf0b95ab3c61db6ed0714a109c7c21fc522f7ca not found: ID does not exist" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.350066 4747 scope.go:117] "RemoveContainer" containerID="5e1fecd6cc1598b9e147bbd014d5d212dbafa891089d5772bdb6d3d1d8de6c53" Dec 05 21:05:33 crc kubenswrapper[4747]: E1205 21:05:33.350338 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e1fecd6cc1598b9e147bbd014d5d212dbafa891089d5772bdb6d3d1d8de6c53\": container with ID starting with 5e1fecd6cc1598b9e147bbd014d5d212dbafa891089d5772bdb6d3d1d8de6c53 not found: ID does not exist" containerID="5e1fecd6cc1598b9e147bbd014d5d212dbafa891089d5772bdb6d3d1d8de6c53" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.350363 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e1fecd6cc1598b9e147bbd014d5d212dbafa891089d5772bdb6d3d1d8de6c53"} err="failed to get container status \"5e1fecd6cc1598b9e147bbd014d5d212dbafa891089d5772bdb6d3d1d8de6c53\": rpc error: code = NotFound desc = could not find container \"5e1fecd6cc1598b9e147bbd014d5d212dbafa891089d5772bdb6d3d1d8de6c53\": container with ID starting with 5e1fecd6cc1598b9e147bbd014d5d212dbafa891089d5772bdb6d3d1d8de6c53 not found: ID does not exist" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.350377 4747 scope.go:117] "RemoveContainer" containerID="9d583a1e65e3cc8bb6a32ad2fd1c4bf96b0bd47ed6506d82064a31bc2655c7b0" Dec 05 21:05:33 crc kubenswrapper[4747]: E1205 21:05:33.350723 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9d583a1e65e3cc8bb6a32ad2fd1c4bf96b0bd47ed6506d82064a31bc2655c7b0\": container with ID starting with 9d583a1e65e3cc8bb6a32ad2fd1c4bf96b0bd47ed6506d82064a31bc2655c7b0 not found: ID does not exist" containerID="9d583a1e65e3cc8bb6a32ad2fd1c4bf96b0bd47ed6506d82064a31bc2655c7b0" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.350756 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d583a1e65e3cc8bb6a32ad2fd1c4bf96b0bd47ed6506d82064a31bc2655c7b0"} err="failed to get container status \"9d583a1e65e3cc8bb6a32ad2fd1c4bf96b0bd47ed6506d82064a31bc2655c7b0\": rpc error: code = NotFound desc = could not find container \"9d583a1e65e3cc8bb6a32ad2fd1c4bf96b0bd47ed6506d82064a31bc2655c7b0\": container with ID starting with 9d583a1e65e3cc8bb6a32ad2fd1c4bf96b0bd47ed6506d82064a31bc2655c7b0 not found: ID does not exist" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.350779 4747 scope.go:117] "RemoveContainer" containerID="50395557a436afbba90e0822ef360ef212729acd9b1d1016e656ed537b583273" Dec 05 21:05:33 crc kubenswrapper[4747]: E1205 21:05:33.351094 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50395557a436afbba90e0822ef360ef212729acd9b1d1016e656ed537b583273\": container with ID starting with 50395557a436afbba90e0822ef360ef212729acd9b1d1016e656ed537b583273 not found: ID does not exist" containerID="50395557a436afbba90e0822ef360ef212729acd9b1d1016e656ed537b583273" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.351122 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50395557a436afbba90e0822ef360ef212729acd9b1d1016e656ed537b583273"} err="failed to get container status \"50395557a436afbba90e0822ef360ef212729acd9b1d1016e656ed537b583273\": rpc error: code = NotFound desc = could not find container \"50395557a436afbba90e0822ef360ef212729acd9b1d1016e656ed537b583273\": container with ID starting with 50395557a436afbba90e0822ef360ef212729acd9b1d1016e656ed537b583273 not found: ID does not exist" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.351136 4747 scope.go:117] "RemoveContainer" containerID="d7cf0106124fcf9a17a54c633a63891b2f79f3e809e2de080af793585eed32c2" Dec 05 21:05:33 crc kubenswrapper[4747]: E1205 21:05:33.351397 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7cf0106124fcf9a17a54c633a63891b2f79f3e809e2de080af793585eed32c2\": container with ID starting with d7cf0106124fcf9a17a54c633a63891b2f79f3e809e2de080af793585eed32c2 not found: ID does not exist" containerID="d7cf0106124fcf9a17a54c633a63891b2f79f3e809e2de080af793585eed32c2" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.351422 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7cf0106124fcf9a17a54c633a63891b2f79f3e809e2de080af793585eed32c2"} err="failed to get container status \"d7cf0106124fcf9a17a54c633a63891b2f79f3e809e2de080af793585eed32c2\": rpc error: code = NotFound desc = could not find container \"d7cf0106124fcf9a17a54c633a63891b2f79f3e809e2de080af793585eed32c2\": container with ID starting with d7cf0106124fcf9a17a54c633a63891b2f79f3e809e2de080af793585eed32c2 not found: ID does not exist" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.351433 4747 scope.go:117] "RemoveContainer" containerID="a9b9bd53a229a7fb0fd49713a723ca5f20b761cdb3553b0052642a04bcd05a2b" Dec 05 21:05:33 crc 
kubenswrapper[4747]: E1205 21:05:33.351731 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9b9bd53a229a7fb0fd49713a723ca5f20b761cdb3553b0052642a04bcd05a2b\": container with ID starting with a9b9bd53a229a7fb0fd49713a723ca5f20b761cdb3553b0052642a04bcd05a2b not found: ID does not exist" containerID="a9b9bd53a229a7fb0fd49713a723ca5f20b761cdb3553b0052642a04bcd05a2b" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.351761 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9b9bd53a229a7fb0fd49713a723ca5f20b761cdb3553b0052642a04bcd05a2b"} err="failed to get container status \"a9b9bd53a229a7fb0fd49713a723ca5f20b761cdb3553b0052642a04bcd05a2b\": rpc error: code = NotFound desc = could not find container \"a9b9bd53a229a7fb0fd49713a723ca5f20b761cdb3553b0052642a04bcd05a2b\": container with ID starting with a9b9bd53a229a7fb0fd49713a723ca5f20b761cdb3553b0052642a04bcd05a2b not found: ID does not exist" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.351777 4747 scope.go:117] "RemoveContainer" containerID="e06576b6cbbf857bd731def8848ccf981cbbccf87721760f51553e2b820943bf" Dec 05 21:05:33 crc kubenswrapper[4747]: E1205 21:05:33.352016 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e06576b6cbbf857bd731def8848ccf981cbbccf87721760f51553e2b820943bf\": container with ID starting with e06576b6cbbf857bd731def8848ccf981cbbccf87721760f51553e2b820943bf not found: ID does not exist" containerID="e06576b6cbbf857bd731def8848ccf981cbbccf87721760f51553e2b820943bf" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.352040 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e06576b6cbbf857bd731def8848ccf981cbbccf87721760f51553e2b820943bf"} err="failed to get container status \"e06576b6cbbf857bd731def8848ccf981cbbccf87721760f51553e2b820943bf\": rpc error: code = NotFound desc = could not find container \"e06576b6cbbf857bd731def8848ccf981cbbccf87721760f51553e2b820943bf\": container with ID starting with e06576b6cbbf857bd731def8848ccf981cbbccf87721760f51553e2b820943bf not found: ID does not exist" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.352056 4747 scope.go:117] "RemoveContainer" containerID="b1cda5f48990643299a63d3ae2da74d21f7ef5b4082ffa29af733c5ee782a4e4" Dec 05 21:05:33 crc kubenswrapper[4747]: E1205 21:05:33.352309 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1cda5f48990643299a63d3ae2da74d21f7ef5b4082ffa29af733c5ee782a4e4\": container with ID starting with b1cda5f48990643299a63d3ae2da74d21f7ef5b4082ffa29af733c5ee782a4e4 not found: ID does not exist" containerID="b1cda5f48990643299a63d3ae2da74d21f7ef5b4082ffa29af733c5ee782a4e4" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.352335 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1cda5f48990643299a63d3ae2da74d21f7ef5b4082ffa29af733c5ee782a4e4"} err="failed to get container status \"b1cda5f48990643299a63d3ae2da74d21f7ef5b4082ffa29af733c5ee782a4e4\": rpc error: code = NotFound desc = could not find container \"b1cda5f48990643299a63d3ae2da74d21f7ef5b4082ffa29af733c5ee782a4e4\": container with ID starting with b1cda5f48990643299a63d3ae2da74d21f7ef5b4082ffa29af733c5ee782a4e4 not found: ID does not exist" Dec 05 21:05:33 crc kubenswrapper[4747]: 
I1205 21:05:33.352357 4747 scope.go:117] "RemoveContainer" containerID="a606768279b9932799614e2c0267efce661411007308c7fc421cef56fc7ab98d" Dec 05 21:05:33 crc kubenswrapper[4747]: E1205 21:05:33.352660 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a606768279b9932799614e2c0267efce661411007308c7fc421cef56fc7ab98d\": container with ID starting with a606768279b9932799614e2c0267efce661411007308c7fc421cef56fc7ab98d not found: ID does not exist" containerID="a606768279b9932799614e2c0267efce661411007308c7fc421cef56fc7ab98d" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.352681 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a606768279b9932799614e2c0267efce661411007308c7fc421cef56fc7ab98d"} err="failed to get container status \"a606768279b9932799614e2c0267efce661411007308c7fc421cef56fc7ab98d\": rpc error: code = NotFound desc = could not find container \"a606768279b9932799614e2c0267efce661411007308c7fc421cef56fc7ab98d\": container with ID starting with a606768279b9932799614e2c0267efce661411007308c7fc421cef56fc7ab98d not found: ID does not exist" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.352693 4747 scope.go:117] "RemoveContainer" containerID="dc4ee1b50dd1745ed998bec26be98d427e85f0f14c52da0a5a910ef4a7470be0" Dec 05 21:05:33 crc kubenswrapper[4747]: E1205 21:05:33.352929 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc4ee1b50dd1745ed998bec26be98d427e85f0f14c52da0a5a910ef4a7470be0\": container with ID starting with dc4ee1b50dd1745ed998bec26be98d427e85f0f14c52da0a5a910ef4a7470be0 not found: ID does not exist" containerID="dc4ee1b50dd1745ed998bec26be98d427e85f0f14c52da0a5a910ef4a7470be0" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.352951 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc4ee1b50dd1745ed998bec26be98d427e85f0f14c52da0a5a910ef4a7470be0"} err="failed to get container status \"dc4ee1b50dd1745ed998bec26be98d427e85f0f14c52da0a5a910ef4a7470be0\": rpc error: code = NotFound desc = could not find container \"dc4ee1b50dd1745ed998bec26be98d427e85f0f14c52da0a5a910ef4a7470be0\": container with ID starting with dc4ee1b50dd1745ed998bec26be98d427e85f0f14c52da0a5a910ef4a7470be0 not found: ID does not exist" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.664236 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4d2dp_b37900c0-058e-4be6-9b81-c67d5f05b7a5/ovs-vswitchd/0.log" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.664990 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.759276 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-log\") pod \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.759371 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b37900c0-058e-4be6-9b81-c67d5f05b7a5-scripts\") pod \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.759401 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-run\") pod \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.759450 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-lib\") pod \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.759455 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-log" (OuterVolumeSpecName: "var-log") pod "b37900c0-058e-4be6-9b81-c67d5f05b7a5" (UID: "b37900c0-058e-4be6-9b81-c67d5f05b7a5"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.759526 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-run" (OuterVolumeSpecName: "var-run") pod "b37900c0-058e-4be6-9b81-c67d5f05b7a5" (UID: "b37900c0-058e-4be6-9b81-c67d5f05b7a5"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.759602 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-lib" (OuterVolumeSpecName: "var-lib") pod "b37900c0-058e-4be6-9b81-c67d5f05b7a5" (UID: "b37900c0-058e-4be6-9b81-c67d5f05b7a5"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.759618 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s9tt\" (UniqueName: \"kubernetes.io/projected/b37900c0-058e-4be6-9b81-c67d5f05b7a5-kube-api-access-6s9tt\") pod \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.759637 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-etc-ovs\") pod \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\" (UID: \"b37900c0-058e-4be6-9b81-c67d5f05b7a5\") " Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.759741 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "b37900c0-058e-4be6-9b81-c67d5f05b7a5" (UID: "b37900c0-058e-4be6-9b81-c67d5f05b7a5"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.760013 4747 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-log\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.760025 4747 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.760034 4747 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-var-lib\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.760042 4747 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b37900c0-058e-4be6-9b81-c67d5f05b7a5-etc-ovs\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.761184 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b37900c0-058e-4be6-9b81-c67d5f05b7a5-scripts" (OuterVolumeSpecName: "scripts") pod "b37900c0-058e-4be6-9b81-c67d5f05b7a5" (UID: "b37900c0-058e-4be6-9b81-c67d5f05b7a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.764775 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b37900c0-058e-4be6-9b81-c67d5f05b7a5-kube-api-access-6s9tt" (OuterVolumeSpecName: "kube-api-access-6s9tt") pod "b37900c0-058e-4be6-9b81-c67d5f05b7a5" (UID: "b37900c0-058e-4be6-9b81-c67d5f05b7a5"). InnerVolumeSpecName "kube-api-access-6s9tt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.851763 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" path="/var/lib/kubelet/pods/5f06375c-a008-4d1f-b2ae-516549bcd438/volumes" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.862194 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s9tt\" (UniqueName: \"kubernetes.io/projected/b37900c0-058e-4be6-9b81-c67d5f05b7a5-kube-api-access-6s9tt\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.862254 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b37900c0-058e-4be6-9b81-c67d5f05b7a5-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.975174 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4d2dp_b37900c0-058e-4be6-9b81-c67d5f05b7a5/ovs-vswitchd/0.log" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.976003 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4d2dp" event={"ID":"b37900c0-058e-4be6-9b81-c67d5f05b7a5","Type":"ContainerDied","Data":"f900224ff5a45a57a4549589617ac9b7019147b59989ea6afbc7ae5b729b3a2c"} Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.976051 4747 scope.go:117] "RemoveContainer" containerID="ac0d8f5b1cd85d94cd2934081191ea53871a8c301b38ae7bb226400d19f6b1b8" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.976054 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4d2dp" Dec 05 21:05:33 crc kubenswrapper[4747]: I1205 21:05:33.999810 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-4d2dp"] Dec 05 21:05:34 crc kubenswrapper[4747]: I1205 21:05:34.007457 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-4d2dp"] Dec 05 21:05:34 crc kubenswrapper[4747]: I1205 21:05:34.012606 4747 scope.go:117] "RemoveContainer" containerID="5c04287ef474ad9f5e2b1e91a670d3e5159d17410f006c6a67f9839a282f51f6" Dec 05 21:05:34 crc kubenswrapper[4747]: I1205 21:05:34.039981 4747 scope.go:117] "RemoveContainer" containerID="a49cc62b0001176f106cecb1bb2d14cea929220be3afa1a039e1dab6d767683e" Dec 05 21:05:34 crc kubenswrapper[4747]: I1205 21:05:34.103675 4747 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod8bc768ee-26e3-43fb-b64d-7fc3f3c627c5"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod8bc768ee-26e3-43fb-b64d-7fc3f3c627c5] : Timed out while waiting for systemd to remove kubepods-besteffort-pod8bc768ee_26e3_43fb_b64d_7fc3f3c627c5.slice" Dec 05 21:05:34 crc kubenswrapper[4747]: I1205 21:05:34.141029 4747 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod9cc30174-a5ca-454e-b049-59a62d358d8a"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod9cc30174-a5ca-454e-b049-59a62d358d8a] : Timed out while waiting for systemd to remove kubepods-besteffort-pod9cc30174_a5ca_454e_b049_59a62d358d8a.slice" Dec 05 21:05:34 crc kubenswrapper[4747]: E1205 21:05:34.141107 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod9cc30174-a5ca-454e-b049-59a62d358d8a] : unable to destroy cgroup paths for cgroup 
[kubepods besteffort pod9cc30174-a5ca-454e-b049-59a62d358d8a] : Timed out while waiting for systemd to remove kubepods-besteffort-pod9cc30174_a5ca_454e_b049_59a62d358d8a.slice" pod="openstack/ovsdbserver-sb-0" podUID="9cc30174-a5ca-454e-b049-59a62d358d8a" Dec 05 21:05:34 crc kubenswrapper[4747]: I1205 21:05:34.995806 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 21:05:35 crc kubenswrapper[4747]: I1205 21:05:35.060743 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 21:05:35 crc kubenswrapper[4747]: I1205 21:05:35.072355 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 21:05:35 crc kubenswrapper[4747]: I1205 21:05:35.862143 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cc30174-a5ca-454e-b049-59a62d358d8a" path="/var/lib/kubelet/pods/9cc30174-a5ca-454e-b049-59a62d358d8a/volumes" Dec 05 21:05:35 crc kubenswrapper[4747]: I1205 21:05:35.864486 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" path="/var/lib/kubelet/pods/b37900c0-058e-4be6-9b81-c67d5f05b7a5/volumes" Dec 05 21:05:36 crc kubenswrapper[4747]: E1205 21:05:36.912358 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:36 crc kubenswrapper[4747]: E1205 21:05:36.912425 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts podName:bcb08da7-0b5d-437e-8882-4d352f3c2d47 nodeName:}" failed. No retries permitted until 2025-12-05 21:06:08.91240803 +0000 UTC m=+1439.379715518 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts") pod "novaapi48b7-account-delete-25wjk" (UID: "bcb08da7-0b5d-437e-8882-4d352f3c2d47") : configmap "openstack-scripts" not found Dec 05 21:05:37 crc kubenswrapper[4747]: E1205 21:05:37.013890 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:37 crc kubenswrapper[4747]: E1205 21:05:37.013998 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:37 crc kubenswrapper[4747]: E1205 21:05:37.014535 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts podName:6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a nodeName:}" failed. No retries permitted until 2025-12-05 21:06:09.014459142 +0000 UTC m=+1439.481766630 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts") pod "neutron7680-account-delete-f82hm" (UID: "6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a") : configmap "openstack-scripts" not found Dec 05 21:05:37 crc kubenswrapper[4747]: E1205 21:05:37.014554 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts podName:51241ef0-cffe-40ba-bea9-1c01b7cc4184 nodeName:}" failed. No retries permitted until 2025-12-05 21:06:09.014544054 +0000 UTC m=+1439.481851542 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts") pod "novacell033f5-account-delete-dk9n6" (UID: "51241ef0-cffe-40ba-bea9-1c01b7cc4184") : configmap "openstack-scripts" not found Dec 05 21:05:37 crc kubenswrapper[4747]: E1205 21:05:37.115679 4747 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 21:05:37 crc kubenswrapper[4747]: E1205 21:05:37.115762 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts podName:67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0 nodeName:}" failed. No retries permitted until 2025-12-05 21:06:09.115745045 +0000 UTC m=+1439.583052533 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts") pod "placement8da4-account-delete-8nv8m" (UID: "67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0") : configmap "openstack-scripts" not found Dec 05 21:05:37 crc kubenswrapper[4747]: E1205 21:05:37.796210 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/system-systemd\\\\x2dcoredump.slice/systemd-coredump@0-70912-0.service\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51241ef0_cffe_40ba_bea9_1c01b7cc4184.slice/crio-conmon-d2565391e687a35ba3735f19333fba1cc605e1fa9021ef94582bdc7581618a29.scope\": RecentStats: unable to find data in memory cache]" Dec 05 21:05:37 crc kubenswrapper[4747]: I1205 21:05:37.851895 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell033f5-account-delete-dk9n6" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.029335 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2pwv\" (UniqueName: \"kubernetes.io/projected/51241ef0-cffe-40ba-bea9-1c01b7cc4184-kube-api-access-c2pwv\") pod \"51241ef0-cffe-40ba-bea9-1c01b7cc4184\" (UID: \"51241ef0-cffe-40ba-bea9-1c01b7cc4184\") " Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.029859 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts\") pod \"51241ef0-cffe-40ba-bea9-1c01b7cc4184\" (UID: \"51241ef0-cffe-40ba-bea9-1c01b7cc4184\") " Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.031085 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51241ef0-cffe-40ba-bea9-1c01b7cc4184" (UID: "51241ef0-cffe-40ba-bea9-1c01b7cc4184"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.033931 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51241ef0-cffe-40ba-bea9-1c01b7cc4184-kube-api-access-c2pwv" (OuterVolumeSpecName: "kube-api-access-c2pwv") pod "51241ef0-cffe-40ba-bea9-1c01b7cc4184" (UID: "51241ef0-cffe-40ba-bea9-1c01b7cc4184"). InnerVolumeSpecName "kube-api-access-c2pwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.040717 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron7680-account-delete-f82hm" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.045858 4747 generic.go:334] "Generic (PLEG): container finished" podID="6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a" containerID="cafef2d2e736d3f1b68d75130f00554b04edc034c17ce240b5d3cb8bd345520d" exitCode=137 Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.045889 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron7680-account-delete-f82hm" event={"ID":"6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a","Type":"ContainerDied","Data":"cafef2d2e736d3f1b68d75130f00554b04edc034c17ce240b5d3cb8bd345520d"} Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.045924 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron7680-account-delete-f82hm" event={"ID":"6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a","Type":"ContainerDied","Data":"fc7c312259b26c8c83bf7ae119aa88af38f2954e57c3a4d553391b924f68bc6d"} Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.045946 4747 scope.go:117] "RemoveContainer" containerID="cafef2d2e736d3f1b68d75130f00554b04edc034c17ce240b5d3cb8bd345520d" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.045926 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron7680-account-delete-f82hm" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.048560 4747 generic.go:334] "Generic (PLEG): container finished" podID="67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0" containerID="bc88be3138ad806a1fad25c0703151fdaa867a84929b19340a4803a430b8c8b6" exitCode=137 Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.048662 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi48b7-account-delete-25wjk" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.048662 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement8da4-account-delete-8nv8m" event={"ID":"67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0","Type":"ContainerDied","Data":"bc88be3138ad806a1fad25c0703151fdaa867a84929b19340a4803a430b8c8b6"} Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.048721 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement8da4-account-delete-8nv8m" event={"ID":"67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0","Type":"ContainerDied","Data":"29e991cbd1540f9f03f9daca034528d86992204b17c2accb6807748e5e7a771b"} Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.048738 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29e991cbd1540f9f03f9daca034528d86992204b17c2accb6807748e5e7a771b" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.051937 4747 generic.go:334] "Generic (PLEG): container finished" podID="51241ef0-cffe-40ba-bea9-1c01b7cc4184" containerID="d2565391e687a35ba3735f19333fba1cc605e1fa9021ef94582bdc7581618a29" exitCode=137 Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.052009 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell033f5-account-delete-dk9n6" event={"ID":"51241ef0-cffe-40ba-bea9-1c01b7cc4184","Type":"ContainerDied","Data":"d2565391e687a35ba3735f19333fba1cc605e1fa9021ef94582bdc7581618a29"} Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.052038 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell033f5-account-delete-dk9n6" event={"ID":"51241ef0-cffe-40ba-bea9-1c01b7cc4184","Type":"ContainerDied","Data":"a01582f4fdb4884d9667fdc3e8297aca14e43053509280408fb83e29abd93aed"} Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.052087 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell033f5-account-delete-dk9n6" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.061258 4747 generic.go:334] "Generic (PLEG): container finished" podID="bcb08da7-0b5d-437e-8882-4d352f3c2d47" containerID="b0466ca59fc3ca16b0b944d473918bba58affb8a8c5a9027752be5d97b5355b9" exitCode=137 Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.061304 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi48b7-account-delete-25wjk" event={"ID":"bcb08da7-0b5d-437e-8882-4d352f3c2d47","Type":"ContainerDied","Data":"b0466ca59fc3ca16b0b944d473918bba58affb8a8c5a9027752be5d97b5355b9"} Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.061358 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi48b7-account-delete-25wjk" event={"ID":"bcb08da7-0b5d-437e-8882-4d352f3c2d47","Type":"ContainerDied","Data":"0ef88f0d042cdeeced868c538d16dd641308c3e7831857605543318252f517de"} Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.061427 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi48b7-account-delete-25wjk" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.065805 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement8da4-account-delete-8nv8m" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.073501 4747 scope.go:117] "RemoveContainer" containerID="cafef2d2e736d3f1b68d75130f00554b04edc034c17ce240b5d3cb8bd345520d" Dec 05 21:05:38 crc kubenswrapper[4747]: E1205 21:05:38.073962 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cafef2d2e736d3f1b68d75130f00554b04edc034c17ce240b5d3cb8bd345520d\": container with ID starting with cafef2d2e736d3f1b68d75130f00554b04edc034c17ce240b5d3cb8bd345520d not found: ID does not exist" containerID="cafef2d2e736d3f1b68d75130f00554b04edc034c17ce240b5d3cb8bd345520d" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.074002 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cafef2d2e736d3f1b68d75130f00554b04edc034c17ce240b5d3cb8bd345520d"} err="failed to get container status \"cafef2d2e736d3f1b68d75130f00554b04edc034c17ce240b5d3cb8bd345520d\": rpc error: code = NotFound desc = could not find container \"cafef2d2e736d3f1b68d75130f00554b04edc034c17ce240b5d3cb8bd345520d\": container with ID starting with cafef2d2e736d3f1b68d75130f00554b04edc034c17ce240b5d3cb8bd345520d not found: ID does not exist" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.074029 4747 scope.go:117] "RemoveContainer" containerID="d2565391e687a35ba3735f19333fba1cc605e1fa9021ef94582bdc7581618a29" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.105637 4747 scope.go:117] "RemoveContainer" containerID="d2565391e687a35ba3735f19333fba1cc605e1fa9021ef94582bdc7581618a29" Dec 05 21:05:38 crc kubenswrapper[4747]: E1205 21:05:38.106718 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2565391e687a35ba3735f19333fba1cc605e1fa9021ef94582bdc7581618a29\": container with ID starting with d2565391e687a35ba3735f19333fba1cc605e1fa9021ef94582bdc7581618a29 not found: ID does not exist" containerID="d2565391e687a35ba3735f19333fba1cc605e1fa9021ef94582bdc7581618a29" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.106811 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2565391e687a35ba3735f19333fba1cc605e1fa9021ef94582bdc7581618a29"} err="failed to get container status \"d2565391e687a35ba3735f19333fba1cc605e1fa9021ef94582bdc7581618a29\": rpc error: code = NotFound desc = could not find container \"d2565391e687a35ba3735f19333fba1cc605e1fa9021ef94582bdc7581618a29\": container with ID starting with d2565391e687a35ba3735f19333fba1cc605e1fa9021ef94582bdc7581618a29 not found: ID does not exist" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.106832 4747 scope.go:117] "RemoveContainer" containerID="b0466ca59fc3ca16b0b944d473918bba58affb8a8c5a9027752be5d97b5355b9" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.110418 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell033f5-account-delete-dk9n6"] Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.116759 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell033f5-account-delete-dk9n6"] Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.126543 4747 scope.go:117] "RemoveContainer" containerID="b0466ca59fc3ca16b0b944d473918bba58affb8a8c5a9027752be5d97b5355b9" Dec 05 21:05:38 crc kubenswrapper[4747]: E1205 21:05:38.126989 4747 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"b0466ca59fc3ca16b0b944d473918bba58affb8a8c5a9027752be5d97b5355b9\": container with ID starting with b0466ca59fc3ca16b0b944d473918bba58affb8a8c5a9027752be5d97b5355b9 not found: ID does not exist" containerID="b0466ca59fc3ca16b0b944d473918bba58affb8a8c5a9027752be5d97b5355b9" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.127025 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0466ca59fc3ca16b0b944d473918bba58affb8a8c5a9027752be5d97b5355b9"} err="failed to get container status \"b0466ca59fc3ca16b0b944d473918bba58affb8a8c5a9027752be5d97b5355b9\": rpc error: code = NotFound desc = could not find container \"b0466ca59fc3ca16b0b944d473918bba58affb8a8c5a9027752be5d97b5355b9\": container with ID starting with b0466ca59fc3ca16b0b944d473918bba58affb8a8c5a9027752be5d97b5355b9 not found: ID does not exist" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.131836 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2pwv\" (UniqueName: \"kubernetes.io/projected/51241ef0-cffe-40ba-bea9-1c01b7cc4184-kube-api-access-c2pwv\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.131858 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51241ef0-cffe-40ba-bea9-1c01b7cc4184-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.232941 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts\") pod \"bcb08da7-0b5d-437e-8882-4d352f3c2d47\" (UID: \"bcb08da7-0b5d-437e-8882-4d352f3c2d47\") " Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.233036 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47sc2\" (UniqueName: \"kubernetes.io/projected/bcb08da7-0b5d-437e-8882-4d352f3c2d47-kube-api-access-47sc2\") pod \"bcb08da7-0b5d-437e-8882-4d352f3c2d47\" (UID: \"bcb08da7-0b5d-437e-8882-4d352f3c2d47\") " Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.233088 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts\") pod \"67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0\" (UID: \"67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0\") " Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.233135 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts\") pod \"6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a\" (UID: \"6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a\") " Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.233159 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpq9n\" (UniqueName: \"kubernetes.io/projected/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-kube-api-access-vpq9n\") pod \"67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0\" (UID: \"67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0\") " Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.233183 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl7cl\" (UniqueName: 
\"kubernetes.io/projected/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-kube-api-access-dl7cl\") pod \"6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a\" (UID: \"6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a\") " Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.233484 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bcb08da7-0b5d-437e-8882-4d352f3c2d47" (UID: "bcb08da7-0b5d-437e-8882-4d352f3c2d47"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.233888 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a" (UID: "6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.233945 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0" (UID: "67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.235598 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb08da7-0b5d-437e-8882-4d352f3c2d47-kube-api-access-47sc2" (OuterVolumeSpecName: "kube-api-access-47sc2") pod "bcb08da7-0b5d-437e-8882-4d352f3c2d47" (UID: "bcb08da7-0b5d-437e-8882-4d352f3c2d47"). InnerVolumeSpecName "kube-api-access-47sc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.235817 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-kube-api-access-dl7cl" (OuterVolumeSpecName: "kube-api-access-dl7cl") pod "6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a" (UID: "6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a"). InnerVolumeSpecName "kube-api-access-dl7cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.236558 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-kube-api-access-vpq9n" (OuterVolumeSpecName: "kube-api-access-vpq9n") pod "67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0" (UID: "67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0"). InnerVolumeSpecName "kube-api-access-vpq9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.335183 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcb08da7-0b5d-437e-8882-4d352f3c2d47-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.335243 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47sc2\" (UniqueName: \"kubernetes.io/projected/bcb08da7-0b5d-437e-8882-4d352f3c2d47-kube-api-access-47sc2\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.335267 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.335284 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.335310 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpq9n\" (UniqueName: \"kubernetes.io/projected/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0-kube-api-access-vpq9n\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.335338 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl7cl\" (UniqueName: \"kubernetes.io/projected/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a-kube-api-access-dl7cl\") on node \"crc\" DevicePath \"\"" Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.397934 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron7680-account-delete-f82hm"] Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.405391 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron7680-account-delete-f82hm"] Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.416901 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi48b7-account-delete-25wjk"] Dec 05 21:05:38 crc kubenswrapper[4747]: I1205 21:05:38.421308 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi48b7-account-delete-25wjk"] Dec 05 21:05:39 crc kubenswrapper[4747]: I1205 21:05:39.078050 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement8da4-account-delete-8nv8m" Dec 05 21:05:39 crc kubenswrapper[4747]: I1205 21:05:39.131069 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement8da4-account-delete-8nv8m"] Dec 05 21:05:39 crc kubenswrapper[4747]: I1205 21:05:39.140929 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement8da4-account-delete-8nv8m"] Dec 05 21:05:39 crc kubenswrapper[4747]: I1205 21:05:39.850352 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51241ef0-cffe-40ba-bea9-1c01b7cc4184" path="/var/lib/kubelet/pods/51241ef0-cffe-40ba-bea9-1c01b7cc4184/volumes" Dec 05 21:05:39 crc kubenswrapper[4747]: I1205 21:05:39.851093 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0" path="/var/lib/kubelet/pods/67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0/volumes" Dec 05 21:05:39 crc kubenswrapper[4747]: I1205 21:05:39.852152 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a" path="/var/lib/kubelet/pods/6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a/volumes" Dec 05 21:05:39 crc kubenswrapper[4747]: I1205 21:05:39.852845 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcb08da7-0b5d-437e-8882-4d352f3c2d47" path="/var/lib/kubelet/pods/bcb08da7-0b5d-437e-8882-4d352f3c2d47/volumes" Dec 05 21:05:48 crc kubenswrapper[4747]: E1205 21:05:48.037823 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/system-systemd\\\\x2dcoredump.slice/systemd-coredump@0-70912-0.service\": RecentStats: unable to find data in memory cache]" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.269473 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-56z55"] Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.272986 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac080b76-32d3-498c-9832-d31494c1d21f" containerName="glance-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.273193 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac080b76-32d3-498c-9832-d31494c1d21f" containerName="glance-log" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.273389 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-server" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.273525 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-server" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.273695 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="722451d6-f405-4bdc-b418-bfdaa2000197" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.273845 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="722451d6-f405-4bdc-b418-bfdaa2000197" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.273990 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d350f3b-2497-4941-a006-84a503604020" containerName="barbican-keystone-listener" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.274110 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d350f3b-2497-4941-a006-84a503604020" containerName="barbican-keystone-listener" Dec 05 21:05:56 crc kubenswrapper[4747]: 
E1205 21:05:56.274244 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2f34a29-aa43-46a7-9fa8-8eead0c7ba07" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.274354 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2f34a29-aa43-46a7-9fa8-8eead0c7ba07" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.274490 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60728349-9aa2-474c-b935-6fc915822d4e" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.274644 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="60728349-9aa2-474c-b935-6fc915822d4e" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.274803 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb08da7-0b5d-437e-8882-4d352f3c2d47" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.274950 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb08da7-0b5d-437e-8882-4d352f3c2d47" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.275115 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-replicator" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.275272 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-replicator" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.275470 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc30174-a5ca-454e-b049-59a62d358d8a" containerName="ovsdbserver-sb" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.275658 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc30174-a5ca-454e-b049-59a62d358d8a" containerName="ovsdbserver-sb" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.275838 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3c569a-33c9-46a0-8461-e69315fbd20b" containerName="barbican-api" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.275973 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3c569a-33c9-46a0-8461-e69315fbd20b" containerName="barbican-api" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.276097 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="ceilometer-central-agent" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.276228 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="ceilometer-central-agent" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.276366 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c19006-e7b7-4c36-847a-a52358ae6a99" containerName="nova-api-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.276475 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c19006-e7b7-4c36-847a-a52358ae6a99" containerName="nova-api-log" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.276627 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51241ef0-cffe-40ba-bea9-1c01b7cc4184" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.276750 4747 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="51241ef0-cffe-40ba-bea9-1c01b7cc4184" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.276868 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="sg-core" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.276975 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="sg-core" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.277105 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03e0ca3b-083d-477b-a227-dc70e5204444" containerName="memcached" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.277254 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="03e0ca3b-083d-477b-a227-dc70e5204444" containerName="memcached" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.277420 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5bfe363-0d37-4380-93a8-dc7ea1ad3392" containerName="proxy-httpd" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.277574 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5bfe363-0d37-4380-93a8-dc7ea1ad3392" containerName="proxy-httpd" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.277757 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee758f70-0c00-471e-85bf-2d4a96646d15" containerName="nova-cell1-conductor-conductor" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.277870 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee758f70-0c00-471e-85bf-2d4a96646d15" containerName="nova-cell1-conductor-conductor" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.277987 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b545cc7-5bb8-499c-83da-93fe7cbbd6f9" containerName="nova-scheduler-scheduler" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.278094 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b545cc7-5bb8-499c-83da-93fe7cbbd6f9" containerName="nova-scheduler-scheduler" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.278231 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70527d7e-feb4-4821-b20d-74d9634ab124" containerName="nova-metadata-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.278363 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="70527d7e-feb4-4821-b20d-74d9634ab124" containerName="nova-metadata-log" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.278484 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="container-updater" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.278628 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="container-updater" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.278763 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5bfe363-0d37-4380-93a8-dc7ea1ad3392" containerName="proxy-server" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.278873 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5bfe363-0d37-4380-93a8-dc7ea1ad3392" containerName="proxy-server" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.278985 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d49bd09d-90af-4f00-a333-0e292c581525" containerName="rabbitmq" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.279852 4747 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d49bd09d-90af-4f00-a333-0e292c581525" containerName="rabbitmq" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.279984 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82c51dab-4948-4ca3-94ba-c25cb3a4e280" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.280096 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c51dab-4948-4ca3-94ba-c25cb3a4e280" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.280235 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf40d0cc-0d14-4c55-986d-2809df27c4fd" containerName="init" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.280357 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf40d0cc-0d14-4c55-986d-2809df27c4fd" containerName="init" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.280477 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684a964a-1f18-4648-b8c7-6c8c818e16bd" containerName="galera" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.280628 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="684a964a-1f18-4648-b8c7-6c8c818e16bd" containerName="galera" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.280765 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" containerName="barbican-worker" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.280876 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" containerName="barbican-worker" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.280994 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d49bd09d-90af-4f00-a333-0e292c581525" containerName="setup-container" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.281104 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d49bd09d-90af-4f00-a333-0e292c581525" containerName="setup-container" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.281236 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="075d1135-1337-43d3-83e1-97b942a03786" containerName="probe" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.281361 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="075d1135-1337-43d3-83e1-97b942a03786" containerName="probe" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.281475 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33e4dab2-e2a7-4fe3-949a-6a31460e11ba" containerName="glance-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.281615 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e4dab2-e2a7-4fe3-949a-6a31460e11ba" containerName="glance-log" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.281752 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684a964a-1f18-4648-b8c7-6c8c818e16bd" containerName="mysql-bootstrap" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.281863 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="684a964a-1f18-4648-b8c7-6c8c818e16bd" containerName="mysql-bootstrap" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.281974 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovs-vswitchd" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.282083 4747 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovs-vswitchd" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.282227 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70db507e-cc84-4722-8ac8-fd659c2803b8" containerName="rabbitmq" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.282351 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="70db507e-cc84-4722-8ac8-fd659c2803b8" containerName="rabbitmq" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.282465 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf40d0cc-0d14-4c55-986d-2809df27c4fd" containerName="dnsmasq-dns" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.282573 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf40d0cc-0d14-4c55-986d-2809df27c4fd" containerName="dnsmasq-dns" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.282733 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="466afd16-6e7d-42fd-bd82-cabab660b344" containerName="placement-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.282845 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="466afd16-6e7d-42fd-bd82-cabab660b344" containerName="placement-log" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.282966 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70527d7e-feb4-4821-b20d-74d9634ab124" containerName="nova-metadata-metadata" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.283075 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="70527d7e-feb4-4821-b20d-74d9634ab124" containerName="nova-metadata-metadata" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.283202 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d835978-9804-4427-9dd4-48c40ad829c5" containerName="ovsdbserver-nb" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.283328 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d835978-9804-4427-9dd4-48c40ad829c5" containerName="ovsdbserver-nb" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.283478 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="rsync" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.283639 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="rsync" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.283772 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3c569a-33c9-46a0-8461-e69315fbd20b" containerName="barbican-api-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.283884 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3c569a-33c9-46a0-8461-e69315fbd20b" containerName="barbican-api-log" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.284006 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-expirer" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.284116 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-expirer" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.284254 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62399e6a-577a-4d10-b057-49c4bae7a172" containerName="cinder-api-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.284372 4747 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="62399e6a-577a-4d10-b057-49c4bae7a172" containerName="cinder-api-log" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.284496 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d835978-9804-4427-9dd4-48c40ad829c5" containerName="openstack-network-exporter" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.284640 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d835978-9804-4427-9dd4-48c40ad829c5" containerName="openstack-network-exporter" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.284770 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7528af7d-8d58-4f20-ac03-b9b62e14c9e2" containerName="ovn-northd" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.284887 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7528af7d-8d58-4f20-ac03-b9b62e14c9e2" containerName="ovn-northd" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.284997 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f0706f-507e-4360-b83e-9dbfacee3144" containerName="mysql-bootstrap" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.285111 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f0706f-507e-4360-b83e-9dbfacee3144" containerName="mysql-bootstrap" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.285246 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.285355 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.285469 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="swift-recon-cron" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.285578 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="swift-recon-cron" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.285734 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33e4dab2-e2a7-4fe3-949a-6a31460e11ba" containerName="glance-httpd" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.285850 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e4dab2-e2a7-4fe3-949a-6a31460e11ba" containerName="glance-httpd" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.285957 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c19006-e7b7-4c36-847a-a52358ae6a99" containerName="nova-api-api" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.286062 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c19006-e7b7-4c36-847a-a52358ae6a99" containerName="nova-api-api" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.286189 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ded45d9-c7e1-4429-982b-4f7c10e43473" containerName="neutron-httpd" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.286329 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ded45d9-c7e1-4429-982b-4f7c10e43473" containerName="neutron-httpd" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.286477 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc768ee-26e3-43fb-b64d-7fc3f3c627c5" containerName="openstack-network-exporter" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.286643 4747 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc768ee-26e3-43fb-b64d-7fc3f3c627c5" containerName="openstack-network-exporter" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.286754 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="466afd16-6e7d-42fd-bd82-cabab660b344" containerName="placement-api" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.286871 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="466afd16-6e7d-42fd-bd82-cabab660b344" containerName="placement-api" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.286982 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f0706f-507e-4360-b83e-9dbfacee3144" containerName="galera" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.287089 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f0706f-507e-4360-b83e-9dbfacee3144" containerName="galera" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.287214 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-auditor" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.287376 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-auditor" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.287497 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac080b76-32d3-498c-9832-d31494c1d21f" containerName="glance-httpd" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.287699 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac080b76-32d3-498c-9832-d31494c1d21f" containerName="glance-httpd" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.287828 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d350f3b-2497-4941-a006-84a503604020" containerName="barbican-keystone-listener-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.287937 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d350f3b-2497-4941-a006-84a503604020" containerName="barbican-keystone-listener-log" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.288040 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.288149 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.288301 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovsdb-server-init" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.288429 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovsdb-server-init" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.288547 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70db507e-cc84-4722-8ac8-fd659c2803b8" containerName="setup-container" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.288702 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="70db507e-cc84-4722-8ac8-fd659c2803b8" containerName="setup-container" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.288817 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovsdb-server" Dec 05 21:05:56 crc 
kubenswrapper[4747]: I1205 21:05:56.288924 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovsdb-server" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.289093 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7528af7d-8d58-4f20-ac03-b9b62e14c9e2" containerName="openstack-network-exporter" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.289318 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7528af7d-8d58-4f20-ac03-b9b62e14c9e2" containerName="openstack-network-exporter" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.289564 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-server" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.289777 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-server" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.290021 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="container-server" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.290306 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="container-server" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.290574 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="container-replicator" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.290838 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="container-replicator" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.291059 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" containerName="barbican-worker-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.291430 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" containerName="barbican-worker-log" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.291859 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ded45d9-c7e1-4429-982b-4f7c10e43473" containerName="neutron-api" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.292027 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ded45d9-c7e1-4429-982b-4f7c10e43473" containerName="neutron-api" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.292194 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278246ea-04b9-4694-bb8c-5c67503c17e4" containerName="kube-state-metrics" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.292367 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="278246ea-04b9-4694-bb8c-5c67503c17e4" containerName="kube-state-metrics" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.292519 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc30174-a5ca-454e-b049-59a62d358d8a" containerName="openstack-network-exporter" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.292695 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc30174-a5ca-454e-b049-59a62d358d8a" containerName="openstack-network-exporter" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.292852 4747 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-replicator" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.292999 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-replicator" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.293151 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae4d823-7941-4e9e-ae9d-cce0297e278d" containerName="nova-cell0-conductor-conductor" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.293325 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae4d823-7941-4e9e-ae9d-cce0297e278d" containerName="nova-cell0-conductor-conductor" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.293487 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-auditor" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.293646 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-auditor" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.293810 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62399e6a-577a-4d10-b057-49c4bae7a172" containerName="cinder-api" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.293956 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="62399e6a-577a-4d10-b057-49c4bae7a172" containerName="cinder-api" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.294107 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="container-auditor" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.294285 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="container-auditor" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.294448 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="ceilometer-notification-agent" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.294608 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="ceilometer-notification-agent" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.294777 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="075d1135-1337-43d3-83e1-97b942a03786" containerName="cinder-scheduler" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.294924 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="075d1135-1337-43d3-83e1-97b942a03786" containerName="cinder-scheduler" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.295072 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" containerName="ovn-controller" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.295737 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" containerName="ovn-controller" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.295924 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-reaper" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.296079 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-reaper" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 
21:05:56.296344 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="proxy-httpd" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.296455 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="proxy-httpd" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.296776 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-updater" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.296885 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-updater" Dec 05 21:05:56 crc kubenswrapper[4747]: E1205 21:05:56.297012 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be50198-f9c5-4e90-bfa0-d33b502278b7" containerName="keystone-api" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.297120 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be50198-f9c5-4e90-bfa0-d33b502278b7" containerName="keystone-api" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.297662 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5bfe363-0d37-4380-93a8-dc7ea1ad3392" containerName="proxy-server" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.297832 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3c569a-33c9-46a0-8461-e69315fbd20b" containerName="barbican-api-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.297991 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3c569a-33c9-46a0-8461-e69315fbd20b" containerName="barbican-api" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.298096 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="075d1135-1337-43d3-83e1-97b942a03786" containerName="probe" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.298193 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d9d1c5-1e8a-42b0-b9c9-2fad64e94af0" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.298291 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bc768ee-26e3-43fb-b64d-7fc3f3c627c5" containerName="openstack-network-exporter" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.298393 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d835978-9804-4427-9dd4-48c40ad829c5" containerName="ovsdbserver-nb" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.298470 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee758f70-0c00-471e-85bf-2d4a96646d15" containerName="nova-cell1-conductor-conductor" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.298561 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7528af7d-8d58-4f20-ac03-b9b62e14c9e2" containerName="openstack-network-exporter" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.298671 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="proxy-httpd" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.298796 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d350f3b-2497-4941-a006-84a503604020" containerName="barbican-keystone-listener" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.298894 4747 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="466afd16-6e7d-42fd-bd82-cabab660b344" containerName="placement-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.299084 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovs-vswitchd" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.299210 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="82c51dab-4948-4ca3-94ba-c25cb3a4e280" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.299308 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc30174-a5ca-454e-b049-59a62d358d8a" containerName="openstack-network-exporter" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.299431 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="ceilometer-central-agent" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.299533 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7528af7d-8d58-4f20-ac03-b9b62e14c9e2" containerName="ovn-northd" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.299761 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf40d0cc-0d14-4c55-986d-2809df27c4fd" containerName="dnsmasq-dns" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.299875 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae4d823-7941-4e9e-ae9d-cce0297e278d" containerName="nova-cell0-conductor-conductor" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.300005 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d610aeb-7ead-4ea7-ae34-e8f9c2a14c6a" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.300109 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-server" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.300208 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d350f3b-2497-4941-a006-84a503604020" containerName="barbican-keystone-listener-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.300298 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" containerName="barbican-worker-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.300434 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac080b76-32d3-498c-9832-d31494c1d21f" containerName="glance-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.300681 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d49bd09d-90af-4f00-a333-0e292c581525" containerName="rabbitmq" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.300821 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="rsync" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.300958 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ded45d9-c7e1-4429-982b-4f7c10e43473" containerName="neutron-api" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.301056 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="722451d6-f405-4bdc-b418-bfdaa2000197" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.301156 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" 
containerName="object-expirer" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.301268 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="684a964a-1f18-4648-b8c7-6c8c818e16bd" containerName="galera" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.301356 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2f34a29-aa43-46a7-9fa8-8eead0c7ba07" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.301429 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb08da7-0b5d-437e-8882-4d352f3c2d47" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.301500 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="60c19006-e7b7-4c36-847a-a52358ae6a99" containerName="nova-api-api" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.301569 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="60728349-9aa2-474c-b935-6fc915822d4e" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.301709 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="container-replicator" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.301787 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="container-updater" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.301884 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="container-auditor" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.301967 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="075d1135-1337-43d3-83e1-97b942a03786" containerName="cinder-scheduler" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.302038 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d835978-9804-4427-9dd4-48c40ad829c5" containerName="openstack-network-exporter" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.302110 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-server" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.302197 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="53f0706f-507e-4360-b83e-9dbfacee3144" containerName="galera" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.302275 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="ceilometer-notification-agent" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.302347 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="70527d7e-feb4-4821-b20d-74d9634ab124" containerName="nova-metadata-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.302418 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="51241ef0-cffe-40ba-bea9-1c01b7cc4184" containerName="mariadb-account-delete" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.302491 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-reaper" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.302560 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-replicator" Dec 05 21:05:56 crc 
kubenswrapper[4747]: I1205 21:05:56.302652 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c42c4c8d-8b84-479a-b4f8-7ec6fcd58ba4" containerName="ovn-controller" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.302764 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-updater" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.302949 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="33e4dab2-e2a7-4fe3-949a-6a31460e11ba" containerName="glance-httpd" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.303032 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="278246ea-04b9-4694-bb8c-5c67503c17e4" containerName="kube-state-metrics" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.303107 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b545cc7-5bb8-499c-83da-93fe7cbbd6f9" containerName="nova-scheduler-scheduler" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.303191 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="container-server" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.303270 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="62399e6a-577a-4d10-b057-49c4bae7a172" containerName="cinder-api" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.303366 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="466afd16-6e7d-42fd-bd82-cabab660b344" containerName="placement-api" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.303440 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="03e0ca3b-083d-477b-a227-dc70e5204444" containerName="memcached" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.303516 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="70db507e-cc84-4722-8ac8-fd659c2803b8" containerName="rabbitmq" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.303617 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="62399e6a-577a-4d10-b057-49c4bae7a172" containerName="cinder-api-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.303702 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b37900c0-058e-4be6-9b81-c67d5f05b7a5" containerName="ovsdb-server" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.303781 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc30174-a5ca-454e-b049-59a62d358d8a" containerName="ovsdbserver-sb" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.303848 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5bfe363-0d37-4380-93a8-dc7ea1ad3392" containerName="proxy-httpd" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.303948 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-replicator" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.304027 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="object-auditor" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.304101 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="60c19006-e7b7-4c36-847a-a52358ae6a99" containerName="nova-api-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.304185 4747 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8be50198-f9c5-4e90-bfa0-d33b502278b7" containerName="keystone-api" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.304267 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="swift-recon-cron" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.304286 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da10f9e-c9d3-4d5e-888a-774080f417f8" containerName="sg-core" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.304294 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac080b76-32d3-498c-9832-d31494c1d21f" containerName="glance-httpd" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.304303 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="33e4dab2-e2a7-4fe3-949a-6a31460e11ba" containerName="glance-log" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.304316 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="70527d7e-feb4-4821-b20d-74d9634ab124" containerName="nova-metadata-metadata" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.304324 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06375c-a008-4d1f-b2ae-516549bcd438" containerName="account-auditor" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.304336 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ded45d9-c7e1-4429-982b-4f7c10e43473" containerName="neutron-httpd" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.304348 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b073c9-c8ff-4de9-b2ee-605d37c94ffb" containerName="barbican-worker" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.306397 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56z55"] Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.306560 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.411895 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp8mm\" (UniqueName: \"kubernetes.io/projected/67ec16c2-b60c-432e-9e82-23d1e838993a-kube-api-access-tp8mm\") pod \"redhat-operators-56z55\" (UID: \"67ec16c2-b60c-432e-9e82-23d1e838993a\") " pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.411981 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67ec16c2-b60c-432e-9e82-23d1e838993a-utilities\") pod \"redhat-operators-56z55\" (UID: \"67ec16c2-b60c-432e-9e82-23d1e838993a\") " pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.412009 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67ec16c2-b60c-432e-9e82-23d1e838993a-catalog-content\") pod \"redhat-operators-56z55\" (UID: \"67ec16c2-b60c-432e-9e82-23d1e838993a\") " pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.513024 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67ec16c2-b60c-432e-9e82-23d1e838993a-utilities\") pod \"redhat-operators-56z55\" (UID: \"67ec16c2-b60c-432e-9e82-23d1e838993a\") " pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.513064 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67ec16c2-b60c-432e-9e82-23d1e838993a-catalog-content\") pod \"redhat-operators-56z55\" (UID: \"67ec16c2-b60c-432e-9e82-23d1e838993a\") " pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.513151 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp8mm\" (UniqueName: \"kubernetes.io/projected/67ec16c2-b60c-432e-9e82-23d1e838993a-kube-api-access-tp8mm\") pod \"redhat-operators-56z55\" (UID: \"67ec16c2-b60c-432e-9e82-23d1e838993a\") " pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.513685 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67ec16c2-b60c-432e-9e82-23d1e838993a-utilities\") pod \"redhat-operators-56z55\" (UID: \"67ec16c2-b60c-432e-9e82-23d1e838993a\") " pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.513784 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67ec16c2-b60c-432e-9e82-23d1e838993a-catalog-content\") pod \"redhat-operators-56z55\" (UID: \"67ec16c2-b60c-432e-9e82-23d1e838993a\") " pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.536875 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp8mm\" (UniqueName: \"kubernetes.io/projected/67ec16c2-b60c-432e-9e82-23d1e838993a-kube-api-access-tp8mm\") pod \"redhat-operators-56z55\" (UID: 
\"67ec16c2-b60c-432e-9e82-23d1e838993a\") " pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:05:56 crc kubenswrapper[4747]: I1205 21:05:56.631783 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:05:57 crc kubenswrapper[4747]: I1205 21:05:57.068239 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56z55"] Dec 05 21:05:57 crc kubenswrapper[4747]: I1205 21:05:57.314034 4747 generic.go:334] "Generic (PLEG): container finished" podID="67ec16c2-b60c-432e-9e82-23d1e838993a" containerID="93af9c74179d5654e288c9d89b2297e73bc6776ee447b3f1aeab886dfc7afef8" exitCode=0 Dec 05 21:05:57 crc kubenswrapper[4747]: I1205 21:05:57.314097 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56z55" event={"ID":"67ec16c2-b60c-432e-9e82-23d1e838993a","Type":"ContainerDied","Data":"93af9c74179d5654e288c9d89b2297e73bc6776ee447b3f1aeab886dfc7afef8"} Dec 05 21:05:57 crc kubenswrapper[4747]: I1205 21:05:57.314154 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56z55" event={"ID":"67ec16c2-b60c-432e-9e82-23d1e838993a","Type":"ContainerStarted","Data":"fd8068b64174fe37374c7422bce048ff9709b9e5614f072c8f2ae07d1830600c"} Dec 05 21:05:57 crc kubenswrapper[4747]: I1205 21:05:57.315789 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:05:58 crc kubenswrapper[4747]: E1205 21:05:58.257753 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/system-systemd\\\\x2dcoredump.slice/systemd-coredump@0-70912-0.service\": RecentStats: unable to find data in memory cache]" Dec 05 21:05:58 crc kubenswrapper[4747]: I1205 21:05:58.328460 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56z55" event={"ID":"67ec16c2-b60c-432e-9e82-23d1e838993a","Type":"ContainerStarted","Data":"fc2c80c6f40ab6666254e8cadad362df7fe1e8d99e025d6c4daf4136a28b0710"} Dec 05 21:05:59 crc kubenswrapper[4747]: I1205 21:05:59.343672 4747 generic.go:334] "Generic (PLEG): container finished" podID="67ec16c2-b60c-432e-9e82-23d1e838993a" containerID="fc2c80c6f40ab6666254e8cadad362df7fe1e8d99e025d6c4daf4136a28b0710" exitCode=0 Dec 05 21:05:59 crc kubenswrapper[4747]: I1205 21:05:59.343752 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56z55" event={"ID":"67ec16c2-b60c-432e-9e82-23d1e838993a","Type":"ContainerDied","Data":"fc2c80c6f40ab6666254e8cadad362df7fe1e8d99e025d6c4daf4136a28b0710"} Dec 05 21:06:00 crc kubenswrapper[4747]: I1205 21:06:00.355945 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56z55" event={"ID":"67ec16c2-b60c-432e-9e82-23d1e838993a","Type":"ContainerStarted","Data":"bc5abafd65dc59ca43607ed165c4b8c7f315bcce2e149b643cbf0f370fb0b532"} Dec 05 21:06:06 crc kubenswrapper[4747]: I1205 21:06:06.632902 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:06:06 crc kubenswrapper[4747]: I1205 21:06:06.633521 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:06:06 crc kubenswrapper[4747]: I1205 21:06:06.696290 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:06:06 crc kubenswrapper[4747]: I1205 21:06:06.723196 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-56z55" podStartSLOduration=8.222602815 podStartE2EDuration="10.723178811s" podCreationTimestamp="2025-12-05 21:05:56 +0000 UTC" firstStartedPulling="2025-12-05 21:05:57.315551176 +0000 UTC m=+1427.782858674" lastFinishedPulling="2025-12-05 21:05:59.816127172 +0000 UTC m=+1430.283434670" observedRunningTime="2025-12-05 21:06:00.378710479 +0000 UTC m=+1430.846017967" watchObservedRunningTime="2025-12-05 21:06:06.723178811 +0000 UTC m=+1437.190486299" Dec 05 21:06:07 crc kubenswrapper[4747]: I1205 21:06:07.469303 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:06:07 crc kubenswrapper[4747]: I1205 21:06:07.522822 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56z55"] Dec 05 21:06:08 crc kubenswrapper[4747]: E1205 21:06:08.550533 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/system-systemd\\\\x2dcoredump.slice/systemd-coredump@0-70912-0.service\": RecentStats: unable to find data in memory cache]" Dec 05 21:06:09 crc kubenswrapper[4747]: I1205 21:06:09.439934 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-56z55" podUID="67ec16c2-b60c-432e-9e82-23d1e838993a" containerName="registry-server" containerID="cri-o://bc5abafd65dc59ca43607ed165c4b8c7f315bcce2e149b643cbf0f370fb0b532" gracePeriod=2 Dec 05 21:06:11 crc kubenswrapper[4747]: I1205 21:06:11.461455 4747 generic.go:334] "Generic (PLEG): container finished" podID="67ec16c2-b60c-432e-9e82-23d1e838993a" containerID="bc5abafd65dc59ca43607ed165c4b8c7f315bcce2e149b643cbf0f370fb0b532" exitCode=0 Dec 05 21:06:11 crc kubenswrapper[4747]: I1205 21:06:11.461561 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56z55" event={"ID":"67ec16c2-b60c-432e-9e82-23d1e838993a","Type":"ContainerDied","Data":"bc5abafd65dc59ca43607ed165c4b8c7f315bcce2e149b643cbf0f370fb0b532"} Dec 05 21:06:11 crc kubenswrapper[4747]: I1205 21:06:11.695286 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:06:11 crc kubenswrapper[4747]: I1205 21:06:11.731529 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67ec16c2-b60c-432e-9e82-23d1e838993a-utilities\") pod \"67ec16c2-b60c-432e-9e82-23d1e838993a\" (UID: \"67ec16c2-b60c-432e-9e82-23d1e838993a\") " Dec 05 21:06:11 crc kubenswrapper[4747]: I1205 21:06:11.731642 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67ec16c2-b60c-432e-9e82-23d1e838993a-catalog-content\") pod \"67ec16c2-b60c-432e-9e82-23d1e838993a\" (UID: \"67ec16c2-b60c-432e-9e82-23d1e838993a\") " Dec 05 21:06:11 crc kubenswrapper[4747]: I1205 21:06:11.731695 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp8mm\" (UniqueName: \"kubernetes.io/projected/67ec16c2-b60c-432e-9e82-23d1e838993a-kube-api-access-tp8mm\") pod \"67ec16c2-b60c-432e-9e82-23d1e838993a\" (UID: \"67ec16c2-b60c-432e-9e82-23d1e838993a\") " Dec 05 21:06:11 crc kubenswrapper[4747]: I1205 21:06:11.732572 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67ec16c2-b60c-432e-9e82-23d1e838993a-utilities" (OuterVolumeSpecName: "utilities") pod "67ec16c2-b60c-432e-9e82-23d1e838993a" (UID: "67ec16c2-b60c-432e-9e82-23d1e838993a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:06:11 crc kubenswrapper[4747]: I1205 21:06:11.737152 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67ec16c2-b60c-432e-9e82-23d1e838993a-kube-api-access-tp8mm" (OuterVolumeSpecName: "kube-api-access-tp8mm") pod "67ec16c2-b60c-432e-9e82-23d1e838993a" (UID: "67ec16c2-b60c-432e-9e82-23d1e838993a"). InnerVolumeSpecName "kube-api-access-tp8mm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:06:11 crc kubenswrapper[4747]: I1205 21:06:11.833524 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67ec16c2-b60c-432e-9e82-23d1e838993a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:06:11 crc kubenswrapper[4747]: I1205 21:06:11.833567 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp8mm\" (UniqueName: \"kubernetes.io/projected/67ec16c2-b60c-432e-9e82-23d1e838993a-kube-api-access-tp8mm\") on node \"crc\" DevicePath \"\"" Dec 05 21:06:11 crc kubenswrapper[4747]: I1205 21:06:11.845799 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67ec16c2-b60c-432e-9e82-23d1e838993a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67ec16c2-b60c-432e-9e82-23d1e838993a" (UID: "67ec16c2-b60c-432e-9e82-23d1e838993a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:06:11 crc kubenswrapper[4747]: I1205 21:06:11.935639 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67ec16c2-b60c-432e-9e82-23d1e838993a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:06:12 crc kubenswrapper[4747]: I1205 21:06:12.476172 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56z55" event={"ID":"67ec16c2-b60c-432e-9e82-23d1e838993a","Type":"ContainerDied","Data":"fd8068b64174fe37374c7422bce048ff9709b9e5614f072c8f2ae07d1830600c"} Dec 05 21:06:12 crc kubenswrapper[4747]: I1205 21:06:12.476255 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56z55" Dec 05 21:06:12 crc kubenswrapper[4747]: I1205 21:06:12.476553 4747 scope.go:117] "RemoveContainer" containerID="bc5abafd65dc59ca43607ed165c4b8c7f315bcce2e149b643cbf0f370fb0b532" Dec 05 21:06:12 crc kubenswrapper[4747]: I1205 21:06:12.508263 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56z55"] Dec 05 21:06:12 crc kubenswrapper[4747]: I1205 21:06:12.517275 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-56z55"] Dec 05 21:06:12 crc kubenswrapper[4747]: I1205 21:06:12.518574 4747 scope.go:117] "RemoveContainer" containerID="fc2c80c6f40ab6666254e8cadad362df7fe1e8d99e025d6c4daf4136a28b0710" Dec 05 21:06:12 crc kubenswrapper[4747]: I1205 21:06:12.542396 4747 scope.go:117] "RemoveContainer" containerID="93af9c74179d5654e288c9d89b2297e73bc6776ee447b3f1aeab886dfc7afef8" Dec 05 21:06:13 crc kubenswrapper[4747]: I1205 21:06:13.849745 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67ec16c2-b60c-432e-9e82-23d1e838993a" path="/var/lib/kubelet/pods/67ec16c2-b60c-432e-9e82-23d1e838993a/volumes" Dec 05 21:06:23 crc kubenswrapper[4747]: I1205 21:06:23.766963 4747 scope.go:117] "RemoveContainer" containerID="3e2e501bf6db7407c56844788f8bbd62d9e8a7a9e7605f4be76cc81c55d45534" Dec 05 21:06:23 crc kubenswrapper[4747]: I1205 21:06:23.806388 4747 scope.go:117] "RemoveContainer" containerID="96d9fe9346adceba236ef7d628d470fe843b2a68dfb35a74d51aca199aa5b567" Dec 05 21:06:23 crc kubenswrapper[4747]: I1205 21:06:23.834188 4747 scope.go:117] "RemoveContainer" containerID="a9dcbb54d496caf2f805d60df04f610d71d46e34c98b40703cbee95f6436ad6a" Dec 05 21:06:23 crc kubenswrapper[4747]: I1205 21:06:23.873437 4747 scope.go:117] "RemoveContainer" containerID="0e2fa2cf0521e9e0a9ee440e9496bbd60e565938c9bcb968eb76b83f8a5212ad" Dec 05 21:06:23 crc kubenswrapper[4747]: I1205 21:06:23.905987 4747 scope.go:117] "RemoveContainer" containerID="115068073f7e5ab9a48b744498913c1f4b59e9db0badfecc60d328bacfbf5cce" Dec 05 21:06:23 crc kubenswrapper[4747]: I1205 21:06:23.924656 4747 scope.go:117] "RemoveContainer" containerID="685822c02c1e6e5a3b73c537d25e8ed5863590f2bb1545fc0be7b2bc201035b1" Dec 05 21:06:23 crc kubenswrapper[4747]: I1205 21:06:23.962107 4747 scope.go:117] "RemoveContainer" containerID="22f61abdae53ffc00d95bab85332c82fb57990e50c5f80a9440f9d18e16c1e05" Dec 05 21:06:23 crc kubenswrapper[4747]: I1205 21:06:23.990789 4747 scope.go:117] "RemoveContainer" containerID="264a34a52b2bcb52aa5db288abbbf43e65fd0f92091e50d839d9c4ffaab07034" Dec 05 21:06:24 crc kubenswrapper[4747]: I1205 21:06:24.027916 4747 scope.go:117] "RemoveContainer" 
containerID="c2b5ad7c71ac902cf882c6a823d0b022b018b3f0655877bb3292fa03d68e3bae" Dec 05 21:06:24 crc kubenswrapper[4747]: I1205 21:06:24.046701 4747 scope.go:117] "RemoveContainer" containerID="3969d964903947e872ffe0ce65d01ff896775d4ceda5953edf46cb6db636dad2" Dec 05 21:06:24 crc kubenswrapper[4747]: I1205 21:06:24.068145 4747 scope.go:117] "RemoveContainer" containerID="3e8f9888f877f5cb02adb80d4dc39e181f6555ae1d98bbb6455a49dab7ae1f6d" Dec 05 21:07:06 crc kubenswrapper[4747]: I1205 21:07:06.222623 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:07:06 crc kubenswrapper[4747]: I1205 21:07:06.223235 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:07:25 crc kubenswrapper[4747]: I1205 21:07:25.095107 4747 scope.go:117] "RemoveContainer" containerID="fc1cfe778171e50c163dcbd11b2d658d1e40a3c9a296c976c35359ea06c5a66f" Dec 05 21:07:25 crc kubenswrapper[4747]: I1205 21:07:25.130981 4747 scope.go:117] "RemoveContainer" containerID="c6d47481e8ef33252b8085dc229cc7562124000159fd1e4d87c701691429a481" Dec 05 21:07:25 crc kubenswrapper[4747]: I1205 21:07:25.181377 4747 scope.go:117] "RemoveContainer" containerID="8a9220e2f263274b5d6646c5ced8d29f995f12701ac056b9861cdd18a54f6738" Dec 05 21:07:25 crc kubenswrapper[4747]: I1205 21:07:25.231796 4747 scope.go:117] "RemoveContainer" containerID="9ba405dde9a6a2b26ae6aef73ad9fe3e643c2c781adfc8e4cf982e48219da231" Dec 05 21:07:25 crc kubenswrapper[4747]: I1205 21:07:25.258460 4747 scope.go:117] "RemoveContainer" containerID="d572e4ae1c2bdcc8633d6f7e2500cd6bc1b99b0d1f2ef1d0e1c9d681f18656f7" Dec 05 21:07:25 crc kubenswrapper[4747]: I1205 21:07:25.286394 4747 scope.go:117] "RemoveContainer" containerID="970e2b8c6159ee24f6e1d7c7a385fd9c230a5faf9abb45c253b8c897d74ea335" Dec 05 21:07:25 crc kubenswrapper[4747]: I1205 21:07:25.327922 4747 scope.go:117] "RemoveContainer" containerID="6146e8593917a68bb0ab86fcf38800fdf5cda3e75eb80fe9775e293dc1aa808c" Dec 05 21:07:25 crc kubenswrapper[4747]: I1205 21:07:25.346751 4747 scope.go:117] "RemoveContainer" containerID="2c616242a09a02b14af1a770510f774577f4438e787989439ca44def0684c811" Dec 05 21:07:25 crc kubenswrapper[4747]: I1205 21:07:25.364693 4747 scope.go:117] "RemoveContainer" containerID="a362d9fb473592bb60d9373f2630d4ac929dccc8b2ff2dc397dd513565153b24" Dec 05 21:07:25 crc kubenswrapper[4747]: I1205 21:07:25.386172 4747 scope.go:117] "RemoveContainer" containerID="6c0a4222132d0fff66a7dafaa465590cd04be9365f738b2b4a9e907cda75ff38" Dec 05 21:07:25 crc kubenswrapper[4747]: I1205 21:07:25.415893 4747 scope.go:117] "RemoveContainer" containerID="c6fde9291ef4b4350a70617293ff58ef6c3ec3ffff2a74a8701617ae895aa21e" Dec 05 21:07:25 crc kubenswrapper[4747]: I1205 21:07:25.431972 4747 scope.go:117] "RemoveContainer" containerID="5edad227b6c8fed3039bc0bf322c39145fe67edd7b89c67e006cbbd3ef396f71" Dec 05 21:07:25 crc kubenswrapper[4747]: I1205 21:07:25.462544 4747 scope.go:117] "RemoveContainer" containerID="86aab3d671c993418b87c4d45af8f7bf90b32d71794e98d95c6f839b0889ccf3" Dec 05 21:07:25 crc kubenswrapper[4747]: 
I1205 21:07:25.508170 4747 scope.go:117] "RemoveContainer" containerID="25bb358b728d1ef1998976e60e897459937e0384dfec3ecb00006646b0a949a1"
Dec 05 21:07:25 crc kubenswrapper[4747]: I1205 21:07:25.548364 4747 scope.go:117] "RemoveContainer" containerID="267eaf4271e17858acded1d97dd340e7340fdbc2118868acb48e221d78e8389e"
Dec 05 21:07:36 crc kubenswrapper[4747]: I1205 21:07:36.221857 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:07:36 crc kubenswrapper[4747]: I1205 21:07:36.222373 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.345851 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w9xzv"]
Dec 05 21:07:38 crc kubenswrapper[4747]: E1205 21:07:38.346208 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ec16c2-b60c-432e-9e82-23d1e838993a" containerName="registry-server"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.346224 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ec16c2-b60c-432e-9e82-23d1e838993a" containerName="registry-server"
Dec 05 21:07:38 crc kubenswrapper[4747]: E1205 21:07:38.346248 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ec16c2-b60c-432e-9e82-23d1e838993a" containerName="extract-content"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.346257 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ec16c2-b60c-432e-9e82-23d1e838993a" containerName="extract-content"
Dec 05 21:07:38 crc kubenswrapper[4747]: E1205 21:07:38.346275 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ec16c2-b60c-432e-9e82-23d1e838993a" containerName="extract-utilities"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.346283 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ec16c2-b60c-432e-9e82-23d1e838993a" containerName="extract-utilities"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.346492 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ec16c2-b60c-432e-9e82-23d1e838993a" containerName="registry-server"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.347819 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w9xzv"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.355985 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w9xzv"]
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.496213 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad540f7b-11dc-484c-af97-176be9046540-utilities\") pod \"community-operators-w9xzv\" (UID: \"ad540f7b-11dc-484c-af97-176be9046540\") " pod="openshift-marketplace/community-operators-w9xzv"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.496649 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad540f7b-11dc-484c-af97-176be9046540-catalog-content\") pod \"community-operators-w9xzv\" (UID: \"ad540f7b-11dc-484c-af97-176be9046540\") " pod="openshift-marketplace/community-operators-w9xzv"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.496726 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddx85\" (UniqueName: \"kubernetes.io/projected/ad540f7b-11dc-484c-af97-176be9046540-kube-api-access-ddx85\") pod \"community-operators-w9xzv\" (UID: \"ad540f7b-11dc-484c-af97-176be9046540\") " pod="openshift-marketplace/community-operators-w9xzv"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.598300 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddx85\" (UniqueName: \"kubernetes.io/projected/ad540f7b-11dc-484c-af97-176be9046540-kube-api-access-ddx85\") pod \"community-operators-w9xzv\" (UID: \"ad540f7b-11dc-484c-af97-176be9046540\") " pod="openshift-marketplace/community-operators-w9xzv"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.598380 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad540f7b-11dc-484c-af97-176be9046540-utilities\") pod \"community-operators-w9xzv\" (UID: \"ad540f7b-11dc-484c-af97-176be9046540\") " pod="openshift-marketplace/community-operators-w9xzv"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.598424 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad540f7b-11dc-484c-af97-176be9046540-catalog-content\") pod \"community-operators-w9xzv\" (UID: \"ad540f7b-11dc-484c-af97-176be9046540\") " pod="openshift-marketplace/community-operators-w9xzv"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.598926 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad540f7b-11dc-484c-af97-176be9046540-catalog-content\") pod \"community-operators-w9xzv\" (UID: \"ad540f7b-11dc-484c-af97-176be9046540\") " pod="openshift-marketplace/community-operators-w9xzv"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.599441 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad540f7b-11dc-484c-af97-176be9046540-utilities\") pod \"community-operators-w9xzv\" (UID: \"ad540f7b-11dc-484c-af97-176be9046540\") " pod="openshift-marketplace/community-operators-w9xzv"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.616956 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddx85\" (UniqueName: \"kubernetes.io/projected/ad540f7b-11dc-484c-af97-176be9046540-kube-api-access-ddx85\") pod \"community-operators-w9xzv\" (UID: \"ad540f7b-11dc-484c-af97-176be9046540\") " pod="openshift-marketplace/community-operators-w9xzv"
Dec 05 21:07:38 crc kubenswrapper[4747]: I1205 21:07:38.705870 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w9xzv"
Dec 05 21:07:39 crc kubenswrapper[4747]: I1205 21:07:39.222534 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w9xzv"]
Dec 05 21:07:39 crc kubenswrapper[4747]: I1205 21:07:39.280389 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w9xzv" event={"ID":"ad540f7b-11dc-484c-af97-176be9046540","Type":"ContainerStarted","Data":"b3451e1f75f1908a59c58426a43ce4d937c9d146242703d75397c45a182e35b4"}
Dec 05 21:07:40 crc kubenswrapper[4747]: I1205 21:07:40.303682 4747 generic.go:334] "Generic (PLEG): container finished" podID="ad540f7b-11dc-484c-af97-176be9046540" containerID="11a880306f5503b5ee78907947a91dd47d41aaf2b44c710e06905f15f1c25de4" exitCode=0
Dec 05 21:07:40 crc kubenswrapper[4747]: I1205 21:07:40.303763 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w9xzv" event={"ID":"ad540f7b-11dc-484c-af97-176be9046540","Type":"ContainerDied","Data":"11a880306f5503b5ee78907947a91dd47d41aaf2b44c710e06905f15f1c25de4"}
Dec 05 21:07:41 crc kubenswrapper[4747]: I1205 21:07:41.314640 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w9xzv" event={"ID":"ad540f7b-11dc-484c-af97-176be9046540","Type":"ContainerStarted","Data":"ee1ab4798a7083d057cb05ddd6ed4c7d76f0ddbaff58c30fe9b77bb32e0e185f"}
Dec 05 21:07:42 crc kubenswrapper[4747]: I1205 21:07:42.325714 4747 generic.go:334] "Generic (PLEG): container finished" podID="ad540f7b-11dc-484c-af97-176be9046540" containerID="ee1ab4798a7083d057cb05ddd6ed4c7d76f0ddbaff58c30fe9b77bb32e0e185f" exitCode=0
Dec 05 21:07:42 crc kubenswrapper[4747]: I1205 21:07:42.325751 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w9xzv" event={"ID":"ad540f7b-11dc-484c-af97-176be9046540","Type":"ContainerDied","Data":"ee1ab4798a7083d057cb05ddd6ed4c7d76f0ddbaff58c30fe9b77bb32e0e185f"}
Dec 05 21:07:43 crc kubenswrapper[4747]: I1205 21:07:43.333441 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w9xzv" event={"ID":"ad540f7b-11dc-484c-af97-176be9046540","Type":"ContainerStarted","Data":"7bec91f5b3abb55830e96d1d3ea7a7bf8ed18b57d70ba11f5b46c49c4c8de532"}
Dec 05 21:07:43 crc kubenswrapper[4747]: I1205 21:07:43.351030 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w9xzv" podStartSLOduration=2.7941625979999998 podStartE2EDuration="5.351013292s" podCreationTimestamp="2025-12-05 21:07:38 +0000 UTC" firstStartedPulling="2025-12-05 21:07:40.305220934 +0000 UTC m=+1530.772528442" lastFinishedPulling="2025-12-05 21:07:42.862071638 +0000 UTC m=+1533.329379136" observedRunningTime="2025-12-05 21:07:43.350053848 +0000 UTC m=+1533.817361336" watchObservedRunningTime="2025-12-05 21:07:43.351013292 +0000 UTC m=+1533.818320780"
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w9xzv" Dec 05 21:07:48 crc kubenswrapper[4747]: I1205 21:07:48.706219 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w9xzv" Dec 05 21:07:48 crc kubenswrapper[4747]: I1205 21:07:48.769032 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w9xzv" Dec 05 21:07:49 crc kubenswrapper[4747]: I1205 21:07:49.466115 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w9xzv" Dec 05 21:07:49 crc kubenswrapper[4747]: I1205 21:07:49.524692 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w9xzv"] Dec 05 21:07:51 crc kubenswrapper[4747]: I1205 21:07:51.412303 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w9xzv" podUID="ad540f7b-11dc-484c-af97-176be9046540" containerName="registry-server" containerID="cri-o://7bec91f5b3abb55830e96d1d3ea7a7bf8ed18b57d70ba11f5b46c49c4c8de532" gracePeriod=2 Dec 05 21:07:52 crc kubenswrapper[4747]: I1205 21:07:52.422270 4747 generic.go:334] "Generic (PLEG): container finished" podID="ad540f7b-11dc-484c-af97-176be9046540" containerID="7bec91f5b3abb55830e96d1d3ea7a7bf8ed18b57d70ba11f5b46c49c4c8de532" exitCode=0 Dec 05 21:07:52 crc kubenswrapper[4747]: I1205 21:07:52.422674 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w9xzv" event={"ID":"ad540f7b-11dc-484c-af97-176be9046540","Type":"ContainerDied","Data":"7bec91f5b3abb55830e96d1d3ea7a7bf8ed18b57d70ba11f5b46c49c4c8de532"} Dec 05 21:07:52 crc kubenswrapper[4747]: I1205 21:07:52.422747 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w9xzv" event={"ID":"ad540f7b-11dc-484c-af97-176be9046540","Type":"ContainerDied","Data":"b3451e1f75f1908a59c58426a43ce4d937c9d146242703d75397c45a182e35b4"} Dec 05 21:07:52 crc kubenswrapper[4747]: I1205 21:07:52.422764 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3451e1f75f1908a59c58426a43ce4d937c9d146242703d75397c45a182e35b4" Dec 05 21:07:52 crc kubenswrapper[4747]: I1205 21:07:52.450852 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w9xzv" Dec 05 21:07:52 crc kubenswrapper[4747]: I1205 21:07:52.609677 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad540f7b-11dc-484c-af97-176be9046540-utilities\") pod \"ad540f7b-11dc-484c-af97-176be9046540\" (UID: \"ad540f7b-11dc-484c-af97-176be9046540\") " Dec 05 21:07:52 crc kubenswrapper[4747]: I1205 21:07:52.609759 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddx85\" (UniqueName: \"kubernetes.io/projected/ad540f7b-11dc-484c-af97-176be9046540-kube-api-access-ddx85\") pod \"ad540f7b-11dc-484c-af97-176be9046540\" (UID: \"ad540f7b-11dc-484c-af97-176be9046540\") " Dec 05 21:07:52 crc kubenswrapper[4747]: I1205 21:07:52.610827 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad540f7b-11dc-484c-af97-176be9046540-catalog-content\") pod \"ad540f7b-11dc-484c-af97-176be9046540\" (UID: \"ad540f7b-11dc-484c-af97-176be9046540\") " Dec 05 21:07:52 crc kubenswrapper[4747]: I1205 21:07:52.611698 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad540f7b-11dc-484c-af97-176be9046540-utilities" (OuterVolumeSpecName: "utilities") pod "ad540f7b-11dc-484c-af97-176be9046540" (UID: "ad540f7b-11dc-484c-af97-176be9046540"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:07:52 crc kubenswrapper[4747]: I1205 21:07:52.618206 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad540f7b-11dc-484c-af97-176be9046540-kube-api-access-ddx85" (OuterVolumeSpecName: "kube-api-access-ddx85") pod "ad540f7b-11dc-484c-af97-176be9046540" (UID: "ad540f7b-11dc-484c-af97-176be9046540"). InnerVolumeSpecName "kube-api-access-ddx85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:07:52 crc kubenswrapper[4747]: I1205 21:07:52.710454 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad540f7b-11dc-484c-af97-176be9046540-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad540f7b-11dc-484c-af97-176be9046540" (UID: "ad540f7b-11dc-484c-af97-176be9046540"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:07:52 crc kubenswrapper[4747]: I1205 21:07:52.713096 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad540f7b-11dc-484c-af97-176be9046540-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:07:52 crc kubenswrapper[4747]: I1205 21:07:52.713240 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddx85\" (UniqueName: \"kubernetes.io/projected/ad540f7b-11dc-484c-af97-176be9046540-kube-api-access-ddx85\") on node \"crc\" DevicePath \"\"" Dec 05 21:07:52 crc kubenswrapper[4747]: I1205 21:07:52.713360 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad540f7b-11dc-484c-af97-176be9046540-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:07:53 crc kubenswrapper[4747]: I1205 21:07:53.431546 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w9xzv" Dec 05 21:07:53 crc kubenswrapper[4747]: I1205 21:07:53.494518 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w9xzv"] Dec 05 21:07:53 crc kubenswrapper[4747]: I1205 21:07:53.506156 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w9xzv"] Dec 05 21:07:53 crc kubenswrapper[4747]: I1205 21:07:53.851081 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad540f7b-11dc-484c-af97-176be9046540" path="/var/lib/kubelet/pods/ad540f7b-11dc-484c-af97-176be9046540/volumes" Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.222207 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.222781 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.222831 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.223461 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.223528 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1" gracePeriod=600 Dec 05 21:08:06 crc kubenswrapper[4747]: E1205 21:08:06.351354 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.563159 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1" exitCode=0 Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.563232 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"} Dec 05 
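The block above is one complete graceful deletion: "SyncLoop DELETE" from the API, "Killing container with a grace period" (gracePeriod=2), PLEG ContainerDied events, volume unmount/teardown, and finally "SyncLoop REMOVE" plus orphaned-volume cleanup once the API object is gone. Whether the 2s grace comes from the catalog pod's terminationGracePeriodSeconds or from the delete request is not visible in the log; a client-go sketch that would trigger this sequence with an explicit grace period (kubeconfig path, namespace, and pod name are placeholders):

    package main

    import (
        "context"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    // Issue a graceful pod delete; the kubelet then logs the DELETE,
    // kills containers within the grace period, tears down volumes,
    // and processes the REMOVE once the object leaves the API.
    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
        if err != nil {
            panic(err)
        }
        client, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        grace := int64(2) // seconds, matching gracePeriod=2 in the log
        err = client.CoreV1().Pods("openshift-marketplace").Delete(
            context.Background(),
            "community-operators-w9xzv",
            metav1.DeleteOptions{GracePeriodSeconds: &grace},
        )
        if err != nil {
            panic(err)
        }
    }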
Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.222207 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.222781 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.222831 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw"
Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.223461 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.223528 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1" gracePeriod=600
Dec 05 21:08:06 crc kubenswrapper[4747]: E1205 21:08:06.351354 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.563159 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1" exitCode=0
Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.563232 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"}
Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.563289 4747 scope.go:117] "RemoveContainer" containerID="27a8861ae720657d5c1aabed46d192906d0631fb9e46de40cae1199d706d1642"
Dec 05 21:08:06 crc kubenswrapper[4747]: I1205 21:08:06.564228 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:08:06 crc kubenswrapper[4747]: E1205 21:08:06.564724 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:08:19 crc kubenswrapper[4747]: I1205 21:08:19.850158 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:08:19 crc kubenswrapper[4747]: E1205 21:08:19.851952 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:08:25 crc kubenswrapper[4747]: I1205 21:08:25.777199 4747 scope.go:117] "RemoveContainer" containerID="579b667a2f6822473f1c5251f59efb929a78e15bbb88f0ffcdc18baa8dad0012"
Dec 05 21:08:25 crc kubenswrapper[4747]: I1205 21:08:25.833497 4747 scope.go:117] "RemoveContainer" containerID="0c9cfa29c3405f6a5527f6d5e246534de9a5f12a953fd035914d4286c56ce569"
Dec 05 21:08:25 crc kubenswrapper[4747]: I1205 21:08:25.860468 4747 scope.go:117] "RemoveContainer" containerID="79c9846b3460b979f45bb6dd58d4e45c8f431c03eb72863e97310a16cb1f51bc"
Dec 05 21:08:25 crc kubenswrapper[4747]: I1205 21:08:25.885897 4747 scope.go:117] "RemoveContainer" containerID="19016e2b01b2316bb76c719117800a4ec68aa6923aec37cad94c18d149837111"
Dec 05 21:08:25 crc kubenswrapper[4747]: I1205 21:08:25.909437 4747 scope.go:117] "RemoveContainer" containerID="48d16915b14bf258e2e2ea0934a538dc18c8aaa5a1ad5fd7f78335e7afa917f1"
Dec 05 21:08:25 crc kubenswrapper[4747]: I1205 21:08:25.936077 4747 scope.go:117] "RemoveContainer" containerID="c35bd7af758906f50a3ef95441de063d132d5f4fbd1e409678069723a5f290ad"
Dec 05 21:08:33 crc kubenswrapper[4747]: I1205 21:08:33.840789 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:08:33 crc kubenswrapper[4747]: E1205 21:08:33.841892 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
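From here on, the RemoveContainer/"Error syncing pod" pairs for machine-config-daemon repeat every 10-15 seconds. These are not restarts: each pod-worker sync attempts StartContainer and is rejected by the restart back-off gate, which is why every error quotes the same "back-off 5m0s". The kubelet's restart back-off doubles per crash up to a cap; a small sketch of that schedule (10s base and 5m cap are the long-standing kubelet defaults, assumed here rather than read from this node's config):

    package main

    import (
        "fmt"
        "time"
    )

    // Print the kubelet-style CrashLoopBackOff schedule: the restart
    // delay doubles after each crash and saturates at the cap, which
    // is the "back-off 5m0s" quoted in the errors above.
    func main() {
        delay, maxDelay := 10*time.Second, 5*time.Minute
        for i := 1; delay < maxDelay; i++ {
            fmt.Printf("restart %d: wait %v\n", i, delay)
            delay *= 2
        }
        fmt.Println("all later restarts: wait", maxDelay)
    }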
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:08:59 crc kubenswrapper[4747]: I1205 21:08:59.847713 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1" Dec 05 21:08:59 crc kubenswrapper[4747]: E1205 21:08:59.849146 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:09:14 crc kubenswrapper[4747]: I1205 21:09:14.840366 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1" Dec 05 21:09:14 crc kubenswrapper[4747]: E1205 21:09:14.841300 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:09:25 crc kubenswrapper[4747]: I1205 21:09:25.839708 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1" Dec 05 21:09:25 crc kubenswrapper[4747]: E1205 21:09:25.840654 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:09:26 crc kubenswrapper[4747]: I1205 21:09:26.043396 4747 scope.go:117] "RemoveContainer" containerID="2e6c7e067481881eb3c243ecc0d7be3e0fbce9cebb2acae99db2d255388c3986" Dec 05 21:09:26 crc kubenswrapper[4747]: I1205 21:09:26.084344 4747 scope.go:117] "RemoveContainer" containerID="1e4c9c1343b7df17852b0ace458ea0a391b18fabd8537189c0bbb26e329e1df8" Dec 05 21:09:26 crc kubenswrapper[4747]: I1205 21:09:26.101685 4747 scope.go:117] "RemoveContainer" containerID="021b4129bc60dba22de412a95d497fa278c1cabcb4adaba919eabf389ed7bba3" Dec 05 21:09:26 crc kubenswrapper[4747]: I1205 21:09:26.138407 4747 scope.go:117] "RemoveContainer" containerID="9ed4c66fc18065842866bdc3c5c5f0593f188b8cf62436b51215923e39dceb01" Dec 05 21:09:26 crc kubenswrapper[4747]: I1205 21:09:26.159881 4747 scope.go:117] "RemoveContainer" containerID="5cd4b38c507e83ae7d637b40662cb44e4fd8285a5ed0dc4193f6d009e999b136" Dec 05 21:09:26 crc kubenswrapper[4747]: I1205 21:09:26.210735 4747 scope.go:117] "RemoveContainer" containerID="2ff2feb07b826f2c716adfd6ef299090c2fb91ca6c9b21907acafcff5307d232" Dec 05 21:09:26 crc kubenswrapper[4747]: I1205 21:09:26.226802 4747 
scope.go:117] "RemoveContainer" containerID="4c1d5fe8c1b11df6b6aa3ce3695865857e2c986a0baadfd8120d77e27f5e70f6" Dec 05 21:09:26 crc kubenswrapper[4747]: I1205 21:09:26.242030 4747 scope.go:117] "RemoveContainer" containerID="da248c5f3b792f07ddf65be7ca9f5b394ff47e04718a63c6e5f4984caebb8672" Dec 05 21:09:26 crc kubenswrapper[4747]: I1205 21:09:26.273670 4747 scope.go:117] "RemoveContainer" containerID="90e5436f1599ff19ca17733065aa9fa10230ec3903a4c6e14e428cc0e5d2978a" Dec 05 21:09:26 crc kubenswrapper[4747]: I1205 21:09:26.299663 4747 scope.go:117] "RemoveContainer" containerID="4a0a187e3515674049916fea84c37dc12953c3a256a7c6a965d519c2997ec515" Dec 05 21:09:36 crc kubenswrapper[4747]: I1205 21:09:36.839942 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1" Dec 05 21:09:36 crc kubenswrapper[4747]: E1205 21:09:36.840705 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:09:48 crc kubenswrapper[4747]: I1205 21:09:48.839645 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1" Dec 05 21:09:48 crc kubenswrapper[4747]: E1205 21:09:48.840685 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:10:03 crc kubenswrapper[4747]: I1205 21:10:03.839947 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1" Dec 05 21:10:03 crc kubenswrapper[4747]: E1205 21:10:03.841025 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:10:15 crc kubenswrapper[4747]: I1205 21:10:15.840723 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1" Dec 05 21:10:15 crc kubenswrapper[4747]: E1205 21:10:15.841839 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:10:26 crc kubenswrapper[4747]: I1205 21:10:26.469794 4747 scope.go:117] "RemoveContainer" containerID="2be5a45020495919c8df1a96894165864e0b5d600e6abb5eb0ba83d8570f477b" Dec 05 21:10:26 crc 
Dec 05 21:10:26 crc kubenswrapper[4747]: I1205 21:10:26.546573 4747 scope.go:117] "RemoveContainer" containerID="f2f08789b47d9395ef4e7b2716ddbb8bc89f26728c733a0c9efe56c15ad7b799"
Dec 05 21:10:30 crc kubenswrapper[4747]: I1205 21:10:30.840129 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:10:30 crc kubenswrapper[4747]: E1205 21:10:30.840701 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:10:42 crc kubenswrapper[4747]: I1205 21:10:42.839279 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:10:42 crc kubenswrapper[4747]: E1205 21:10:42.840955 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:10:55 crc kubenswrapper[4747]: I1205 21:10:55.840146 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:10:55 crc kubenswrapper[4747]: E1205 21:10:55.841195 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:11:08 crc kubenswrapper[4747]: I1205 21:11:08.840541 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:11:08 crc kubenswrapper[4747]: E1205 21:11:08.841322 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:11:21 crc kubenswrapper[4747]: I1205 21:11:21.840295 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:11:21 crc kubenswrapper[4747]: E1205 21:11:21.841023 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:11:26 crc kubenswrapper[4747]: I1205 21:11:26.605110 4747 scope.go:117] "RemoveContainer" containerID="6359e7cae45a7f65faa12fd0c5d2077e88173e760b5b42b8279224e5571f8ba5"
Dec 05 21:11:26 crc kubenswrapper[4747]: I1205 21:11:26.636849 4747 scope.go:117] "RemoveContainer" containerID="bc88be3138ad806a1fad25c0703151fdaa867a84929b19340a4803a430b8c8b6"
Dec 05 21:11:26 crc kubenswrapper[4747]: I1205 21:11:26.658024 4747 scope.go:117] "RemoveContainer" containerID="3ffa54ae9df9dadb9e8a91bcd057746909c3a6d881ece4200e599099ef8599db"
Dec 05 21:11:26 crc kubenswrapper[4747]: I1205 21:11:26.685025 4747 scope.go:117] "RemoveContainer" containerID="f26116136a982eef2f31e0c94046144a04ef07ee2a4e9f404d513de905d257da"
Dec 05 21:11:35 crc kubenswrapper[4747]: I1205 21:11:35.840056 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:11:35 crc kubenswrapper[4747]: E1205 21:11:35.840958 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:11:47 crc kubenswrapper[4747]: I1205 21:11:47.842081 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:11:47 crc kubenswrapper[4747]: E1205 21:11:47.844323 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:11:58 crc kubenswrapper[4747]: I1205 21:11:58.839628 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:11:58 crc kubenswrapper[4747]: E1205 21:11:58.840512 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:12:09 crc kubenswrapper[4747]: I1205 21:12:09.849855 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:12:09 crc kubenswrapper[4747]: E1205 21:12:09.850893 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:12:21 crc kubenswrapper[4747]: I1205 21:12:21.839560 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:12:21 crc kubenswrapper[4747]: E1205 21:12:21.840728 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:12:32 crc kubenswrapper[4747]: I1205 21:12:32.840165 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:12:32 crc kubenswrapper[4747]: E1205 21:12:32.841305 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:12:45 crc kubenswrapper[4747]: I1205 21:12:45.840949 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:12:45 crc kubenswrapper[4747]: E1205 21:12:45.842211 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:13:00 crc kubenswrapper[4747]: I1205 21:13:00.840117 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:13:00 crc kubenswrapper[4747]: E1205 21:13:00.841406 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:13:15 crc kubenswrapper[4747]: I1205 21:13:15.839787 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1"
Dec 05 21:13:16 crc kubenswrapper[4747]: I1205 21:13:16.440649 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"aad20883fb9519014bf26b90acc7e97ef1f1dc743a4d4897b6ea932df56bd613"}
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.568314 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fs8f9"]
Dec 05 21:13:39 crc kubenswrapper[4747]: E1205 21:13:39.569653 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad540f7b-11dc-484c-af97-176be9046540" containerName="extract-content"
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.569684 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad540f7b-11dc-484c-af97-176be9046540" containerName="extract-content"
Dec 05 21:13:39 crc kubenswrapper[4747]: E1205 21:13:39.569726 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad540f7b-11dc-484c-af97-176be9046540" containerName="extract-utilities"
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.569749 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad540f7b-11dc-484c-af97-176be9046540" containerName="extract-utilities"
Dec 05 21:13:39 crc kubenswrapper[4747]: E1205 21:13:39.569787 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad540f7b-11dc-484c-af97-176be9046540" containerName="registry-server"
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.569804 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad540f7b-11dc-484c-af97-176be9046540" containerName="registry-server"
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.570153 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad540f7b-11dc-484c-af97-176be9046540" containerName="registry-server"
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.574972 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.589316 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fs8f9"]
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.671390 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8gjf\" (UniqueName: \"kubernetes.io/projected/028939fc-a512-441e-93df-d287c057745d-kube-api-access-j8gjf\") pod \"certified-operators-fs8f9\" (UID: \"028939fc-a512-441e-93df-d287c057745d\") " pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.671440 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028939fc-a512-441e-93df-d287c057745d-utilities\") pod \"certified-operators-fs8f9\" (UID: \"028939fc-a512-441e-93df-d287c057745d\") " pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.671547 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028939fc-a512-441e-93df-d287c057745d-catalog-content\") pod \"certified-operators-fs8f9\" (UID: \"028939fc-a512-441e-93df-d287c057745d\") " pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.772547 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8gjf\" (UniqueName: \"kubernetes.io/projected/028939fc-a512-441e-93df-d287c057745d-kube-api-access-j8gjf\") pod \"certified-operators-fs8f9\" (UID: \"028939fc-a512-441e-93df-d287c057745d\") " pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.772619 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028939fc-a512-441e-93df-d287c057745d-utilities\") pod \"certified-operators-fs8f9\" (UID: \"028939fc-a512-441e-93df-d287c057745d\") " pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.772689 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028939fc-a512-441e-93df-d287c057745d-catalog-content\") pod \"certified-operators-fs8f9\" (UID: \"028939fc-a512-441e-93df-d287c057745d\") " pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.773197 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028939fc-a512-441e-93df-d287c057745d-utilities\") pod \"certified-operators-fs8f9\" (UID: \"028939fc-a512-441e-93df-d287c057745d\") " pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.773226 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028939fc-a512-441e-93df-d287c057745d-catalog-content\") pod \"certified-operators-fs8f9\" (UID: \"028939fc-a512-441e-93df-d287c057745d\") " pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.801825 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8gjf\" (UniqueName: \"kubernetes.io/projected/028939fc-a512-441e-93df-d287c057745d-kube-api-access-j8gjf\") pod \"certified-operators-fs8f9\" (UID: \"028939fc-a512-441e-93df-d287c057745d\") " pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:39 crc kubenswrapper[4747]: I1205 21:13:39.914963 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:40 crc kubenswrapper[4747]: I1205 21:13:40.377497 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fs8f9"]
Dec 05 21:13:40 crc kubenswrapper[4747]: I1205 21:13:40.640469 4747 generic.go:334] "Generic (PLEG): container finished" podID="028939fc-a512-441e-93df-d287c057745d" containerID="c50ed416ed65db2d9ef4edd8f9f1b36e6764f04fbe17993552f7a84643fef7e6" exitCode=0
Dec 05 21:13:40 crc kubenswrapper[4747]: I1205 21:13:40.640574 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fs8f9" event={"ID":"028939fc-a512-441e-93df-d287c057745d","Type":"ContainerDied","Data":"c50ed416ed65db2d9ef4edd8f9f1b36e6764f04fbe17993552f7a84643fef7e6"}
Dec 05 21:13:40 crc kubenswrapper[4747]: I1205 21:13:40.640817 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fs8f9" event={"ID":"028939fc-a512-441e-93df-d287c057745d","Type":"ContainerStarted","Data":"cc313bf8011bb2e6a0bff20286d54a39d9a95bb576d69f879c0a593df79152a1"}
Dec 05 21:13:40 crc kubenswrapper[4747]: I1205 21:13:40.643013 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 21:13:41 crc kubenswrapper[4747]: I1205 21:13:41.652025 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fs8f9" event={"ID":"028939fc-a512-441e-93df-d287c057745d","Type":"ContainerStarted","Data":"b964ee6e212524b91788068e7ee1d42611e7445be6bf72abfb1f5f713609067c"}
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.150298 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pbk7v"]
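Each catalog pod in this log mounts the same volume trio: two emptyDirs ("utilities" and "catalog-content", populated by the extract-utilities/extract-content containers before registry-server runs) plus the auto-generated projected service-account token volume (kube-api-access-*). A hedged corev1 reconstruction of that volume set; only the names come from the log, the rest is illustrative and not the actual marketplace pod spec:

    package main

    import corev1 "k8s.io/api/core/v1"

    // catalogVolumes sketches the volumes seen in the mount entries
    // above: two emptyDirs shared between the extract-* containers
    // and registry-server. The kube-api-access-* projected volume is
    // injected by the API server (token + CA + namespace) and is
    // omitted here for brevity.
    func catalogVolumes() []corev1.Volume {
        return []corev1.Volume{
            {Name: "utilities", VolumeSource: corev1.VolumeSource{
                EmptyDir: &corev1.EmptyDirVolumeSource{}}},
            {Name: "catalog-content", VolumeSource: corev1.VolumeSource{
                EmptyDir: &corev1.EmptyDirVolumeSource{}}},
        }
    }

    func main() { _ = catalogVolumes() }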
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.151778 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbk7v"
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.187422 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbk7v"]
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.212454 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a5f379-bf43-4222-86d8-fe6517c6666c-catalog-content\") pod \"redhat-marketplace-pbk7v\" (UID: \"49a5f379-bf43-4222-86d8-fe6517c6666c\") " pod="openshift-marketplace/redhat-marketplace-pbk7v"
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.212551 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a5f379-bf43-4222-86d8-fe6517c6666c-utilities\") pod \"redhat-marketplace-pbk7v\" (UID: \"49a5f379-bf43-4222-86d8-fe6517c6666c\") " pod="openshift-marketplace/redhat-marketplace-pbk7v"
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.212610 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kp7x\" (UniqueName: \"kubernetes.io/projected/49a5f379-bf43-4222-86d8-fe6517c6666c-kube-api-access-9kp7x\") pod \"redhat-marketplace-pbk7v\" (UID: \"49a5f379-bf43-4222-86d8-fe6517c6666c\") " pod="openshift-marketplace/redhat-marketplace-pbk7v"
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.314287 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a5f379-bf43-4222-86d8-fe6517c6666c-catalog-content\") pod \"redhat-marketplace-pbk7v\" (UID: \"49a5f379-bf43-4222-86d8-fe6517c6666c\") " pod="openshift-marketplace/redhat-marketplace-pbk7v"
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.314371 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a5f379-bf43-4222-86d8-fe6517c6666c-utilities\") pod \"redhat-marketplace-pbk7v\" (UID: \"49a5f379-bf43-4222-86d8-fe6517c6666c\") " pod="openshift-marketplace/redhat-marketplace-pbk7v"
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.314408 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kp7x\" (UniqueName: \"kubernetes.io/projected/49a5f379-bf43-4222-86d8-fe6517c6666c-kube-api-access-9kp7x\") pod \"redhat-marketplace-pbk7v\" (UID: \"49a5f379-bf43-4222-86d8-fe6517c6666c\") " pod="openshift-marketplace/redhat-marketplace-pbk7v"
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.314827 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a5f379-bf43-4222-86d8-fe6517c6666c-catalog-content\") pod \"redhat-marketplace-pbk7v\" (UID: \"49a5f379-bf43-4222-86d8-fe6517c6666c\") " pod="openshift-marketplace/redhat-marketplace-pbk7v"
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.315096 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a5f379-bf43-4222-86d8-fe6517c6666c-utilities\") pod \"redhat-marketplace-pbk7v\" (UID: \"49a5f379-bf43-4222-86d8-fe6517c6666c\") " pod="openshift-marketplace/redhat-marketplace-pbk7v"
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.339698 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kp7x\" (UniqueName: \"kubernetes.io/projected/49a5f379-bf43-4222-86d8-fe6517c6666c-kube-api-access-9kp7x\") pod \"redhat-marketplace-pbk7v\" (UID: \"49a5f379-bf43-4222-86d8-fe6517c6666c\") " pod="openshift-marketplace/redhat-marketplace-pbk7v"
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.497363 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbk7v"
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.662699 4747 generic.go:334] "Generic (PLEG): container finished" podID="028939fc-a512-441e-93df-d287c057745d" containerID="b964ee6e212524b91788068e7ee1d42611e7445be6bf72abfb1f5f713609067c" exitCode=0
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.663805 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fs8f9" event={"ID":"028939fc-a512-441e-93df-d287c057745d","Type":"ContainerDied","Data":"b964ee6e212524b91788068e7ee1d42611e7445be6bf72abfb1f5f713609067c"}
Dec 05 21:13:42 crc kubenswrapper[4747]: I1205 21:13:42.708061 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbk7v"]
Dec 05 21:13:43 crc kubenswrapper[4747]: I1205 21:13:43.672057 4747 generic.go:334] "Generic (PLEG): container finished" podID="49a5f379-bf43-4222-86d8-fe6517c6666c" containerID="2bb940365b8d33fff7a3ccacbba286cdf8b580971d5863b2adc930e0bfbcc9cf" exitCode=0
Dec 05 21:13:43 crc kubenswrapper[4747]: I1205 21:13:43.672105 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbk7v" event={"ID":"49a5f379-bf43-4222-86d8-fe6517c6666c","Type":"ContainerDied","Data":"2bb940365b8d33fff7a3ccacbba286cdf8b580971d5863b2adc930e0bfbcc9cf"}
Dec 05 21:13:43 crc kubenswrapper[4747]: I1205 21:13:43.672450 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbk7v" event={"ID":"49a5f379-bf43-4222-86d8-fe6517c6666c","Type":"ContainerStarted","Data":"47232c8f11d75c1d47a6afb837ef617e6e88fc43b33006045ff30184fbeac63a"}
Dec 05 21:13:43 crc kubenswrapper[4747]: I1205 21:13:43.674870 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fs8f9" event={"ID":"028939fc-a512-441e-93df-d287c057745d","Type":"ContainerStarted","Data":"dd38f520513a3aa7736aeefc7c8363060ed1aa255d13387674427dfe5cb8ecbd"}
Dec 05 21:13:43 crc kubenswrapper[4747]: I1205 21:13:43.715831 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fs8f9" podStartSLOduration=2.283895722 podStartE2EDuration="4.715802974s" podCreationTimestamp="2025-12-05 21:13:39 +0000 UTC" firstStartedPulling="2025-12-05 21:13:40.642818948 +0000 UTC m=+1891.110126436" lastFinishedPulling="2025-12-05 21:13:43.0747262 +0000 UTC m=+1893.542033688" observedRunningTime="2025-12-05 21:13:43.70569589 +0000 UTC m=+1894.173003408" watchObservedRunningTime="2025-12-05 21:13:43.715802974 +0000 UTC m=+1894.183110492"
Dec 05 21:13:44 crc kubenswrapper[4747]: I1205 21:13:44.685098 4747 generic.go:334] "Generic (PLEG): container finished" podID="49a5f379-bf43-4222-86d8-fe6517c6666c" containerID="93fce572988e2f7fb563cffb722b8340cf38c2e7c93e07886d1cd70a75ab1bd5" exitCode=0
Dec 05 21:13:44 crc kubenswrapper[4747]: I1205 21:13:44.685154 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbk7v" event={"ID":"49a5f379-bf43-4222-86d8-fe6517c6666c","Type":"ContainerDied","Data":"93fce572988e2f7fb563cffb722b8340cf38c2e7c93e07886d1cd70a75ab1bd5"}
Dec 05 21:13:45 crc kubenswrapper[4747]: I1205 21:13:45.695628 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbk7v" event={"ID":"49a5f379-bf43-4222-86d8-fe6517c6666c","Type":"ContainerStarted","Data":"29131c10ee3c05d0774d15548702bbee62943333a1afe67fbc36413e86c47ea4"}
Dec 05 21:13:45 crc kubenswrapper[4747]: I1205 21:13:45.717539 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pbk7v" podStartSLOduration=2.246397767 podStartE2EDuration="3.717520479s" podCreationTimestamp="2025-12-05 21:13:42 +0000 UTC" firstStartedPulling="2025-12-05 21:13:43.673942331 +0000 UTC m=+1894.141249819" lastFinishedPulling="2025-12-05 21:13:45.145065043 +0000 UTC m=+1895.612372531" observedRunningTime="2025-12-05 21:13:45.715064528 +0000 UTC m=+1896.182372036" watchObservedRunningTime="2025-12-05 21:13:45.717520479 +0000 UTC m=+1896.184827987"
Dec 05 21:13:49 crc kubenswrapper[4747]: I1205 21:13:49.916344 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:49 crc kubenswrapper[4747]: I1205 21:13:49.916928 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:49 crc kubenswrapper[4747]: I1205 21:13:49.974392 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:50 crc kubenswrapper[4747]: I1205 21:13:50.794525 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:50 crc kubenswrapper[4747]: I1205 21:13:50.847326 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fs8f9"]
Dec 05 21:13:52 crc kubenswrapper[4747]: I1205 21:13:52.498309 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pbk7v"
Dec 05 21:13:52 crc kubenswrapper[4747]: I1205 21:13:52.498938 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pbk7v"
Dec 05 21:13:52 crc kubenswrapper[4747]: I1205 21:13:52.537715 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pbk7v"
Dec 05 21:13:52 crc kubenswrapper[4747]: I1205 21:13:52.764431 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fs8f9" podUID="028939fc-a512-441e-93df-d287c057745d" containerName="registry-server" containerID="cri-o://dd38f520513a3aa7736aeefc7c8363060ed1aa255d13387674427dfe5cb8ecbd" gracePeriod=2
Dec 05 21:13:52 crc kubenswrapper[4747]: I1205 21:13:52.820181 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pbk7v"
Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.615281 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbk7v"]
Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.694271 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.776599 4747 generic.go:334] "Generic (PLEG): container finished" podID="028939fc-a512-441e-93df-d287c057745d" containerID="dd38f520513a3aa7736aeefc7c8363060ed1aa255d13387674427dfe5cb8ecbd" exitCode=0
Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.776719 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fs8f9"
Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.776700 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fs8f9" event={"ID":"028939fc-a512-441e-93df-d287c057745d","Type":"ContainerDied","Data":"dd38f520513a3aa7736aeefc7c8363060ed1aa255d13387674427dfe5cb8ecbd"}
Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.777273 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fs8f9" event={"ID":"028939fc-a512-441e-93df-d287c057745d","Type":"ContainerDied","Data":"cc313bf8011bb2e6a0bff20286d54a39d9a95bb576d69f879c0a593df79152a1"}
Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.777330 4747 scope.go:117] "RemoveContainer" containerID="dd38f520513a3aa7736aeefc7c8363060ed1aa255d13387674427dfe5cb8ecbd"
Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.797387 4747 scope.go:117] "RemoveContainer" containerID="b964ee6e212524b91788068e7ee1d42611e7445be6bf72abfb1f5f713609067c"
Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.816025 4747 scope.go:117] "RemoveContainer" containerID="c50ed416ed65db2d9ef4edd8f9f1b36e6764f04fbe17993552f7a84643fef7e6"
Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.842689 4747 scope.go:117] "RemoveContainer" containerID="dd38f520513a3aa7736aeefc7c8363060ed1aa255d13387674427dfe5cb8ecbd"
Dec 05 21:13:53 crc kubenswrapper[4747]: E1205 21:13:53.843367 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd38f520513a3aa7736aeefc7c8363060ed1aa255d13387674427dfe5cb8ecbd\": container with ID starting with dd38f520513a3aa7736aeefc7c8363060ed1aa255d13387674427dfe5cb8ecbd not found: ID does not exist" containerID="dd38f520513a3aa7736aeefc7c8363060ed1aa255d13387674427dfe5cb8ecbd"
Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.843410 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd38f520513a3aa7736aeefc7c8363060ed1aa255d13387674427dfe5cb8ecbd"} err="failed to get container status \"dd38f520513a3aa7736aeefc7c8363060ed1aa255d13387674427dfe5cb8ecbd\": rpc error: code = NotFound desc = could not find container \"dd38f520513a3aa7736aeefc7c8363060ed1aa255d13387674427dfe5cb8ecbd\": container with ID starting with dd38f520513a3aa7736aeefc7c8363060ed1aa255d13387674427dfe5cb8ecbd not found: ID does not exist"
Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.843434 4747 scope.go:117] "RemoveContainer" containerID="b964ee6e212524b91788068e7ee1d42611e7445be6bf72abfb1f5f713609067c"
Dec 05 21:13:53 crc kubenswrapper[4747]: E1205 21:13:53.843762 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b964ee6e212524b91788068e7ee1d42611e7445be6bf72abfb1f5f713609067c\": container with ID starting with b964ee6e212524b91788068e7ee1d42611e7445be6bf72abfb1f5f713609067c not found: ID does not exist" containerID="b964ee6e212524b91788068e7ee1d42611e7445be6bf72abfb1f5f713609067c"
Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.843805 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b964ee6e212524b91788068e7ee1d42611e7445be6bf72abfb1f5f713609067c"} err="failed to get container status \"b964ee6e212524b91788068e7ee1d42611e7445be6bf72abfb1f5f713609067c\": rpc error: code = NotFound desc = could not find container \"b964ee6e212524b91788068e7ee1d42611e7445be6bf72abfb1f5f713609067c\": container with ID starting with b964ee6e212524b91788068e7ee1d42611e7445be6bf72abfb1f5f713609067c not found: ID does not exist"
Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.843839 4747 scope.go:117] "RemoveContainer" containerID="c50ed416ed65db2d9ef4edd8f9f1b36e6764f04fbe17993552f7a84643fef7e6"
Dec 05 21:13:53 crc kubenswrapper[4747]: E1205 21:13:53.844083 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c50ed416ed65db2d9ef4edd8f9f1b36e6764f04fbe17993552f7a84643fef7e6\": container with ID starting with c50ed416ed65db2d9ef4edd8f9f1b36e6764f04fbe17993552f7a84643fef7e6 not found: ID does not exist" containerID="c50ed416ed65db2d9ef4edd8f9f1b36e6764f04fbe17993552f7a84643fef7e6"
Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.844113 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c50ed416ed65db2d9ef4edd8f9f1b36e6764f04fbe17993552f7a84643fef7e6"} err="failed to get container status \"c50ed416ed65db2d9ef4edd8f9f1b36e6764f04fbe17993552f7a84643fef7e6\": rpc error: code = NotFound desc = could not find container \"c50ed416ed65db2d9ef4edd8f9f1b36e6764f04fbe17993552f7a84643fef7e6\": container with ID starting with c50ed416ed65db2d9ef4edd8f9f1b36e6764f04fbe17993552f7a84643fef7e6 not found: ID does not exist"
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.894946 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/028939fc-a512-441e-93df-d287c057745d-kube-api-access-j8gjf" (OuterVolumeSpecName: "kube-api-access-j8gjf") pod "028939fc-a512-441e-93df-d287c057745d" (UID: "028939fc-a512-441e-93df-d287c057745d"). InnerVolumeSpecName "kube-api-access-j8gjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.960126 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/028939fc-a512-441e-93df-d287c057745d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "028939fc-a512-441e-93df-d287c057745d" (UID: "028939fc-a512-441e-93df-d287c057745d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.990806 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028939fc-a512-441e-93df-d287c057745d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.990842 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8gjf\" (UniqueName: \"kubernetes.io/projected/028939fc-a512-441e-93df-d287c057745d-kube-api-access-j8gjf\") on node \"crc\" DevicePath \"\"" Dec 05 21:13:53 crc kubenswrapper[4747]: I1205 21:13:53.990853 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028939fc-a512-441e-93df-d287c057745d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:13:54 crc kubenswrapper[4747]: I1205 21:13:54.115712 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fs8f9"] Dec 05 21:13:54 crc kubenswrapper[4747]: I1205 21:13:54.123113 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fs8f9"] Dec 05 21:13:54 crc kubenswrapper[4747]: I1205 21:13:54.791479 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pbk7v" podUID="49a5f379-bf43-4222-86d8-fe6517c6666c" containerName="registry-server" containerID="cri-o://29131c10ee3c05d0774d15548702bbee62943333a1afe67fbc36413e86c47ea4" gracePeriod=2 Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.165626 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbk7v" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.214176 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a5f379-bf43-4222-86d8-fe6517c6666c-catalog-content\") pod \"49a5f379-bf43-4222-86d8-fe6517c6666c\" (UID: \"49a5f379-bf43-4222-86d8-fe6517c6666c\") " Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.214796 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kp7x\" (UniqueName: \"kubernetes.io/projected/49a5f379-bf43-4222-86d8-fe6517c6666c-kube-api-access-9kp7x\") pod \"49a5f379-bf43-4222-86d8-fe6517c6666c\" (UID: \"49a5f379-bf43-4222-86d8-fe6517c6666c\") " Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.215498 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a5f379-bf43-4222-86d8-fe6517c6666c-utilities\") pod \"49a5f379-bf43-4222-86d8-fe6517c6666c\" (UID: \"49a5f379-bf43-4222-86d8-fe6517c6666c\") " Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.217345 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a5f379-bf43-4222-86d8-fe6517c6666c-utilities" (OuterVolumeSpecName: "utilities") pod "49a5f379-bf43-4222-86d8-fe6517c6666c" (UID: "49a5f379-bf43-4222-86d8-fe6517c6666c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.220904 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a5f379-bf43-4222-86d8-fe6517c6666c-kube-api-access-9kp7x" (OuterVolumeSpecName: "kube-api-access-9kp7x") pod "49a5f379-bf43-4222-86d8-fe6517c6666c" (UID: "49a5f379-bf43-4222-86d8-fe6517c6666c"). InnerVolumeSpecName "kube-api-access-9kp7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.237469 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a5f379-bf43-4222-86d8-fe6517c6666c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49a5f379-bf43-4222-86d8-fe6517c6666c" (UID: "49a5f379-bf43-4222-86d8-fe6517c6666c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.317131 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kp7x\" (UniqueName: \"kubernetes.io/projected/49a5f379-bf43-4222-86d8-fe6517c6666c-kube-api-access-9kp7x\") on node \"crc\" DevicePath \"\"" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.317159 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a5f379-bf43-4222-86d8-fe6517c6666c-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.317170 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a5f379-bf43-4222-86d8-fe6517c6666c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.804060 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbk7v" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.804076 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbk7v" event={"ID":"49a5f379-bf43-4222-86d8-fe6517c6666c","Type":"ContainerDied","Data":"29131c10ee3c05d0774d15548702bbee62943333a1afe67fbc36413e86c47ea4"} Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.806016 4747 scope.go:117] "RemoveContainer" containerID="29131c10ee3c05d0774d15548702bbee62943333a1afe67fbc36413e86c47ea4" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.803917 4747 generic.go:334] "Generic (PLEG): container finished" podID="49a5f379-bf43-4222-86d8-fe6517c6666c" containerID="29131c10ee3c05d0774d15548702bbee62943333a1afe67fbc36413e86c47ea4" exitCode=0 Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.807991 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbk7v" event={"ID":"49a5f379-bf43-4222-86d8-fe6517c6666c","Type":"ContainerDied","Data":"47232c8f11d75c1d47a6afb837ef617e6e88fc43b33006045ff30184fbeac63a"} Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.833867 4747 scope.go:117] "RemoveContainer" containerID="93fce572988e2f7fb563cffb722b8340cf38c2e7c93e07886d1cd70a75ab1bd5" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.851188 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="028939fc-a512-441e-93df-d287c057745d" path="/var/lib/kubelet/pods/028939fc-a512-441e-93df-d287c057745d/volumes" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.865282 4747 scope.go:117] "RemoveContainer" containerID="2bb940365b8d33fff7a3ccacbba286cdf8b580971d5863b2adc930e0bfbcc9cf" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.871823 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbk7v"] Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.879626 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbk7v"] Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.922877 4747 scope.go:117] "RemoveContainer" containerID="29131c10ee3c05d0774d15548702bbee62943333a1afe67fbc36413e86c47ea4" Dec 05 21:13:55 crc kubenswrapper[4747]: E1205 21:13:55.923275 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29131c10ee3c05d0774d15548702bbee62943333a1afe67fbc36413e86c47ea4\": container with ID starting with 29131c10ee3c05d0774d15548702bbee62943333a1afe67fbc36413e86c47ea4 not found: ID does not exist" containerID="29131c10ee3c05d0774d15548702bbee62943333a1afe67fbc36413e86c47ea4" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.923328 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29131c10ee3c05d0774d15548702bbee62943333a1afe67fbc36413e86c47ea4"} err="failed to get container status \"29131c10ee3c05d0774d15548702bbee62943333a1afe67fbc36413e86c47ea4\": rpc error: code = NotFound desc = could not find container \"29131c10ee3c05d0774d15548702bbee62943333a1afe67fbc36413e86c47ea4\": container with ID starting with 29131c10ee3c05d0774d15548702bbee62943333a1afe67fbc36413e86c47ea4 not found: ID does not exist" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.923352 4747 scope.go:117] "RemoveContainer" containerID="93fce572988e2f7fb563cffb722b8340cf38c2e7c93e07886d1cd70a75ab1bd5" Dec 05 21:13:55 crc 
kubenswrapper[4747]: E1205 21:13:55.923810 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93fce572988e2f7fb563cffb722b8340cf38c2e7c93e07886d1cd70a75ab1bd5\": container with ID starting with 93fce572988e2f7fb563cffb722b8340cf38c2e7c93e07886d1cd70a75ab1bd5 not found: ID does not exist" containerID="93fce572988e2f7fb563cffb722b8340cf38c2e7c93e07886d1cd70a75ab1bd5" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.923843 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93fce572988e2f7fb563cffb722b8340cf38c2e7c93e07886d1cd70a75ab1bd5"} err="failed to get container status \"93fce572988e2f7fb563cffb722b8340cf38c2e7c93e07886d1cd70a75ab1bd5\": rpc error: code = NotFound desc = could not find container \"93fce572988e2f7fb563cffb722b8340cf38c2e7c93e07886d1cd70a75ab1bd5\": container with ID starting with 93fce572988e2f7fb563cffb722b8340cf38c2e7c93e07886d1cd70a75ab1bd5 not found: ID does not exist" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.923860 4747 scope.go:117] "RemoveContainer" containerID="2bb940365b8d33fff7a3ccacbba286cdf8b580971d5863b2adc930e0bfbcc9cf" Dec 05 21:13:55 crc kubenswrapper[4747]: E1205 21:13:55.924208 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bb940365b8d33fff7a3ccacbba286cdf8b580971d5863b2adc930e0bfbcc9cf\": container with ID starting with 2bb940365b8d33fff7a3ccacbba286cdf8b580971d5863b2adc930e0bfbcc9cf not found: ID does not exist" containerID="2bb940365b8d33fff7a3ccacbba286cdf8b580971d5863b2adc930e0bfbcc9cf" Dec 05 21:13:55 crc kubenswrapper[4747]: I1205 21:13:55.924237 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bb940365b8d33fff7a3ccacbba286cdf8b580971d5863b2adc930e0bfbcc9cf"} err="failed to get container status \"2bb940365b8d33fff7a3ccacbba286cdf8b580971d5863b2adc930e0bfbcc9cf\": rpc error: code = NotFound desc = could not find container \"2bb940365b8d33fff7a3ccacbba286cdf8b580971d5863b2adc930e0bfbcc9cf\": container with ID starting with 2bb940365b8d33fff7a3ccacbba286cdf8b580971d5863b2adc930e0bfbcc9cf not found: ID does not exist" Dec 05 21:13:57 crc kubenswrapper[4747]: I1205 21:13:57.851335 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a5f379-bf43-4222-86d8-fe6517c6666c" path="/var/lib/kubelet/pods/49a5f379-bf43-4222-86d8-fe6517c6666c/volumes" Dec 05 21:14:26 crc kubenswrapper[4747]: I1205 21:14:26.807888 4747 scope.go:117] "RemoveContainer" containerID="ee1ab4798a7083d057cb05ddd6ed4c7d76f0ddbaff58c30fe9b77bb32e0e185f" Dec 05 21:14:26 crc kubenswrapper[4747]: I1205 21:14:26.837516 4747 scope.go:117] "RemoveContainer" containerID="11a880306f5503b5ee78907947a91dd47d41aaf2b44c710e06905f15f1c25de4" Dec 05 21:14:26 crc kubenswrapper[4747]: I1205 21:14:26.868372 4747 scope.go:117] "RemoveContainer" containerID="7bec91f5b3abb55830e96d1d3ea7a7bf8ed18b57d70ba11f5b46c49c4c8de532" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.163473 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8"] Dec 05 21:15:00 crc kubenswrapper[4747]: E1205 21:15:00.166395 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028939fc-a512-441e-93df-d287c057745d" containerName="extract-content" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.166432 4747 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="028939fc-a512-441e-93df-d287c057745d" containerName="extract-content" Dec 05 21:15:00 crc kubenswrapper[4747]: E1205 21:15:00.166460 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028939fc-a512-441e-93df-d287c057745d" containerName="extract-utilities" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.166472 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="028939fc-a512-441e-93df-d287c057745d" containerName="extract-utilities" Dec 05 21:15:00 crc kubenswrapper[4747]: E1205 21:15:00.166513 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028939fc-a512-441e-93df-d287c057745d" containerName="registry-server" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.166524 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="028939fc-a512-441e-93df-d287c057745d" containerName="registry-server" Dec 05 21:15:00 crc kubenswrapper[4747]: E1205 21:15:00.166542 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a5f379-bf43-4222-86d8-fe6517c6666c" containerName="extract-content" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.166552 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a5f379-bf43-4222-86d8-fe6517c6666c" containerName="extract-content" Dec 05 21:15:00 crc kubenswrapper[4747]: E1205 21:15:00.166564 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a5f379-bf43-4222-86d8-fe6517c6666c" containerName="registry-server" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.166573 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a5f379-bf43-4222-86d8-fe6517c6666c" containerName="registry-server" Dec 05 21:15:00 crc kubenswrapper[4747]: E1205 21:15:00.166613 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a5f379-bf43-4222-86d8-fe6517c6666c" containerName="extract-utilities" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.166624 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a5f379-bf43-4222-86d8-fe6517c6666c" containerName="extract-utilities" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.166850 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="028939fc-a512-441e-93df-d287c057745d" containerName="registry-server" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.166874 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a5f379-bf43-4222-86d8-fe6517c6666c" containerName="registry-server" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.167854 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.170016 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.170211 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.172662 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8"] Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.284320 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v79l\" (UniqueName: \"kubernetes.io/projected/1dbe0335-bf8b-42a9-9bbc-271a8b384552-kube-api-access-4v79l\") pod \"collect-profiles-29416155-j69p8\" (UID: \"1dbe0335-bf8b-42a9-9bbc-271a8b384552\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.284427 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dbe0335-bf8b-42a9-9bbc-271a8b384552-config-volume\") pod \"collect-profiles-29416155-j69p8\" (UID: \"1dbe0335-bf8b-42a9-9bbc-271a8b384552\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.284464 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dbe0335-bf8b-42a9-9bbc-271a8b384552-secret-volume\") pod \"collect-profiles-29416155-j69p8\" (UID: \"1dbe0335-bf8b-42a9-9bbc-271a8b384552\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.386239 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v79l\" (UniqueName: \"kubernetes.io/projected/1dbe0335-bf8b-42a9-9bbc-271a8b384552-kube-api-access-4v79l\") pod \"collect-profiles-29416155-j69p8\" (UID: \"1dbe0335-bf8b-42a9-9bbc-271a8b384552\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.386288 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dbe0335-bf8b-42a9-9bbc-271a8b384552-config-volume\") pod \"collect-profiles-29416155-j69p8\" (UID: \"1dbe0335-bf8b-42a9-9bbc-271a8b384552\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.386307 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dbe0335-bf8b-42a9-9bbc-271a8b384552-secret-volume\") pod \"collect-profiles-29416155-j69p8\" (UID: \"1dbe0335-bf8b-42a9-9bbc-271a8b384552\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.387254 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dbe0335-bf8b-42a9-9bbc-271a8b384552-config-volume\") pod 
\"collect-profiles-29416155-j69p8\" (UID: \"1dbe0335-bf8b-42a9-9bbc-271a8b384552\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.396157 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dbe0335-bf8b-42a9-9bbc-271a8b384552-secret-volume\") pod \"collect-profiles-29416155-j69p8\" (UID: \"1dbe0335-bf8b-42a9-9bbc-271a8b384552\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.407842 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v79l\" (UniqueName: \"kubernetes.io/projected/1dbe0335-bf8b-42a9-9bbc-271a8b384552-kube-api-access-4v79l\") pod \"collect-profiles-29416155-j69p8\" (UID: \"1dbe0335-bf8b-42a9-9bbc-271a8b384552\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.495977 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" Dec 05 21:15:00 crc kubenswrapper[4747]: I1205 21:15:00.980248 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8"] Dec 05 21:15:01 crc kubenswrapper[4747]: I1205 21:15:01.389607 4747 generic.go:334] "Generic (PLEG): container finished" podID="1dbe0335-bf8b-42a9-9bbc-271a8b384552" containerID="a2ce5a2160284096c01d20b843db02a5d0abc4c2245f0a93699eb8e2926e7ca6" exitCode=0 Dec 05 21:15:01 crc kubenswrapper[4747]: I1205 21:15:01.389670 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" event={"ID":"1dbe0335-bf8b-42a9-9bbc-271a8b384552","Type":"ContainerDied","Data":"a2ce5a2160284096c01d20b843db02a5d0abc4c2245f0a93699eb8e2926e7ca6"} Dec 05 21:15:01 crc kubenswrapper[4747]: I1205 21:15:01.389963 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" event={"ID":"1dbe0335-bf8b-42a9-9bbc-271a8b384552","Type":"ContainerStarted","Data":"33e6d6114479911d61203599197097e6f73c440670fdd8f918a9ee241af062db"} Dec 05 21:15:02 crc kubenswrapper[4747]: I1205 21:15:02.722235 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" Dec 05 21:15:02 crc kubenswrapper[4747]: I1205 21:15:02.822941 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v79l\" (UniqueName: \"kubernetes.io/projected/1dbe0335-bf8b-42a9-9bbc-271a8b384552-kube-api-access-4v79l\") pod \"1dbe0335-bf8b-42a9-9bbc-271a8b384552\" (UID: \"1dbe0335-bf8b-42a9-9bbc-271a8b384552\") " Dec 05 21:15:02 crc kubenswrapper[4747]: I1205 21:15:02.823036 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dbe0335-bf8b-42a9-9bbc-271a8b384552-config-volume\") pod \"1dbe0335-bf8b-42a9-9bbc-271a8b384552\" (UID: \"1dbe0335-bf8b-42a9-9bbc-271a8b384552\") " Dec 05 21:15:02 crc kubenswrapper[4747]: I1205 21:15:02.823074 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dbe0335-bf8b-42a9-9bbc-271a8b384552-secret-volume\") pod \"1dbe0335-bf8b-42a9-9bbc-271a8b384552\" (UID: \"1dbe0335-bf8b-42a9-9bbc-271a8b384552\") " Dec 05 21:15:02 crc kubenswrapper[4747]: I1205 21:15:02.823874 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dbe0335-bf8b-42a9-9bbc-271a8b384552-config-volume" (OuterVolumeSpecName: "config-volume") pod "1dbe0335-bf8b-42a9-9bbc-271a8b384552" (UID: "1dbe0335-bf8b-42a9-9bbc-271a8b384552"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 21:15:02 crc kubenswrapper[4747]: I1205 21:15:02.827627 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dbe0335-bf8b-42a9-9bbc-271a8b384552-kube-api-access-4v79l" (OuterVolumeSpecName: "kube-api-access-4v79l") pod "1dbe0335-bf8b-42a9-9bbc-271a8b384552" (UID: "1dbe0335-bf8b-42a9-9bbc-271a8b384552"). InnerVolumeSpecName "kube-api-access-4v79l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:15:02 crc kubenswrapper[4747]: I1205 21:15:02.830417 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dbe0335-bf8b-42a9-9bbc-271a8b384552-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1dbe0335-bf8b-42a9-9bbc-271a8b384552" (UID: "1dbe0335-bf8b-42a9-9bbc-271a8b384552"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 21:15:02 crc kubenswrapper[4747]: I1205 21:15:02.924337 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v79l\" (UniqueName: \"kubernetes.io/projected/1dbe0335-bf8b-42a9-9bbc-271a8b384552-kube-api-access-4v79l\") on node \"crc\" DevicePath \"\"" Dec 05 21:15:02 crc kubenswrapper[4747]: I1205 21:15:02.924386 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dbe0335-bf8b-42a9-9bbc-271a8b384552-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:15:02 crc kubenswrapper[4747]: I1205 21:15:02.924426 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dbe0335-bf8b-42a9-9bbc-271a8b384552-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 21:15:03 crc kubenswrapper[4747]: I1205 21:15:03.406290 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" event={"ID":"1dbe0335-bf8b-42a9-9bbc-271a8b384552","Type":"ContainerDied","Data":"33e6d6114479911d61203599197097e6f73c440670fdd8f918a9ee241af062db"} Dec 05 21:15:03 crc kubenswrapper[4747]: I1205 21:15:03.406627 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33e6d6114479911d61203599197097e6f73c440670fdd8f918a9ee241af062db" Dec 05 21:15:03 crc kubenswrapper[4747]: I1205 21:15:03.406355 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8" Dec 05 21:15:03 crc kubenswrapper[4747]: I1205 21:15:03.811858 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr"] Dec 05 21:15:03 crc kubenswrapper[4747]: I1205 21:15:03.817496 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-l4svr"] Dec 05 21:15:03 crc kubenswrapper[4747]: I1205 21:15:03.866095 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="283b35ea-91c4-4185-a639-9a8a1a80aaa7" path="/var/lib/kubelet/pods/283b35ea-91c4-4185-a639-9a8a1a80aaa7/volumes" Dec 05 21:15:26 crc kubenswrapper[4747]: I1205 21:15:26.960622 4747 scope.go:117] "RemoveContainer" containerID="1856838e2e010e8e7a549fc5049197714e46ca623a2063c93f8fb7229536ea4e" Dec 05 21:15:36 crc kubenswrapper[4747]: I1205 21:15:36.222175 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:15:36 crc kubenswrapper[4747]: I1205 21:15:36.223779 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:16:06 crc kubenswrapper[4747]: I1205 21:16:06.222572 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 05 21:16:06 crc kubenswrapper[4747]: I1205 21:16:06.223208 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:16:36 crc kubenswrapper[4747]: I1205 21:16:36.222557 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:16:36 crc kubenswrapper[4747]: I1205 21:16:36.223229 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:16:36 crc kubenswrapper[4747]: I1205 21:16:36.223286 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 21:16:36 crc kubenswrapper[4747]: I1205 21:16:36.223942 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aad20883fb9519014bf26b90acc7e97ef1f1dc743a4d4897b6ea932df56bd613"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 21:16:36 crc kubenswrapper[4747]: I1205 21:16:36.223997 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://aad20883fb9519014bf26b90acc7e97ef1f1dc743a4d4897b6ea932df56bd613" gracePeriod=600 Dec 05 21:16:37 crc kubenswrapper[4747]: I1205 21:16:37.258425 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="aad20883fb9519014bf26b90acc7e97ef1f1dc743a4d4897b6ea932df56bd613" exitCode=0 Dec 05 21:16:37 crc kubenswrapper[4747]: I1205 21:16:37.258461 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"aad20883fb9519014bf26b90acc7e97ef1f1dc743a4d4897b6ea932df56bd613"} Dec 05 21:16:37 crc kubenswrapper[4747]: I1205 21:16:37.259490 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42"} Dec 05 21:16:37 crc kubenswrapper[4747]: I1205 21:16:37.259517 4747 scope.go:117] "RemoveContainer" containerID="85e654861db031ed97f437f4fd7fbfcf9323505eb7e06f0c537752450cafc9c1" Dec 05 21:16:46 crc kubenswrapper[4747]: I1205 21:16:46.470335 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rlwjp"] Dec 05 21:16:46 crc kubenswrapper[4747]: E1205 21:16:46.471290 4747 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1dbe0335-bf8b-42a9-9bbc-271a8b384552" containerName="collect-profiles" Dec 05 21:16:46 crc kubenswrapper[4747]: I1205 21:16:46.471308 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dbe0335-bf8b-42a9-9bbc-271a8b384552" containerName="collect-profiles" Dec 05 21:16:46 crc kubenswrapper[4747]: I1205 21:16:46.471480 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dbe0335-bf8b-42a9-9bbc-271a8b384552" containerName="collect-profiles" Dec 05 21:16:46 crc kubenswrapper[4747]: I1205 21:16:46.473034 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:16:46 crc kubenswrapper[4747]: I1205 21:16:46.483606 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rlwjp"] Dec 05 21:16:46 crc kubenswrapper[4747]: I1205 21:16:46.575038 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mckrv\" (UniqueName: \"kubernetes.io/projected/5f143af5-3dee-4533-807a-76ce59612587-kube-api-access-mckrv\") pod \"redhat-operators-rlwjp\" (UID: \"5f143af5-3dee-4533-807a-76ce59612587\") " pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:16:46 crc kubenswrapper[4747]: I1205 21:16:46.575388 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f143af5-3dee-4533-807a-76ce59612587-catalog-content\") pod \"redhat-operators-rlwjp\" (UID: \"5f143af5-3dee-4533-807a-76ce59612587\") " pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:16:46 crc kubenswrapper[4747]: I1205 21:16:46.575600 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f143af5-3dee-4533-807a-76ce59612587-utilities\") pod \"redhat-operators-rlwjp\" (UID: \"5f143af5-3dee-4533-807a-76ce59612587\") " pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:16:46 crc kubenswrapper[4747]: I1205 21:16:46.677238 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mckrv\" (UniqueName: \"kubernetes.io/projected/5f143af5-3dee-4533-807a-76ce59612587-kube-api-access-mckrv\") pod \"redhat-operators-rlwjp\" (UID: \"5f143af5-3dee-4533-807a-76ce59612587\") " pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:16:46 crc kubenswrapper[4747]: I1205 21:16:46.677289 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f143af5-3dee-4533-807a-76ce59612587-catalog-content\") pod \"redhat-operators-rlwjp\" (UID: \"5f143af5-3dee-4533-807a-76ce59612587\") " pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:16:46 crc kubenswrapper[4747]: I1205 21:16:46.677316 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f143af5-3dee-4533-807a-76ce59612587-utilities\") pod \"redhat-operators-rlwjp\" (UID: \"5f143af5-3dee-4533-807a-76ce59612587\") " pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:16:46 crc kubenswrapper[4747]: I1205 21:16:46.677879 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f143af5-3dee-4533-807a-76ce59612587-utilities\") pod \"redhat-operators-rlwjp\" (UID: 
\"5f143af5-3dee-4533-807a-76ce59612587\") " pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:16:46 crc kubenswrapper[4747]: I1205 21:16:46.677970 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f143af5-3dee-4533-807a-76ce59612587-catalog-content\") pod \"redhat-operators-rlwjp\" (UID: \"5f143af5-3dee-4533-807a-76ce59612587\") " pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:16:46 crc kubenswrapper[4747]: I1205 21:16:46.706348 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mckrv\" (UniqueName: \"kubernetes.io/projected/5f143af5-3dee-4533-807a-76ce59612587-kube-api-access-mckrv\") pod \"redhat-operators-rlwjp\" (UID: \"5f143af5-3dee-4533-807a-76ce59612587\") " pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:16:46 crc kubenswrapper[4747]: I1205 21:16:46.796222 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:16:47 crc kubenswrapper[4747]: I1205 21:16:47.197208 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rlwjp"] Dec 05 21:16:47 crc kubenswrapper[4747]: W1205 21:16:47.202974 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f143af5_3dee_4533_807a_76ce59612587.slice/crio-dbf2966b2ce1a5d83c7301914a3fe20d5dc12ad775d7084ccd54209aba3d2f63 WatchSource:0}: Error finding container dbf2966b2ce1a5d83c7301914a3fe20d5dc12ad775d7084ccd54209aba3d2f63: Status 404 returned error can't find the container with id dbf2966b2ce1a5d83c7301914a3fe20d5dc12ad775d7084ccd54209aba3d2f63 Dec 05 21:16:47 crc kubenswrapper[4747]: I1205 21:16:47.342402 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlwjp" event={"ID":"5f143af5-3dee-4533-807a-76ce59612587","Type":"ContainerStarted","Data":"dbf2966b2ce1a5d83c7301914a3fe20d5dc12ad775d7084ccd54209aba3d2f63"} Dec 05 21:16:48 crc kubenswrapper[4747]: I1205 21:16:48.353659 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f143af5-3dee-4533-807a-76ce59612587" containerID="2337920fc34dd65fd0db17f53a7095a1d40b55c50e9db10c98f580c35a951430" exitCode=0 Dec 05 21:16:48 crc kubenswrapper[4747]: I1205 21:16:48.353719 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlwjp" event={"ID":"5f143af5-3dee-4533-807a-76ce59612587","Type":"ContainerDied","Data":"2337920fc34dd65fd0db17f53a7095a1d40b55c50e9db10c98f580c35a951430"} Dec 05 21:16:49 crc kubenswrapper[4747]: I1205 21:16:49.362540 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlwjp" event={"ID":"5f143af5-3dee-4533-807a-76ce59612587","Type":"ContainerStarted","Data":"ab469106d3319c88f4ad6d643b067c6a3778e7dcf43ca68f8f5cbc89a1a6f7f8"} Dec 05 21:16:50 crc kubenswrapper[4747]: I1205 21:16:50.380497 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f143af5-3dee-4533-807a-76ce59612587" containerID="ab469106d3319c88f4ad6d643b067c6a3778e7dcf43ca68f8f5cbc89a1a6f7f8" exitCode=0 Dec 05 21:16:50 crc kubenswrapper[4747]: I1205 21:16:50.380623 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlwjp" 
event={"ID":"5f143af5-3dee-4533-807a-76ce59612587","Type":"ContainerDied","Data":"ab469106d3319c88f4ad6d643b067c6a3778e7dcf43ca68f8f5cbc89a1a6f7f8"} Dec 05 21:16:51 crc kubenswrapper[4747]: I1205 21:16:51.391308 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlwjp" event={"ID":"5f143af5-3dee-4533-807a-76ce59612587","Type":"ContainerStarted","Data":"a22ab61b5835c8edd9abfb79766f207c4b9da44041cc8b585a300f8345993048"} Dec 05 21:16:51 crc kubenswrapper[4747]: I1205 21:16:51.410174 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rlwjp" podStartSLOduration=2.974209101 podStartE2EDuration="5.410158682s" podCreationTimestamp="2025-12-05 21:16:46 +0000 UTC" firstStartedPulling="2025-12-05 21:16:48.35581062 +0000 UTC m=+2078.823118118" lastFinishedPulling="2025-12-05 21:16:50.791760211 +0000 UTC m=+2081.259067699" observedRunningTime="2025-12-05 21:16:51.408636945 +0000 UTC m=+2081.875944433" watchObservedRunningTime="2025-12-05 21:16:51.410158682 +0000 UTC m=+2081.877466170" Dec 05 21:16:56 crc kubenswrapper[4747]: I1205 21:16:56.796703 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:16:56 crc kubenswrapper[4747]: I1205 21:16:56.797883 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:16:56 crc kubenswrapper[4747]: I1205 21:16:56.875166 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:16:57 crc kubenswrapper[4747]: I1205 21:16:57.470071 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:16:57 crc kubenswrapper[4747]: I1205 21:16:57.524140 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rlwjp"] Dec 05 21:16:59 crc kubenswrapper[4747]: I1205 21:16:59.443965 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rlwjp" podUID="5f143af5-3dee-4533-807a-76ce59612587" containerName="registry-server" containerID="cri-o://a22ab61b5835c8edd9abfb79766f207c4b9da44041cc8b585a300f8345993048" gracePeriod=2 Dec 05 21:17:01 crc kubenswrapper[4747]: I1205 21:17:01.464267 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f143af5-3dee-4533-807a-76ce59612587" containerID="a22ab61b5835c8edd9abfb79766f207c4b9da44041cc8b585a300f8345993048" exitCode=0 Dec 05 21:17:01 crc kubenswrapper[4747]: I1205 21:17:01.464338 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlwjp" event={"ID":"5f143af5-3dee-4533-807a-76ce59612587","Type":"ContainerDied","Data":"a22ab61b5835c8edd9abfb79766f207c4b9da44041cc8b585a300f8345993048"} Dec 05 21:17:02 crc kubenswrapper[4747]: I1205 21:17:02.538806 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:17:02 crc kubenswrapper[4747]: I1205 21:17:02.610871 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mckrv\" (UniqueName: \"kubernetes.io/projected/5f143af5-3dee-4533-807a-76ce59612587-kube-api-access-mckrv\") pod \"5f143af5-3dee-4533-807a-76ce59612587\" (UID: \"5f143af5-3dee-4533-807a-76ce59612587\") " Dec 05 21:17:02 crc kubenswrapper[4747]: I1205 21:17:02.610931 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f143af5-3dee-4533-807a-76ce59612587-utilities\") pod \"5f143af5-3dee-4533-807a-76ce59612587\" (UID: \"5f143af5-3dee-4533-807a-76ce59612587\") " Dec 05 21:17:02 crc kubenswrapper[4747]: I1205 21:17:02.611025 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f143af5-3dee-4533-807a-76ce59612587-catalog-content\") pod \"5f143af5-3dee-4533-807a-76ce59612587\" (UID: \"5f143af5-3dee-4533-807a-76ce59612587\") " Dec 05 21:17:02 crc kubenswrapper[4747]: I1205 21:17:02.611947 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f143af5-3dee-4533-807a-76ce59612587-utilities" (OuterVolumeSpecName: "utilities") pod "5f143af5-3dee-4533-807a-76ce59612587" (UID: "5f143af5-3dee-4533-807a-76ce59612587"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:17:02 crc kubenswrapper[4747]: I1205 21:17:02.616763 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f143af5-3dee-4533-807a-76ce59612587-kube-api-access-mckrv" (OuterVolumeSpecName: "kube-api-access-mckrv") pod "5f143af5-3dee-4533-807a-76ce59612587" (UID: "5f143af5-3dee-4533-807a-76ce59612587"). InnerVolumeSpecName "kube-api-access-mckrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:17:02 crc kubenswrapper[4747]: I1205 21:17:02.712400 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mckrv\" (UniqueName: \"kubernetes.io/projected/5f143af5-3dee-4533-807a-76ce59612587-kube-api-access-mckrv\") on node \"crc\" DevicePath \"\"" Dec 05 21:17:02 crc kubenswrapper[4747]: I1205 21:17:02.712434 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f143af5-3dee-4533-807a-76ce59612587-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:17:02 crc kubenswrapper[4747]: I1205 21:17:02.717879 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f143af5-3dee-4533-807a-76ce59612587-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f143af5-3dee-4533-807a-76ce59612587" (UID: "5f143af5-3dee-4533-807a-76ce59612587"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:17:02 crc kubenswrapper[4747]: I1205 21:17:02.814219 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f143af5-3dee-4533-807a-76ce59612587-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:17:03 crc kubenswrapper[4747]: I1205 21:17:03.483869 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlwjp" event={"ID":"5f143af5-3dee-4533-807a-76ce59612587","Type":"ContainerDied","Data":"dbf2966b2ce1a5d83c7301914a3fe20d5dc12ad775d7084ccd54209aba3d2f63"} Dec 05 21:17:03 crc kubenswrapper[4747]: I1205 21:17:03.484165 4747 scope.go:117] "RemoveContainer" containerID="a22ab61b5835c8edd9abfb79766f207c4b9da44041cc8b585a300f8345993048" Dec 05 21:17:03 crc kubenswrapper[4747]: I1205 21:17:03.483928 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rlwjp" Dec 05 21:17:03 crc kubenswrapper[4747]: I1205 21:17:03.515037 4747 scope.go:117] "RemoveContainer" containerID="ab469106d3319c88f4ad6d643b067c6a3778e7dcf43ca68f8f5cbc89a1a6f7f8" Dec 05 21:17:03 crc kubenswrapper[4747]: I1205 21:17:03.520246 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rlwjp"] Dec 05 21:17:03 crc kubenswrapper[4747]: I1205 21:17:03.545049 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rlwjp"] Dec 05 21:17:03 crc kubenswrapper[4747]: I1205 21:17:03.549903 4747 scope.go:117] "RemoveContainer" containerID="2337920fc34dd65fd0db17f53a7095a1d40b55c50e9db10c98f580c35a951430" Dec 05 21:17:03 crc kubenswrapper[4747]: I1205 21:17:03.852404 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f143af5-3dee-4533-807a-76ce59612587" path="/var/lib/kubelet/pods/5f143af5-3dee-4533-807a-76ce59612587/volumes" Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.053845 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b92mp"] Dec 05 21:17:52 crc kubenswrapper[4747]: E1205 21:17:52.054947 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f143af5-3dee-4533-807a-76ce59612587" containerName="registry-server" Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.054970 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f143af5-3dee-4533-807a-76ce59612587" containerName="registry-server" Dec 05 21:17:52 crc kubenswrapper[4747]: E1205 21:17:52.054994 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f143af5-3dee-4533-807a-76ce59612587" containerName="extract-utilities" Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.055004 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f143af5-3dee-4533-807a-76ce59612587" containerName="extract-utilities" Dec 05 21:17:52 crc kubenswrapper[4747]: E1205 21:17:52.055030 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f143af5-3dee-4533-807a-76ce59612587" containerName="extract-content" Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.055039 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f143af5-3dee-4533-807a-76ce59612587" containerName="extract-content" Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.055284 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f143af5-3dee-4533-807a-76ce59612587" containerName="registry-server" Dec 05 21:17:52 crc 
kubenswrapper[4747]: I1205 21:17:52.057074 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.076888 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b92mp"] Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.241617 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/868fe493-08db-4b73-bf29-0b6864aaa2bb-utilities\") pod \"community-operators-b92mp\" (UID: \"868fe493-08db-4b73-bf29-0b6864aaa2bb\") " pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.242083 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h968p\" (UniqueName: \"kubernetes.io/projected/868fe493-08db-4b73-bf29-0b6864aaa2bb-kube-api-access-h968p\") pod \"community-operators-b92mp\" (UID: \"868fe493-08db-4b73-bf29-0b6864aaa2bb\") " pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.242135 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/868fe493-08db-4b73-bf29-0b6864aaa2bb-catalog-content\") pod \"community-operators-b92mp\" (UID: \"868fe493-08db-4b73-bf29-0b6864aaa2bb\") " pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.343898 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/868fe493-08db-4b73-bf29-0b6864aaa2bb-utilities\") pod \"community-operators-b92mp\" (UID: \"868fe493-08db-4b73-bf29-0b6864aaa2bb\") " pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.343941 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h968p\" (UniqueName: \"kubernetes.io/projected/868fe493-08db-4b73-bf29-0b6864aaa2bb-kube-api-access-h968p\") pod \"community-operators-b92mp\" (UID: \"868fe493-08db-4b73-bf29-0b6864aaa2bb\") " pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.343970 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/868fe493-08db-4b73-bf29-0b6864aaa2bb-catalog-content\") pod \"community-operators-b92mp\" (UID: \"868fe493-08db-4b73-bf29-0b6864aaa2bb\") " pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.344456 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/868fe493-08db-4b73-bf29-0b6864aaa2bb-catalog-content\") pod \"community-operators-b92mp\" (UID: \"868fe493-08db-4b73-bf29-0b6864aaa2bb\") " pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.345189 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/868fe493-08db-4b73-bf29-0b6864aaa2bb-utilities\") pod \"community-operators-b92mp\" (UID: \"868fe493-08db-4b73-bf29-0b6864aaa2bb\") " pod="openshift-marketplace/community-operators-b92mp" Dec 
05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.376683 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h968p\" (UniqueName: \"kubernetes.io/projected/868fe493-08db-4b73-bf29-0b6864aaa2bb-kube-api-access-h968p\") pod \"community-operators-b92mp\" (UID: \"868fe493-08db-4b73-bf29-0b6864aaa2bb\") " pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.414943 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:17:52 crc kubenswrapper[4747]: W1205 21:17:52.941693 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod868fe493_08db_4b73_bf29_0b6864aaa2bb.slice/crio-a50278dae148535483efa001a0ad0451890b2b09eb710abe0a4cfc9d684987d5 WatchSource:0}: Error finding container a50278dae148535483efa001a0ad0451890b2b09eb710abe0a4cfc9d684987d5: Status 404 returned error can't find the container with id a50278dae148535483efa001a0ad0451890b2b09eb710abe0a4cfc9d684987d5 Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.945427 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b92mp"] Dec 05 21:17:52 crc kubenswrapper[4747]: I1205 21:17:52.959718 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b92mp" event={"ID":"868fe493-08db-4b73-bf29-0b6864aaa2bb","Type":"ContainerStarted","Data":"a50278dae148535483efa001a0ad0451890b2b09eb710abe0a4cfc9d684987d5"} Dec 05 21:17:53 crc kubenswrapper[4747]: I1205 21:17:53.973827 4747 generic.go:334] "Generic (PLEG): container finished" podID="868fe493-08db-4b73-bf29-0b6864aaa2bb" containerID="a165633aa6456b14fec0d42184c588f84ab474ac7c5d77e8aa195f142593be35" exitCode=0 Dec 05 21:17:53 crc kubenswrapper[4747]: I1205 21:17:53.974013 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b92mp" event={"ID":"868fe493-08db-4b73-bf29-0b6864aaa2bb","Type":"ContainerDied","Data":"a165633aa6456b14fec0d42184c588f84ab474ac7c5d77e8aa195f142593be35"} Dec 05 21:17:55 crc kubenswrapper[4747]: I1205 21:17:55.991968 4747 generic.go:334] "Generic (PLEG): container finished" podID="868fe493-08db-4b73-bf29-0b6864aaa2bb" containerID="a26465194f32b48063bce0c301b45e8bfad5953c5f717c019886e7723732eb91" exitCode=0 Dec 05 21:17:55 crc kubenswrapper[4747]: I1205 21:17:55.992107 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b92mp" event={"ID":"868fe493-08db-4b73-bf29-0b6864aaa2bb","Type":"ContainerDied","Data":"a26465194f32b48063bce0c301b45e8bfad5953c5f717c019886e7723732eb91"} Dec 05 21:17:57 crc kubenswrapper[4747]: I1205 21:17:57.004831 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b92mp" event={"ID":"868fe493-08db-4b73-bf29-0b6864aaa2bb","Type":"ContainerStarted","Data":"cce913f11759f499b310fe12d6f7a20cc9d96e9c2689ab4a5b7b6c4b482d0810"} Dec 05 21:17:57 crc kubenswrapper[4747]: I1205 21:17:57.028022 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b92mp" podStartSLOduration=2.5066613 podStartE2EDuration="5.028000064s" podCreationTimestamp="2025-12-05 21:17:52 +0000 UTC" firstStartedPulling="2025-12-05 21:17:53.976475591 +0000 UTC m=+2144.443783089" lastFinishedPulling="2025-12-05 21:17:56.497814325 +0000 UTC 
m=+2146.965121853" observedRunningTime="2025-12-05 21:17:57.022248673 +0000 UTC m=+2147.489556171" watchObservedRunningTime="2025-12-05 21:17:57.028000064 +0000 UTC m=+2147.495307572" Dec 05 21:18:02 crc kubenswrapper[4747]: I1205 21:18:02.415790 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:18:02 crc kubenswrapper[4747]: I1205 21:18:02.416096 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:18:02 crc kubenswrapper[4747]: I1205 21:18:02.468149 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:18:03 crc kubenswrapper[4747]: I1205 21:18:03.097355 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:18:03 crc kubenswrapper[4747]: I1205 21:18:03.142733 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b92mp"] Dec 05 21:18:05 crc kubenswrapper[4747]: I1205 21:18:05.065087 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b92mp" podUID="868fe493-08db-4b73-bf29-0b6864aaa2bb" containerName="registry-server" containerID="cri-o://cce913f11759f499b310fe12d6f7a20cc9d96e9c2689ab4a5b7b6c4b482d0810" gracePeriod=2 Dec 05 21:18:06 crc kubenswrapper[4747]: I1205 21:18:06.076523 4747 generic.go:334] "Generic (PLEG): container finished" podID="868fe493-08db-4b73-bf29-0b6864aaa2bb" containerID="cce913f11759f499b310fe12d6f7a20cc9d96e9c2689ab4a5b7b6c4b482d0810" exitCode=0 Dec 05 21:18:06 crc kubenswrapper[4747]: I1205 21:18:06.076898 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b92mp" event={"ID":"868fe493-08db-4b73-bf29-0b6864aaa2bb","Type":"ContainerDied","Data":"cce913f11759f499b310fe12d6f7a20cc9d96e9c2689ab4a5b7b6c4b482d0810"} Dec 05 21:18:06 crc kubenswrapper[4747]: I1205 21:18:06.076999 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b92mp" event={"ID":"868fe493-08db-4b73-bf29-0b6864aaa2bb","Type":"ContainerDied","Data":"a50278dae148535483efa001a0ad0451890b2b09eb710abe0a4cfc9d684987d5"} Dec 05 21:18:06 crc kubenswrapper[4747]: I1205 21:18:06.077012 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a50278dae148535483efa001a0ad0451890b2b09eb710abe0a4cfc9d684987d5" Dec 05 21:18:06 crc kubenswrapper[4747]: I1205 21:18:06.102016 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:18:06 crc kubenswrapper[4747]: I1205 21:18:06.263365 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/868fe493-08db-4b73-bf29-0b6864aaa2bb-catalog-content\") pod \"868fe493-08db-4b73-bf29-0b6864aaa2bb\" (UID: \"868fe493-08db-4b73-bf29-0b6864aaa2bb\") " Dec 05 21:18:06 crc kubenswrapper[4747]: I1205 21:18:06.264244 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/868fe493-08db-4b73-bf29-0b6864aaa2bb-utilities\") pod \"868fe493-08db-4b73-bf29-0b6864aaa2bb\" (UID: \"868fe493-08db-4b73-bf29-0b6864aaa2bb\") " Dec 05 21:18:06 crc kubenswrapper[4747]: I1205 21:18:06.264325 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h968p\" (UniqueName: \"kubernetes.io/projected/868fe493-08db-4b73-bf29-0b6864aaa2bb-kube-api-access-h968p\") pod \"868fe493-08db-4b73-bf29-0b6864aaa2bb\" (UID: \"868fe493-08db-4b73-bf29-0b6864aaa2bb\") " Dec 05 21:18:06 crc kubenswrapper[4747]: I1205 21:18:06.265756 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/868fe493-08db-4b73-bf29-0b6864aaa2bb-utilities" (OuterVolumeSpecName: "utilities") pod "868fe493-08db-4b73-bf29-0b6864aaa2bb" (UID: "868fe493-08db-4b73-bf29-0b6864aaa2bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:18:06 crc kubenswrapper[4747]: I1205 21:18:06.270717 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868fe493-08db-4b73-bf29-0b6864aaa2bb-kube-api-access-h968p" (OuterVolumeSpecName: "kube-api-access-h968p") pod "868fe493-08db-4b73-bf29-0b6864aaa2bb" (UID: "868fe493-08db-4b73-bf29-0b6864aaa2bb"). InnerVolumeSpecName "kube-api-access-h968p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:18:06 crc kubenswrapper[4747]: I1205 21:18:06.322868 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/868fe493-08db-4b73-bf29-0b6864aaa2bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "868fe493-08db-4b73-bf29-0b6864aaa2bb" (UID: "868fe493-08db-4b73-bf29-0b6864aaa2bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:18:06 crc kubenswrapper[4747]: I1205 21:18:06.365616 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h968p\" (UniqueName: \"kubernetes.io/projected/868fe493-08db-4b73-bf29-0b6864aaa2bb-kube-api-access-h968p\") on node \"crc\" DevicePath \"\"" Dec 05 21:18:06 crc kubenswrapper[4747]: I1205 21:18:06.365665 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/868fe493-08db-4b73-bf29-0b6864aaa2bb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:18:06 crc kubenswrapper[4747]: I1205 21:18:06.365679 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/868fe493-08db-4b73-bf29-0b6864aaa2bb-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:18:07 crc kubenswrapper[4747]: I1205 21:18:07.084710 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b92mp" Dec 05 21:18:07 crc kubenswrapper[4747]: I1205 21:18:07.138932 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b92mp"] Dec 05 21:18:07 crc kubenswrapper[4747]: I1205 21:18:07.150065 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b92mp"] Dec 05 21:18:07 crc kubenswrapper[4747]: I1205 21:18:07.850858 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="868fe493-08db-4b73-bf29-0b6864aaa2bb" path="/var/lib/kubelet/pods/868fe493-08db-4b73-bf29-0b6864aaa2bb/volumes" Dec 05 21:18:36 crc kubenswrapper[4747]: I1205 21:18:36.222669 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:18:36 crc kubenswrapper[4747]: I1205 21:18:36.223502 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:19:06 crc kubenswrapper[4747]: I1205 21:19:06.221992 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:19:06 crc kubenswrapper[4747]: I1205 21:19:06.222735 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:19:36 crc kubenswrapper[4747]: I1205 21:19:36.221493 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:19:36 crc kubenswrapper[4747]: I1205 21:19:36.222263 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:19:36 crc kubenswrapper[4747]: I1205 21:19:36.222364 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 21:19:36 crc kubenswrapper[4747]: I1205 21:19:36.224091 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
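The repeated liveness failures above are ordinary HTTP GETs against the container's health endpoint; a refused TCP connection counts as a failed probe exactly like a non-2xx response, and once the failure threshold is crossed the kubelet kills the container for restart (the "will be restarted" message). A rough Python stand-in for an HTTP probe against the endpoint from the log (the helper name and the 1-second timeout, the kubelet's default probe timeout, are our choices; this is an illustration, not kubelet code):

    import urllib.error
    import urllib.request

    def http_probe(url: str, timeout: float = 1.0) -> tuple[bool, str]:
        # HTTP probe semantics: any status >= 200 and < 400 is success;
        # timeouts, resets, and refused connections are failures.
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return 200 <= resp.status < 400, f"status={resp.status}"
        except (urllib.error.URLError, OSError) as err:
            return False, f'Get "{url}": {err}'

    ok, detail = http_probe("http://127.0.0.1:8798/health")
    print("success" if ok else "failure", detail)
    # With nothing listening on 8798 this reports a refused connection,
    # the same condition the prober logs above.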
Dec 05 21:19:36 crc kubenswrapper[4747]: I1205 21:19:36.224207 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" gracePeriod=600 Dec 05 21:19:36 crc kubenswrapper[4747]: E1205 21:19:36.356398 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:19:36 crc kubenswrapper[4747]: I1205 21:19:36.872912 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" exitCode=0 Dec 05 21:19:36 crc kubenswrapper[4747]: I1205 21:19:36.872962 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42"} Dec 05 21:19:36 crc kubenswrapper[4747]: I1205 21:19:36.873002 4747 scope.go:117] "RemoveContainer" containerID="aad20883fb9519014bf26b90acc7e97ef1f1dc743a4d4897b6ea932df56bd613" Dec 05 21:19:36 crc kubenswrapper[4747]: I1205 21:19:36.877122 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:19:36 crc kubenswrapper[4747]: E1205 21:19:36.877659 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:19:51 crc kubenswrapper[4747]: I1205 21:19:51.841218 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:19:51 crc kubenswrapper[4747]: E1205 21:19:51.842305 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:20:02 crc kubenswrapper[4747]: I1205 21:20:02.840231 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:20:02 crc kubenswrapper[4747]: E1205 21:20:02.841362 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
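The "back-off 5m0s" in these CrashLoopBackOff errors is the ceiling of the kubelet's restart back-off, which starts at 10 seconds and doubles per failed restart up to a 5-minute cap. While the back-off window is open, every pod sync attempt is rejected with the same error, which is why the RemoveContainer / "Error syncing pod" pairs below recur every 10 to 15 seconds without an actual restart. A sketch of that schedule (the function name is ours):

    def crashloop_backoff(failed_restarts: int,
                          initial: float = 10.0,
                          cap: float = 300.0) -> float:
        # Kubelet-style restart back-off: 10s, 20s, 40s, ... capped at 5m.
        return min(initial * (2 ** failed_restarts), cap)

    print([crashloop_backoff(n) for n in range(7)])
    # [10.0, 20.0, 40.0, 80.0, 160.0, 300.0, 300.0]

Consistent with the cap, this container only comes back at 21:24:44 (the ContainerStarted event for b0efeab6... further down), a little over five minutes after it was killed at 21:19:36.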
Dec 05 21:20:14 crc kubenswrapper[4747]: I1205 21:20:14.840361 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:20:14 crc kubenswrapper[4747]: E1205 21:20:14.841489 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:20:29 crc kubenswrapper[4747]: I1205 21:20:29.848970 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:20:29 crc kubenswrapper[4747]: E1205 21:20:29.851622 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:20:41 crc kubenswrapper[4747]: I1205 21:20:41.840807 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:20:41 crc kubenswrapper[4747]: E1205 21:20:41.842095 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:20:52 crc kubenswrapper[4747]: I1205 21:20:52.840774 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:20:52 crc kubenswrapper[4747]: E1205 21:20:52.841792 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:21:04 crc kubenswrapper[4747]: I1205 21:21:04.840355 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:21:04 crc kubenswrapper[4747]: E1205 21:21:04.841104 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw"
podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:21:16 crc kubenswrapper[4747]: I1205 21:21:16.839442 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:21:16 crc kubenswrapper[4747]: E1205 21:21:16.840167 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:21:31 crc kubenswrapper[4747]: I1205 21:21:31.840380 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:21:31 crc kubenswrapper[4747]: E1205 21:21:31.841193 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:21:44 crc kubenswrapper[4747]: I1205 21:21:44.839874 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:21:44 crc kubenswrapper[4747]: E1205 21:21:44.840816 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:21:58 crc kubenswrapper[4747]: I1205 21:21:58.840949 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:21:58 crc kubenswrapper[4747]: E1205 21:21:58.842062 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:22:10 crc kubenswrapper[4747]: I1205 21:22:10.840113 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:22:10 crc kubenswrapper[4747]: E1205 21:22:10.840870 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:22:22 crc kubenswrapper[4747]: I1205 21:22:22.840470 4747 scope.go:117] "RemoveContainer" 
containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:22:22 crc kubenswrapper[4747]: E1205 21:22:22.841184 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:22:36 crc kubenswrapper[4747]: I1205 21:22:36.840220 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:22:36 crc kubenswrapper[4747]: E1205 21:22:36.842230 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:22:51 crc kubenswrapper[4747]: I1205 21:22:51.839805 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:22:51 crc kubenswrapper[4747]: E1205 21:22:51.841767 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:23:06 crc kubenswrapper[4747]: I1205 21:23:06.840987 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:23:06 crc kubenswrapper[4747]: E1205 21:23:06.842255 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:23:20 crc kubenswrapper[4747]: I1205 21:23:20.839443 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:23:20 crc kubenswrapper[4747]: E1205 21:23:20.841547 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:23:34 crc kubenswrapper[4747]: I1205 21:23:34.840284 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:23:34 crc kubenswrapper[4747]: E1205 21:23:34.841482 4747 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:23:49 crc kubenswrapper[4747]: I1205 21:23:49.851058 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:23:49 crc kubenswrapper[4747]: E1205 21:23:49.852227 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:24:01 crc kubenswrapper[4747]: I1205 21:24:01.840359 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:24:01 crc kubenswrapper[4747]: E1205 21:24:01.841333 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:24:15 crc kubenswrapper[4747]: I1205 21:24:15.840898 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:24:15 crc kubenswrapper[4747]: E1205 21:24:15.842082 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.753901 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bxlt4"] Dec 05 21:24:20 crc kubenswrapper[4747]: E1205 21:24:20.756257 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868fe493-08db-4b73-bf29-0b6864aaa2bb" containerName="registry-server" Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.756491 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="868fe493-08db-4b73-bf29-0b6864aaa2bb" containerName="registry-server" Dec 05 21:24:20 crc kubenswrapper[4747]: E1205 21:24:20.756673 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868fe493-08db-4b73-bf29-0b6864aaa2bb" containerName="extract-utilities" Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.756852 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="868fe493-08db-4b73-bf29-0b6864aaa2bb" containerName="extract-utilities" Dec 05 21:24:20 crc kubenswrapper[4747]: E1205 21:24:20.757067 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868fe493-08db-4b73-bf29-0b6864aaa2bb" 
containerName="extract-content" Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.757283 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="868fe493-08db-4b73-bf29-0b6864aaa2bb" containerName="extract-content" Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.757890 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="868fe493-08db-4b73-bf29-0b6864aaa2bb" containerName="registry-server" Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.761514 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.775033 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bxlt4"] Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.822922 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-utilities\") pod \"certified-operators-bxlt4\" (UID: \"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb\") " pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.823010 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfdck\" (UniqueName: \"kubernetes.io/projected/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-kube-api-access-lfdck\") pod \"certified-operators-bxlt4\" (UID: \"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb\") " pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.823123 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-catalog-content\") pod \"certified-operators-bxlt4\" (UID: \"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb\") " pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.924957 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-utilities\") pod \"certified-operators-bxlt4\" (UID: \"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb\") " pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.925072 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdck\" (UniqueName: \"kubernetes.io/projected/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-kube-api-access-lfdck\") pod \"certified-operators-bxlt4\" (UID: \"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb\") " pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.925134 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-catalog-content\") pod \"certified-operators-bxlt4\" (UID: \"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb\") " pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.925920 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-utilities\") pod \"certified-operators-bxlt4\" (UID: \"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb\") " 
pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.926217 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-catalog-content\") pod \"certified-operators-bxlt4\" (UID: \"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb\") " pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:20 crc kubenswrapper[4747]: I1205 21:24:20.951467 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfdck\" (UniqueName: \"kubernetes.io/projected/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-kube-api-access-lfdck\") pod \"certified-operators-bxlt4\" (UID: \"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb\") " pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:21 crc kubenswrapper[4747]: I1205 21:24:21.108876 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:21 crc kubenswrapper[4747]: I1205 21:24:21.576697 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bxlt4"] Dec 05 21:24:21 crc kubenswrapper[4747]: W1205 21:24:21.581204 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb53c4a65_50b8_4ed4_a25f_1d08b8a677fb.slice/crio-3e3865bd369da8f86b10476fe69027fd21a1798bd6dcbe23d5418c485bdbcfc1 WatchSource:0}: Error finding container 3e3865bd369da8f86b10476fe69027fd21a1798bd6dcbe23d5418c485bdbcfc1: Status 404 returned error can't find the container with id 3e3865bd369da8f86b10476fe69027fd21a1798bd6dcbe23d5418c485bdbcfc1 Dec 05 21:24:22 crc kubenswrapper[4747]: I1205 21:24:22.384497 4747 generic.go:334] "Generic (PLEG): container finished" podID="b53c4a65-50b8-4ed4-a25f-1d08b8a677fb" containerID="7b6f206ccb0a8769f50dc057ebb278d97d355168c16a04f58483d65e584a5c3f" exitCode=0 Dec 05 21:24:22 crc kubenswrapper[4747]: I1205 21:24:22.384560 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxlt4" event={"ID":"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb","Type":"ContainerDied","Data":"7b6f206ccb0a8769f50dc057ebb278d97d355168c16a04f58483d65e584a5c3f"} Dec 05 21:24:22 crc kubenswrapper[4747]: I1205 21:24:22.384630 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxlt4" event={"ID":"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb","Type":"ContainerStarted","Data":"3e3865bd369da8f86b10476fe69027fd21a1798bd6dcbe23d5418c485bdbcfc1"} Dec 05 21:24:22 crc kubenswrapper[4747]: I1205 21:24:22.389060 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:24:23 crc kubenswrapper[4747]: I1205 21:24:23.394261 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxlt4" event={"ID":"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb","Type":"ContainerStarted","Data":"11ba4759b9be5df1bb830828671c9f997e1ac97e7855f0f7be573ef7944a560e"} Dec 05 21:24:24 crc kubenswrapper[4747]: I1205 21:24:24.404297 4747 generic.go:334] "Generic (PLEG): container finished" podID="b53c4a65-50b8-4ed4-a25f-1d08b8a677fb" containerID="11ba4759b9be5df1bb830828671c9f997e1ac97e7855f0f7be573ef7944a560e" exitCode=0 Dec 05 21:24:24 crc kubenswrapper[4747]: I1205 21:24:24.404371 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-bxlt4" event={"ID":"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb","Type":"ContainerDied","Data":"11ba4759b9be5df1bb830828671c9f997e1ac97e7855f0f7be573ef7944a560e"} Dec 05 21:24:25 crc kubenswrapper[4747]: I1205 21:24:25.417039 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxlt4" event={"ID":"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb","Type":"ContainerStarted","Data":"e24b1c80b7841b09759ecba5dc52a08ce9b38167446d48d73537d5b51b04fce6"} Dec 05 21:24:27 crc kubenswrapper[4747]: I1205 21:24:27.174960 4747 scope.go:117] "RemoveContainer" containerID="a26465194f32b48063bce0c301b45e8bfad5953c5f717c019886e7723732eb91" Dec 05 21:24:27 crc kubenswrapper[4747]: I1205 21:24:27.222429 4747 scope.go:117] "RemoveContainer" containerID="cce913f11759f499b310fe12d6f7a20cc9d96e9c2689ab4a5b7b6c4b482d0810" Dec 05 21:24:27 crc kubenswrapper[4747]: I1205 21:24:27.250240 4747 scope.go:117] "RemoveContainer" containerID="a165633aa6456b14fec0d42184c588f84ab474ac7c5d77e8aa195f142593be35" Dec 05 21:24:29 crc kubenswrapper[4747]: I1205 21:24:29.668897 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bxlt4" podStartSLOduration=7.245348768 podStartE2EDuration="9.668871455s" podCreationTimestamp="2025-12-05 21:24:20 +0000 UTC" firstStartedPulling="2025-12-05 21:24:22.387689891 +0000 UTC m=+2532.854997419" lastFinishedPulling="2025-12-05 21:24:24.811212568 +0000 UTC m=+2535.278520106" observedRunningTime="2025-12-05 21:24:25.448347341 +0000 UTC m=+2535.915654879" watchObservedRunningTime="2025-12-05 21:24:29.668871455 +0000 UTC m=+2540.136178983" Dec 05 21:24:29 crc kubenswrapper[4747]: I1205 21:24:29.670052 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9j2xq"] Dec 05 21:24:29 crc kubenswrapper[4747]: I1205 21:24:29.671913 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9j2xq"
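Every marketplace catalog pod in this log follows the same shape: the extract-utilities and extract-content init containers each run to completion (ContainerDied with exitCode=0) before registry-server starts. The PLEG entries carry that lifecycle as JSON, so it can be reconstructed mechanically; a small parser matched to the lines above (the regex and helper are ours):

    import json
    import re

    # Matches kubelet entries of the form:
    #   ... kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="ns/name" event={...}
    PLEG = re.compile(
        r'(?P<stamp>\w{3} \d{2} \d{2}:\d{2}:\d{2}).*'
        r'"SyncLoop \(PLEG\): event for pod" pod="(?P<pod>[^"]+)" event=(?P<ev>\{.*?\})'
    )

    def pleg_events(lines):
        # Yield (timestamp, pod, event type, container/sandbox ID) tuples.
        for line in lines:
            m = PLEG.search(line)
            if m:
                ev = json.loads(m.group("ev"))
                yield m.group("stamp"), m.group("pod"), ev["Type"], ev["Data"]

    sample = ('Dec 05 21:24:24 crc kubenswrapper[4747]: I1205 21:24:24.404371 4747 '
              'kubelet.go:2453] "SyncLoop (PLEG): event for pod" '
              'pod="openshift-marketplace/certified-operators-bxlt4" '
              'event={"ID":"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb","Type":"ContainerDied",'
              '"Data":"11ba4759b9be5df1bb830828671c9f997e1ac97e7855f0f7be573ef7944a560e"}')
    print(next(pleg_events([sample])))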
Dec 05 21:24:29 crc kubenswrapper[4747]: I1205 21:24:29.693040 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9j2xq"] Dec 05 21:24:29 crc kubenswrapper[4747]: I1205 21:24:29.794678 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6e469e-4022-45ca-8175-c58f2d8ed10e-catalog-content\") pod \"redhat-marketplace-9j2xq\" (UID: \"fa6e469e-4022-45ca-8175-c58f2d8ed10e\") " pod="openshift-marketplace/redhat-marketplace-9j2xq" Dec 05 21:24:29 crc kubenswrapper[4747]: I1205 21:24:29.795069 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6e469e-4022-45ca-8175-c58f2d8ed10e-utilities\") pod \"redhat-marketplace-9j2xq\" (UID: \"fa6e469e-4022-45ca-8175-c58f2d8ed10e\") " pod="openshift-marketplace/redhat-marketplace-9j2xq" Dec 05 21:24:29 crc kubenswrapper[4747]: I1205 21:24:29.795271 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhvzm\" (UniqueName: \"kubernetes.io/projected/fa6e469e-4022-45ca-8175-c58f2d8ed10e-kube-api-access-bhvzm\") pod \"redhat-marketplace-9j2xq\" (UID: \"fa6e469e-4022-45ca-8175-c58f2d8ed10e\") " pod="openshift-marketplace/redhat-marketplace-9j2xq" Dec 05 21:24:29 crc kubenswrapper[4747]: I1205 21:24:29.844172 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42" Dec 05 21:24:29 crc kubenswrapper[4747]: E1205 21:24:29.844419 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:24:29 crc kubenswrapper[4747]: I1205 21:24:29.896354 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6e469e-4022-45ca-8175-c58f2d8ed10e-catalog-content\") pod \"redhat-marketplace-9j2xq\" (UID: \"fa6e469e-4022-45ca-8175-c58f2d8ed10e\") " pod="openshift-marketplace/redhat-marketplace-9j2xq" Dec 05 21:24:29 crc kubenswrapper[4747]: I1205 21:24:29.896419 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6e469e-4022-45ca-8175-c58f2d8ed10e-utilities\") pod \"redhat-marketplace-9j2xq\" (UID: \"fa6e469e-4022-45ca-8175-c58f2d8ed10e\") " pod="openshift-marketplace/redhat-marketplace-9j2xq" Dec 05 21:24:29 crc kubenswrapper[4747]: I1205 21:24:29.896489 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhvzm\" (UniqueName: \"kubernetes.io/projected/fa6e469e-4022-45ca-8175-c58f2d8ed10e-kube-api-access-bhvzm\") pod \"redhat-marketplace-9j2xq\" (UID: \"fa6e469e-4022-45ca-8175-c58f2d8ed10e\") " pod="openshift-marketplace/redhat-marketplace-9j2xq" Dec 05 21:24:29 crc kubenswrapper[4747]: I1205 21:24:29.896938 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName:
\"kubernetes.io/empty-dir/fa6e469e-4022-45ca-8175-c58f2d8ed10e-utilities\") pod \"redhat-marketplace-9j2xq\" (UID: \"fa6e469e-4022-45ca-8175-c58f2d8ed10e\") " pod="openshift-marketplace/redhat-marketplace-9j2xq" Dec 05 21:24:29 crc kubenswrapper[4747]: I1205 21:24:29.897164 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6e469e-4022-45ca-8175-c58f2d8ed10e-catalog-content\") pod \"redhat-marketplace-9j2xq\" (UID: \"fa6e469e-4022-45ca-8175-c58f2d8ed10e\") " pod="openshift-marketplace/redhat-marketplace-9j2xq" Dec 05 21:24:29 crc kubenswrapper[4747]: I1205 21:24:29.920689 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhvzm\" (UniqueName: \"kubernetes.io/projected/fa6e469e-4022-45ca-8175-c58f2d8ed10e-kube-api-access-bhvzm\") pod \"redhat-marketplace-9j2xq\" (UID: \"fa6e469e-4022-45ca-8175-c58f2d8ed10e\") " pod="openshift-marketplace/redhat-marketplace-9j2xq" Dec 05 21:24:30 crc kubenswrapper[4747]: I1205 21:24:30.012120 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9j2xq" Dec 05 21:24:30 crc kubenswrapper[4747]: I1205 21:24:30.490866 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9j2xq"] Dec 05 21:24:30 crc kubenswrapper[4747]: I1205 21:24:30.596254 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j2xq" event={"ID":"fa6e469e-4022-45ca-8175-c58f2d8ed10e","Type":"ContainerStarted","Data":"d7b1f83d70a4ef31a61d0babedecfd231d9667893a630465b3098bb3bd087858"} Dec 05 21:24:31 crc kubenswrapper[4747]: I1205 21:24:31.109555 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:31 crc kubenswrapper[4747]: I1205 21:24:31.110513 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:31 crc kubenswrapper[4747]: I1205 21:24:31.170449 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:31 crc kubenswrapper[4747]: I1205 21:24:31.609664 4747 generic.go:334] "Generic (PLEG): container finished" podID="fa6e469e-4022-45ca-8175-c58f2d8ed10e" containerID="34d72eb1a55001da8e657d75b07e525324c18c5f5a71d1829fdd0033f9ab8c5d" exitCode=0 Dec 05 21:24:31 crc kubenswrapper[4747]: I1205 21:24:31.609761 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j2xq" event={"ID":"fa6e469e-4022-45ca-8175-c58f2d8ed10e","Type":"ContainerDied","Data":"34d72eb1a55001da8e657d75b07e525324c18c5f5a71d1829fdd0033f9ab8c5d"} Dec 05 21:24:31 crc kubenswrapper[4747]: I1205 21:24:31.670664 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:32 crc kubenswrapper[4747]: I1205 21:24:32.621847 4747 generic.go:334] "Generic (PLEG): container finished" podID="fa6e469e-4022-45ca-8175-c58f2d8ed10e" containerID="90fdec136eb163b008d998f8e2e1476859bef9302486f2af30bc2df1eebd62b9" exitCode=0 Dec 05 21:24:32 crc kubenswrapper[4747]: I1205 21:24:32.622046 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j2xq" 
event={"ID":"fa6e469e-4022-45ca-8175-c58f2d8ed10e","Type":"ContainerDied","Data":"90fdec136eb163b008d998f8e2e1476859bef9302486f2af30bc2df1eebd62b9"} Dec 05 21:24:33 crc kubenswrapper[4747]: I1205 21:24:33.439388 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bxlt4"] Dec 05 21:24:33 crc kubenswrapper[4747]: I1205 21:24:33.636722 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j2xq" event={"ID":"fa6e469e-4022-45ca-8175-c58f2d8ed10e","Type":"ContainerStarted","Data":"1ad259ee91b5d23998b3558db8c701dfeaeb20a96e41a00d4c027082881a6831"} Dec 05 21:24:33 crc kubenswrapper[4747]: I1205 21:24:33.676737 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9j2xq" podStartSLOduration=3.268583878 podStartE2EDuration="4.676709178s" podCreationTimestamp="2025-12-05 21:24:29 +0000 UTC" firstStartedPulling="2025-12-05 21:24:31.614980576 +0000 UTC m=+2542.082288104" lastFinishedPulling="2025-12-05 21:24:33.023105866 +0000 UTC m=+2543.490413404" observedRunningTime="2025-12-05 21:24:33.666973576 +0000 UTC m=+2544.134281124" watchObservedRunningTime="2025-12-05 21:24:33.676709178 +0000 UTC m=+2544.144016706" Dec 05 21:24:34 crc kubenswrapper[4747]: I1205 21:24:34.645611 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bxlt4" podUID="b53c4a65-50b8-4ed4-a25f-1d08b8a677fb" containerName="registry-server" containerID="cri-o://e24b1c80b7841b09759ecba5dc52a08ce9b38167446d48d73537d5b51b04fce6" gracePeriod=2 Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.632717 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.667860 4747 generic.go:334] "Generic (PLEG): container finished" podID="b53c4a65-50b8-4ed4-a25f-1d08b8a677fb" containerID="e24b1c80b7841b09759ecba5dc52a08ce9b38167446d48d73537d5b51b04fce6" exitCode=0 Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.667923 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxlt4" event={"ID":"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb","Type":"ContainerDied","Data":"e24b1c80b7841b09759ecba5dc52a08ce9b38167446d48d73537d5b51b04fce6"} Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.667949 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxlt4" event={"ID":"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb","Type":"ContainerDied","Data":"3e3865bd369da8f86b10476fe69027fd21a1798bd6dcbe23d5418c485bdbcfc1"} Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.667967 4747 scope.go:117] "RemoveContainer" containerID="e24b1c80b7841b09759ecba5dc52a08ce9b38167446d48d73537d5b51b04fce6" Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.667989 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bxlt4" Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.705419 4747 scope.go:117] "RemoveContainer" containerID="11ba4759b9be5df1bb830828671c9f997e1ac97e7855f0f7be573ef7944a560e" Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.733645 4747 scope.go:117] "RemoveContainer" containerID="7b6f206ccb0a8769f50dc057ebb278d97d355168c16a04f58483d65e584a5c3f" Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.761743 4747 scope.go:117] "RemoveContainer" containerID="e24b1c80b7841b09759ecba5dc52a08ce9b38167446d48d73537d5b51b04fce6" Dec 05 21:24:35 crc kubenswrapper[4747]: E1205 21:24:35.762574 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e24b1c80b7841b09759ecba5dc52a08ce9b38167446d48d73537d5b51b04fce6\": container with ID starting with e24b1c80b7841b09759ecba5dc52a08ce9b38167446d48d73537d5b51b04fce6 not found: ID does not exist" containerID="e24b1c80b7841b09759ecba5dc52a08ce9b38167446d48d73537d5b51b04fce6" Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.762659 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e24b1c80b7841b09759ecba5dc52a08ce9b38167446d48d73537d5b51b04fce6"} err="failed to get container status \"e24b1c80b7841b09759ecba5dc52a08ce9b38167446d48d73537d5b51b04fce6\": rpc error: code = NotFound desc = could not find container \"e24b1c80b7841b09759ecba5dc52a08ce9b38167446d48d73537d5b51b04fce6\": container with ID starting with e24b1c80b7841b09759ecba5dc52a08ce9b38167446d48d73537d5b51b04fce6 not found: ID does not exist" Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.762692 4747 scope.go:117] "RemoveContainer" containerID="11ba4759b9be5df1bb830828671c9f997e1ac97e7855f0f7be573ef7944a560e" Dec 05 21:24:35 crc kubenswrapper[4747]: E1205 21:24:35.763197 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11ba4759b9be5df1bb830828671c9f997e1ac97e7855f0f7be573ef7944a560e\": container with ID starting with 11ba4759b9be5df1bb830828671c9f997e1ac97e7855f0f7be573ef7944a560e not found: ID does not exist" containerID="11ba4759b9be5df1bb830828671c9f997e1ac97e7855f0f7be573ef7944a560e" Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.763267 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ba4759b9be5df1bb830828671c9f997e1ac97e7855f0f7be573ef7944a560e"} err="failed to get container status \"11ba4759b9be5df1bb830828671c9f997e1ac97e7855f0f7be573ef7944a560e\": rpc error: code = NotFound desc = could not find container \"11ba4759b9be5df1bb830828671c9f997e1ac97e7855f0f7be573ef7944a560e\": container with ID starting with 11ba4759b9be5df1bb830828671c9f997e1ac97e7855f0f7be573ef7944a560e not found: ID does not exist" Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.763312 4747 scope.go:117] "RemoveContainer" containerID="7b6f206ccb0a8769f50dc057ebb278d97d355168c16a04f58483d65e584a5c3f" Dec 05 21:24:35 crc kubenswrapper[4747]: E1205 21:24:35.764033 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b6f206ccb0a8769f50dc057ebb278d97d355168c16a04f58483d65e584a5c3f\": container with ID starting with 7b6f206ccb0a8769f50dc057ebb278d97d355168c16a04f58483d65e584a5c3f not found: ID does not exist" containerID="7b6f206ccb0a8769f50dc057ebb278d97d355168c16a04f58483d65e584a5c3f" 
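The failed ContainerStatus lookups above, and the "DeleteContainer returned error" entries just below, are the benign tail of cleanup: the registry-server, extract-content, and extract-utilities containers were already removed along with the sandbox, so every follow-up delete sees NotFound, and the kubelet logs the error and moves on, treating absence as the desired outcome. The general shape of that idempotent cleanup, sketched (the names here are ours, not the CRI API):

    class NotFoundError(Exception):
        """Stand-in for an RPC error with code = NotFound, as logged above."""

    def remove_container(runtime, container_id: str) -> None:
        # Idempotent cleanup: a container that is already gone satisfies
        # the goal state (absent), so NotFound is logged, not raised.
        try:
            runtime.delete_container(container_id)
        except NotFoundError as err:
            print(f"DeleteContainer returned error containerID={container_id!r} err={err}")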
Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.764081 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6f206ccb0a8769f50dc057ebb278d97d355168c16a04f58483d65e584a5c3f"} err="failed to get container status \"7b6f206ccb0a8769f50dc057ebb278d97d355168c16a04f58483d65e584a5c3f\": rpc error: code = NotFound desc = could not find container \"7b6f206ccb0a8769f50dc057ebb278d97d355168c16a04f58483d65e584a5c3f\": container with ID starting with 7b6f206ccb0a8769f50dc057ebb278d97d355168c16a04f58483d65e584a5c3f not found: ID does not exist" Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.791962 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfdck\" (UniqueName: \"kubernetes.io/projected/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-kube-api-access-lfdck\") pod \"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb\" (UID: \"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb\") " Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.792103 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-utilities\") pod \"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb\" (UID: \"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb\") " Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.792261 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-catalog-content\") pod \"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb\" (UID: \"b53c4a65-50b8-4ed4-a25f-1d08b8a677fb\") " Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.794275 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-utilities" (OuterVolumeSpecName: "utilities") pod "b53c4a65-50b8-4ed4-a25f-1d08b8a677fb" (UID: "b53c4a65-50b8-4ed4-a25f-1d08b8a677fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.801223 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-kube-api-access-lfdck" (OuterVolumeSpecName: "kube-api-access-lfdck") pod "b53c4a65-50b8-4ed4-a25f-1d08b8a677fb" (UID: "b53c4a65-50b8-4ed4-a25f-1d08b8a677fb"). InnerVolumeSpecName "kube-api-access-lfdck". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.869237 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b53c4a65-50b8-4ed4-a25f-1d08b8a677fb" (UID: "b53c4a65-50b8-4ed4-a25f-1d08b8a677fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
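Volume tear-down above mirrors setup in reverse and sits inside a fixed deletion sequence: API DELETE, kill with a grace period, UnmountVolume/TearDown per volume, the "Volume detached" confirmations just below (with an empty DevicePath, since emptyDir and projected volumes have no backing device), then API REMOVE, and finally the orphaned-volumes directory cleanup. A small checker that confirms a pod's entries hit those checkpoints in order (markers copied from the entries here; the helper is ours and assumes the input is already filtered to one pod):

    CHECKPOINTS = [
        '"SyncLoop DELETE"',
        '"Killing container with a grace period"',
        'UnmountVolume started',
        'UnmountVolume.TearDown succeeded',
        'Volume detached',
        '"SyncLoop REMOVE"',
        '"Cleaned up orphaned pod volumes dir"',
    ]

    def deletion_in_order(lines):
        # First line index at which each checkpoint appears; a clean pod
        # deletion reaches them in CHECKPOINTS order.
        firsts = [next((i for i, line in enumerate(lines) if mark in line), None)
                  for mark in CHECKPOINTS]
        hit = [i for i in firsts if i is not None]
        return hit == sorted(hit)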
Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.894450 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.894496 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:24:35 crc kubenswrapper[4747]: I1205 21:24:35.894511 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfdck\" (UniqueName: \"kubernetes.io/projected/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb-kube-api-access-lfdck\") on node \"crc\" DevicePath \"\"" Dec 05 21:24:36 crc kubenswrapper[4747]: I1205 21:24:36.015332 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bxlt4"] Dec 05 21:24:36 crc kubenswrapper[4747]: I1205 21:24:36.021818 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bxlt4"] Dec 05 21:24:37 crc kubenswrapper[4747]: I1205 21:24:37.856052 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b53c4a65-50b8-4ed4-a25f-1d08b8a677fb" path="/var/lib/kubelet/pods/b53c4a65-50b8-4ed4-a25f-1d08b8a677fb/volumes" Dec 05 21:24:40 crc kubenswrapper[4747]: I1205 21:24:40.012676 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9j2xq" Dec 05 21:24:40 crc kubenswrapper[4747]: I1205 21:24:40.012748 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9j2xq" Dec 05 21:24:40 crc kubenswrapper[4747]: I1205 21:24:40.078224 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9j2xq" Dec 05 21:24:40 crc kubenswrapper[4747]: I1205 21:24:40.758910 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9j2xq" Dec 05 21:24:40 crc kubenswrapper[4747]: I1205 21:24:40.801386 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9j2xq"] Dec 05 21:24:42 crc kubenswrapper[4747]: I1205 21:24:42.741380 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9j2xq" podUID="fa6e469e-4022-45ca-8175-c58f2d8ed10e" containerName="registry-server" containerID="cri-o://1ad259ee91b5d23998b3558db8c701dfeaeb20a96e41a00d4c027082881a6831" gracePeriod=2 Dec 05 21:24:43 crc kubenswrapper[4747]: I1205 21:24:43.750495 4747 generic.go:334] "Generic (PLEG): container finished" podID="fa6e469e-4022-45ca-8175-c58f2d8ed10e" containerID="1ad259ee91b5d23998b3558db8c701dfeaeb20a96e41a00d4c027082881a6831" exitCode=0 Dec 05 21:24:43 crc kubenswrapper[4747]: I1205 21:24:43.750565 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j2xq" event={"ID":"fa6e469e-4022-45ca-8175-c58f2d8ed10e","Type":"ContainerDied","Data":"1ad259ee91b5d23998b3558db8c701dfeaeb20a96e41a00d4c027082881a6831"} Dec 05 21:24:43 crc kubenswrapper[4747]: I1205 21:24:43.750817 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j2xq"
event={"ID":"fa6e469e-4022-45ca-8175-c58f2d8ed10e","Type":"ContainerDied","Data":"d7b1f83d70a4ef31a61d0babedecfd231d9667893a630465b3098bb3bd087858"}
Dec 05 21:24:43 crc kubenswrapper[4747]: I1205 21:24:43.750837 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7b1f83d70a4ef31a61d0babedecfd231d9667893a630465b3098bb3bd087858"
Dec 05 21:24:43 crc kubenswrapper[4747]: I1205 21:24:43.751501 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9j2xq"
Dec 05 21:24:43 crc kubenswrapper[4747]: I1205 21:24:43.839889 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42"
Dec 05 21:24:43 crc kubenswrapper[4747]: I1205 21:24:43.917799 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhvzm\" (UniqueName: \"kubernetes.io/projected/fa6e469e-4022-45ca-8175-c58f2d8ed10e-kube-api-access-bhvzm\") pod \"fa6e469e-4022-45ca-8175-c58f2d8ed10e\" (UID: \"fa6e469e-4022-45ca-8175-c58f2d8ed10e\") "
Dec 05 21:24:43 crc kubenswrapper[4747]: I1205 21:24:43.917928 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6e469e-4022-45ca-8175-c58f2d8ed10e-utilities\") pod \"fa6e469e-4022-45ca-8175-c58f2d8ed10e\" (UID: \"fa6e469e-4022-45ca-8175-c58f2d8ed10e\") "
Dec 05 21:24:43 crc kubenswrapper[4747]: I1205 21:24:43.917990 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6e469e-4022-45ca-8175-c58f2d8ed10e-catalog-content\") pod \"fa6e469e-4022-45ca-8175-c58f2d8ed10e\" (UID: \"fa6e469e-4022-45ca-8175-c58f2d8ed10e\") "
Dec 05 21:24:43 crc kubenswrapper[4747]: I1205 21:24:43.919787 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6e469e-4022-45ca-8175-c58f2d8ed10e-utilities" (OuterVolumeSpecName: "utilities") pod "fa6e469e-4022-45ca-8175-c58f2d8ed10e" (UID: "fa6e469e-4022-45ca-8175-c58f2d8ed10e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:24:43 crc kubenswrapper[4747]: I1205 21:24:43.925454 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6e469e-4022-45ca-8175-c58f2d8ed10e-kube-api-access-bhvzm" (OuterVolumeSpecName: "kube-api-access-bhvzm") pod "fa6e469e-4022-45ca-8175-c58f2d8ed10e" (UID: "fa6e469e-4022-45ca-8175-c58f2d8ed10e"). InnerVolumeSpecName "kube-api-access-bhvzm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:24:43 crc kubenswrapper[4747]: I1205 21:24:43.944319 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6e469e-4022-45ca-8175-c58f2d8ed10e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa6e469e-4022-45ca-8175-c58f2d8ed10e" (UID: "fa6e469e-4022-45ca-8175-c58f2d8ed10e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:24:44 crc kubenswrapper[4747]: I1205 21:24:44.020239 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhvzm\" (UniqueName: \"kubernetes.io/projected/fa6e469e-4022-45ca-8175-c58f2d8ed10e-kube-api-access-bhvzm\") on node \"crc\" DevicePath \"\""
Dec 05 21:24:44 crc kubenswrapper[4747]: I1205 21:24:44.020868 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6e469e-4022-45ca-8175-c58f2d8ed10e-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 21:24:44 crc kubenswrapper[4747]: I1205 21:24:44.020887 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6e469e-4022-45ca-8175-c58f2d8ed10e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 21:24:44 crc kubenswrapper[4747]: I1205 21:24:44.761268 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"b0efeab699d7dca75ef0d28c3598fecdc35b0d21a1da1c3f65c182d075010380"}
Dec 05 21:24:44 crc kubenswrapper[4747]: I1205 21:24:44.761322 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9j2xq"
Dec 05 21:24:44 crc kubenswrapper[4747]: I1205 21:24:44.822384 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9j2xq"]
Dec 05 21:24:44 crc kubenswrapper[4747]: I1205 21:24:44.827742 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9j2xq"]
Dec 05 21:24:45 crc kubenswrapper[4747]: I1205 21:24:45.857679 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa6e469e-4022-45ca-8175-c58f2d8ed10e" path="/var/lib/kubelet/pods/fa6e469e-4022-45ca-8175-c58f2d8ed10e/volumes"
Dec 05 21:27:06 crc kubenswrapper[4747]: I1205 21:27:06.222193 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:27:06 crc kubenswrapper[4747]: I1205 21:27:06.222803 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.683388 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w4drf"]
Dec 05 21:27:09 crc kubenswrapper[4747]: E1205 21:27:09.684098 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6e469e-4022-45ca-8175-c58f2d8ed10e" containerName="extract-utilities"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.684115 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6e469e-4022-45ca-8175-c58f2d8ed10e" containerName="extract-utilities"
Dec 05 21:27:09 crc kubenswrapper[4747]: E1205 21:27:09.684126 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53c4a65-50b8-4ed4-a25f-1d08b8a677fb" containerName="registry-server"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.684134 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53c4a65-50b8-4ed4-a25f-1d08b8a677fb" containerName="registry-server"
Dec 05 21:27:09 crc kubenswrapper[4747]: E1205 21:27:09.684148 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6e469e-4022-45ca-8175-c58f2d8ed10e" containerName="registry-server"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.684157 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6e469e-4022-45ca-8175-c58f2d8ed10e" containerName="registry-server"
Dec 05 21:27:09 crc kubenswrapper[4747]: E1205 21:27:09.684183 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53c4a65-50b8-4ed4-a25f-1d08b8a677fb" containerName="extract-content"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.684191 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53c4a65-50b8-4ed4-a25f-1d08b8a677fb" containerName="extract-content"
Dec 05 21:27:09 crc kubenswrapper[4747]: E1205 21:27:09.684206 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53c4a65-50b8-4ed4-a25f-1d08b8a677fb" containerName="extract-utilities"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.684213 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53c4a65-50b8-4ed4-a25f-1d08b8a677fb" containerName="extract-utilities"
Dec 05 21:27:09 crc kubenswrapper[4747]: E1205 21:27:09.684228 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6e469e-4022-45ca-8175-c58f2d8ed10e" containerName="extract-content"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.684235 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6e469e-4022-45ca-8175-c58f2d8ed10e" containerName="extract-content"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.684390 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6e469e-4022-45ca-8175-c58f2d8ed10e" containerName="registry-server"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.684416 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b53c4a65-50b8-4ed4-a25f-1d08b8a677fb" containerName="registry-server"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.685751 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.703849 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w4drf"]
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.704457 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44dad08d-6809-40e4-b47b-8b1f9faa6192-catalog-content\") pod \"redhat-operators-w4drf\" (UID: \"44dad08d-6809-40e4-b47b-8b1f9faa6192\") " pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.704562 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44dad08d-6809-40e4-b47b-8b1f9faa6192-utilities\") pod \"redhat-operators-w4drf\" (UID: \"44dad08d-6809-40e4-b47b-8b1f9faa6192\") " pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.704624 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvf5d\" (UniqueName: \"kubernetes.io/projected/44dad08d-6809-40e4-b47b-8b1f9faa6192-kube-api-access-wvf5d\") pod \"redhat-operators-w4drf\" (UID: \"44dad08d-6809-40e4-b47b-8b1f9faa6192\") " pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.805672 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44dad08d-6809-40e4-b47b-8b1f9faa6192-catalog-content\") pod \"redhat-operators-w4drf\" (UID: \"44dad08d-6809-40e4-b47b-8b1f9faa6192\") " pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.805786 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44dad08d-6809-40e4-b47b-8b1f9faa6192-utilities\") pod \"redhat-operators-w4drf\" (UID: \"44dad08d-6809-40e4-b47b-8b1f9faa6192\") " pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.805829 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvf5d\" (UniqueName: \"kubernetes.io/projected/44dad08d-6809-40e4-b47b-8b1f9faa6192-kube-api-access-wvf5d\") pod \"redhat-operators-w4drf\" (UID: \"44dad08d-6809-40e4-b47b-8b1f9faa6192\") " pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.806854 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44dad08d-6809-40e4-b47b-8b1f9faa6192-utilities\") pod \"redhat-operators-w4drf\" (UID: \"44dad08d-6809-40e4-b47b-8b1f9faa6192\") " pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.807117 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44dad08d-6809-40e4-b47b-8b1f9faa6192-catalog-content\") pod \"redhat-operators-w4drf\" (UID: \"44dad08d-6809-40e4-b47b-8b1f9faa6192\") " pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:09 crc kubenswrapper[4747]: I1205 21:27:09.828752 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvf5d\" (UniqueName: \"kubernetes.io/projected/44dad08d-6809-40e4-b47b-8b1f9faa6192-kube-api-access-wvf5d\") pod \"redhat-operators-w4drf\" (UID: \"44dad08d-6809-40e4-b47b-8b1f9faa6192\") " pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:10 crc kubenswrapper[4747]: I1205 21:27:10.023027 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:10 crc kubenswrapper[4747]: I1205 21:27:10.459781 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w4drf"]
Dec 05 21:27:10 crc kubenswrapper[4747]: W1205 21:27:10.468964 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44dad08d_6809_40e4_b47b_8b1f9faa6192.slice/crio-a7a034c4dfbf3156c53b2087a6cecf63c082d145874abf5092c436858b0c9892 WatchSource:0}: Error finding container a7a034c4dfbf3156c53b2087a6cecf63c082d145874abf5092c436858b0c9892: Status 404 returned error can't find the container with id a7a034c4dfbf3156c53b2087a6cecf63c082d145874abf5092c436858b0c9892
Dec 05 21:27:11 crc kubenswrapper[4747]: I1205 21:27:11.025542 4747 generic.go:334] "Generic (PLEG): container finished" podID="44dad08d-6809-40e4-b47b-8b1f9faa6192" containerID="0fa34672212aa64e7535fdca897b9950cfddcc65603546401084b4475301f533" exitCode=0
Dec 05 21:27:11 crc kubenswrapper[4747]: I1205 21:27:11.026756 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w4drf" event={"ID":"44dad08d-6809-40e4-b47b-8b1f9faa6192","Type":"ContainerDied","Data":"0fa34672212aa64e7535fdca897b9950cfddcc65603546401084b4475301f533"}
Dec 05 21:27:11 crc kubenswrapper[4747]: I1205 21:27:11.027481 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w4drf" event={"ID":"44dad08d-6809-40e4-b47b-8b1f9faa6192","Type":"ContainerStarted","Data":"a7a034c4dfbf3156c53b2087a6cecf63c082d145874abf5092c436858b0c9892"}
Dec 05 21:27:12 crc kubenswrapper[4747]: I1205 21:27:12.035688 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w4drf" event={"ID":"44dad08d-6809-40e4-b47b-8b1f9faa6192","Type":"ContainerStarted","Data":"e595e6dbb05e24b2a3828cf81dba9e3873b76c15bcc6f300da5c188a14c4c6c5"}
Dec 05 21:27:13 crc kubenswrapper[4747]: I1205 21:27:13.052197 4747 generic.go:334] "Generic (PLEG): container finished" podID="44dad08d-6809-40e4-b47b-8b1f9faa6192" containerID="e595e6dbb05e24b2a3828cf81dba9e3873b76c15bcc6f300da5c188a14c4c6c5" exitCode=0
Dec 05 21:27:13 crc kubenswrapper[4747]: I1205 21:27:13.052292 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w4drf" event={"ID":"44dad08d-6809-40e4-b47b-8b1f9faa6192","Type":"ContainerDied","Data":"e595e6dbb05e24b2a3828cf81dba9e3873b76c15bcc6f300da5c188a14c4c6c5"}
Dec 05 21:27:14 crc kubenswrapper[4747]: I1205 21:27:14.067410 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w4drf" event={"ID":"44dad08d-6809-40e4-b47b-8b1f9faa6192","Type":"ContainerStarted","Data":"6edaba4411456a701d8047cf96f1caf27e241e4426e9a89177bd252f348f6afc"}
Dec 05 21:27:14 crc kubenswrapper[4747]: I1205 21:27:14.100963 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w4drf" podStartSLOduration=2.662510605 podStartE2EDuration="5.100936713s" podCreationTimestamp="2025-12-05 21:27:09 +0000 UTC" firstStartedPulling="2025-12-05 21:27:11.027741661 +0000 UTC m=+2701.495049189" lastFinishedPulling="2025-12-05 21:27:13.466167769 +0000 UTC m=+2703.933475297" observedRunningTime="2025-12-05 21:27:14.092176965 +0000 UTC m=+2704.559484513" watchObservedRunningTime="2025-12-05 21:27:14.100936713 +0000 UTC m=+2704.568244201"
Dec 05 21:27:20 crc kubenswrapper[4747]: I1205 21:27:20.023972 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:20 crc kubenswrapper[4747]: I1205 21:27:20.024452 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:20 crc kubenswrapper[4747]: I1205 21:27:20.099832 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:20 crc kubenswrapper[4747]: I1205 21:27:20.179529 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:20 crc kubenswrapper[4747]: I1205 21:27:20.357251 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w4drf"]
Dec 05 21:27:22 crc kubenswrapper[4747]: I1205 21:27:22.132121 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w4drf" podUID="44dad08d-6809-40e4-b47b-8b1f9faa6192" containerName="registry-server" containerID="cri-o://6edaba4411456a701d8047cf96f1caf27e241e4426e9a89177bd252f348f6afc" gracePeriod=2
Dec 05 21:27:24 crc kubenswrapper[4747]: I1205 21:27:24.149893 4747 generic.go:334] "Generic (PLEG): container finished" podID="44dad08d-6809-40e4-b47b-8b1f9faa6192" containerID="6edaba4411456a701d8047cf96f1caf27e241e4426e9a89177bd252f348f6afc" exitCode=0
Dec 05 21:27:24 crc kubenswrapper[4747]: I1205 21:27:24.149989 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w4drf" event={"ID":"44dad08d-6809-40e4-b47b-8b1f9faa6192","Type":"ContainerDied","Data":"6edaba4411456a701d8047cf96f1caf27e241e4426e9a89177bd252f348f6afc"}
Dec 05 21:27:24 crc kubenswrapper[4747]: I1205 21:27:24.374293 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:24 crc kubenswrapper[4747]: I1205 21:27:24.537362 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44dad08d-6809-40e4-b47b-8b1f9faa6192-utilities\") pod \"44dad08d-6809-40e4-b47b-8b1f9faa6192\" (UID: \"44dad08d-6809-40e4-b47b-8b1f9faa6192\") "
Dec 05 21:27:24 crc kubenswrapper[4747]: I1205 21:27:24.537449 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44dad08d-6809-40e4-b47b-8b1f9faa6192-catalog-content\") pod \"44dad08d-6809-40e4-b47b-8b1f9faa6192\" (UID: \"44dad08d-6809-40e4-b47b-8b1f9faa6192\") "
Dec 05 21:27:24 crc kubenswrapper[4747]: I1205 21:27:24.537501 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvf5d\" (UniqueName: \"kubernetes.io/projected/44dad08d-6809-40e4-b47b-8b1f9faa6192-kube-api-access-wvf5d\") pod \"44dad08d-6809-40e4-b47b-8b1f9faa6192\" (UID: \"44dad08d-6809-40e4-b47b-8b1f9faa6192\") "
Dec 05 21:27:24 crc kubenswrapper[4747]: I1205 21:27:24.538516 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44dad08d-6809-40e4-b47b-8b1f9faa6192-utilities" (OuterVolumeSpecName: "utilities") pod "44dad08d-6809-40e4-b47b-8b1f9faa6192" (UID: "44dad08d-6809-40e4-b47b-8b1f9faa6192"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:27:24 crc kubenswrapper[4747]: I1205 21:27:24.544527 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44dad08d-6809-40e4-b47b-8b1f9faa6192-kube-api-access-wvf5d" (OuterVolumeSpecName: "kube-api-access-wvf5d") pod "44dad08d-6809-40e4-b47b-8b1f9faa6192" (UID: "44dad08d-6809-40e4-b47b-8b1f9faa6192"). InnerVolumeSpecName "kube-api-access-wvf5d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:27:24 crc kubenswrapper[4747]: I1205 21:27:24.638488 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvf5d\" (UniqueName: \"kubernetes.io/projected/44dad08d-6809-40e4-b47b-8b1f9faa6192-kube-api-access-wvf5d\") on node \"crc\" DevicePath \"\""
Dec 05 21:27:24 crc kubenswrapper[4747]: I1205 21:27:24.638515 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44dad08d-6809-40e4-b47b-8b1f9faa6192-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 21:27:24 crc kubenswrapper[4747]: I1205 21:27:24.649683 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44dad08d-6809-40e4-b47b-8b1f9faa6192-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44dad08d-6809-40e4-b47b-8b1f9faa6192" (UID: "44dad08d-6809-40e4-b47b-8b1f9faa6192"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:27:24 crc kubenswrapper[4747]: I1205 21:27:24.739962 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44dad08d-6809-40e4-b47b-8b1f9faa6192-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 21:27:25 crc kubenswrapper[4747]: I1205 21:27:25.161473 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w4drf" event={"ID":"44dad08d-6809-40e4-b47b-8b1f9faa6192","Type":"ContainerDied","Data":"a7a034c4dfbf3156c53b2087a6cecf63c082d145874abf5092c436858b0c9892"}
Dec 05 21:27:25 crc kubenswrapper[4747]: I1205 21:27:25.161545 4747 scope.go:117] "RemoveContainer" containerID="6edaba4411456a701d8047cf96f1caf27e241e4426e9a89177bd252f348f6afc"
Dec 05 21:27:25 crc kubenswrapper[4747]: I1205 21:27:25.167681 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w4drf"
Dec 05 21:27:25 crc kubenswrapper[4747]: I1205 21:27:25.208771 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w4drf"]
Dec 05 21:27:25 crc kubenswrapper[4747]: I1205 21:27:25.214119 4747 scope.go:117] "RemoveContainer" containerID="e595e6dbb05e24b2a3828cf81dba9e3873b76c15bcc6f300da5c188a14c4c6c5"
Dec 05 21:27:25 crc kubenswrapper[4747]: I1205 21:27:25.217450 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w4drf"]
Dec 05 21:27:25 crc kubenswrapper[4747]: I1205 21:27:25.251017 4747 scope.go:117] "RemoveContainer" containerID="0fa34672212aa64e7535fdca897b9950cfddcc65603546401084b4475301f533"
Dec 05 21:27:25 crc kubenswrapper[4747]: I1205 21:27:25.855412 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44dad08d-6809-40e4-b47b-8b1f9faa6192" path="/var/lib/kubelet/pods/44dad08d-6809-40e4-b47b-8b1f9faa6192/volumes"
Dec 05 21:27:36 crc kubenswrapper[4747]: I1205 21:27:36.222714 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:27:36 crc kubenswrapper[4747]: I1205 21:27:36.223414 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:28:06 crc kubenswrapper[4747]: I1205 21:28:06.221817 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:28:06 crc kubenswrapper[4747]: I1205 21:28:06.222759 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:28:06 crc kubenswrapper[4747]: I1205 21:28:06.222846 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw"
Dec 05 21:28:06 crc kubenswrapper[4747]: I1205 21:28:06.223966 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0efeab699d7dca75ef0d28c3598fecdc35b0d21a1da1c3f65c182d075010380"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 21:28:06 crc kubenswrapper[4747]: I1205 21:28:06.224095 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://b0efeab699d7dca75ef0d28c3598fecdc35b0d21a1da1c3f65c182d075010380" gracePeriod=600
Dec 05 21:28:06 crc kubenswrapper[4747]: I1205 21:28:06.503177 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="b0efeab699d7dca75ef0d28c3598fecdc35b0d21a1da1c3f65c182d075010380" exitCode=0
Dec 05 21:28:06 crc kubenswrapper[4747]: I1205 21:28:06.503228 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"b0efeab699d7dca75ef0d28c3598fecdc35b0d21a1da1c3f65c182d075010380"}
Dec 05 21:28:06 crc kubenswrapper[4747]: I1205 21:28:06.503565 4747 scope.go:117] "RemoveContainer" containerID="86e06aec57a5f2a7de03576fe96d7ee5917c2403c267561c69aea65bc1439d42"
Dec 05 21:28:07 crc kubenswrapper[4747]: I1205 21:28:07.518123 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"}
Dec 05 21:28:31 crc kubenswrapper[4747]: I1205 21:28:31.807012 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2wzsn"]
Dec 05 21:28:31 crc kubenswrapper[4747]: E1205 21:28:31.808137 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44dad08d-6809-40e4-b47b-8b1f9faa6192" containerName="registry-server"
Dec 05 21:28:31 crc kubenswrapper[4747]: I1205 21:28:31.808169 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="44dad08d-6809-40e4-b47b-8b1f9faa6192" containerName="registry-server"
Dec 05 21:28:31 crc kubenswrapper[4747]: E1205 21:28:31.808222 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44dad08d-6809-40e4-b47b-8b1f9faa6192" containerName="extract-content"
Dec 05 21:28:31 crc kubenswrapper[4747]: I1205 21:28:31.808241 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="44dad08d-6809-40e4-b47b-8b1f9faa6192" containerName="extract-content"
Dec 05 21:28:31 crc kubenswrapper[4747]: E1205 21:28:31.808283 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44dad08d-6809-40e4-b47b-8b1f9faa6192" containerName="extract-utilities"
Dec 05 21:28:31 crc kubenswrapper[4747]: I1205 21:28:31.808298 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="44dad08d-6809-40e4-b47b-8b1f9faa6192" containerName="extract-utilities"
Dec 05 21:28:31 crc kubenswrapper[4747]: I1205 21:28:31.808559 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="44dad08d-6809-40e4-b47b-8b1f9faa6192" containerName="registry-server"
Dec 05 21:28:31 crc kubenswrapper[4747]: I1205 21:28:31.810451 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:31 crc kubenswrapper[4747]: I1205 21:28:31.823116 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2wzsn"]
Dec 05 21:28:31 crc kubenswrapper[4747]: I1205 21:28:31.976622 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-utilities\") pod \"community-operators-2wzsn\" (UID: \"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f\") " pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:31 crc kubenswrapper[4747]: I1205 21:28:31.976677 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxp95\" (UniqueName: \"kubernetes.io/projected/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-kube-api-access-cxp95\") pod \"community-operators-2wzsn\" (UID: \"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f\") " pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:31 crc kubenswrapper[4747]: I1205 21:28:31.976731 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-catalog-content\") pod \"community-operators-2wzsn\" (UID: \"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f\") " pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:32 crc kubenswrapper[4747]: I1205 21:28:32.077969 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-utilities\") pod \"community-operators-2wzsn\" (UID: \"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f\") " pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:32 crc kubenswrapper[4747]: I1205 21:28:32.078049 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxp95\" (UniqueName: \"kubernetes.io/projected/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-kube-api-access-cxp95\") pod \"community-operators-2wzsn\" (UID: \"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f\") " pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:32 crc kubenswrapper[4747]: I1205 21:28:32.078110 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-catalog-content\") pod \"community-operators-2wzsn\" (UID: \"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f\") " pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:32 crc kubenswrapper[4747]: I1205 21:28:32.078532 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-utilities\") pod \"community-operators-2wzsn\" (UID: \"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f\") " pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:32 crc kubenswrapper[4747]: I1205 21:28:32.078690 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-catalog-content\") pod \"community-operators-2wzsn\" (UID: \"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f\") " pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:32 crc kubenswrapper[4747]: I1205 21:28:32.098994 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxp95\" (UniqueName: \"kubernetes.io/projected/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-kube-api-access-cxp95\") pod \"community-operators-2wzsn\" (UID: \"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f\") " pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:32 crc kubenswrapper[4747]: I1205 21:28:32.141085 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:32 crc kubenswrapper[4747]: W1205 21:28:32.578103 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe5c1f2c_b76e_45e3_8c66_a1eab3018b8f.slice/crio-1c806f58b4fc927c4c3d00835f3a0842cb59f3ee10b87b52b016fa3818226c99 WatchSource:0}: Error finding container 1c806f58b4fc927c4c3d00835f3a0842cb59f3ee10b87b52b016fa3818226c99: Status 404 returned error can't find the container with id 1c806f58b4fc927c4c3d00835f3a0842cb59f3ee10b87b52b016fa3818226c99
Dec 05 21:28:32 crc kubenswrapper[4747]: I1205 21:28:32.587684 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2wzsn"]
Dec 05 21:28:32 crc kubenswrapper[4747]: I1205 21:28:32.755182 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2wzsn" event={"ID":"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f","Type":"ContainerStarted","Data":"390f204c98da95bd11f6b0aa4642f1ea9a2390c08e82756ebf936054c9971ae7"}
Dec 05 21:28:32 crc kubenswrapper[4747]: I1205 21:28:32.755232 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2wzsn" event={"ID":"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f","Type":"ContainerStarted","Data":"1c806f58b4fc927c4c3d00835f3a0842cb59f3ee10b87b52b016fa3818226c99"}
Dec 05 21:28:33 crc kubenswrapper[4747]: I1205 21:28:33.766729 4747 generic.go:334] "Generic (PLEG): container finished" podID="be5c1f2c-b76e-45e3-8c66-a1eab3018b8f" containerID="390f204c98da95bd11f6b0aa4642f1ea9a2390c08e82756ebf936054c9971ae7" exitCode=0
Dec 05 21:28:33 crc kubenswrapper[4747]: I1205 21:28:33.766783 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2wzsn" event={"ID":"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f","Type":"ContainerDied","Data":"390f204c98da95bd11f6b0aa4642f1ea9a2390c08e82756ebf936054c9971ae7"}
Dec 05 21:28:34 crc kubenswrapper[4747]: I1205 21:28:34.784726 4747 generic.go:334] "Generic (PLEG): container finished" podID="be5c1f2c-b76e-45e3-8c66-a1eab3018b8f" containerID="b7b36d277577bd601df0111b54ac2989dea2f9943638be854c45d0148d3a2888" exitCode=0
Dec 05 21:28:34 crc kubenswrapper[4747]: I1205 21:28:34.785086 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2wzsn" event={"ID":"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f","Type":"ContainerDied","Data":"b7b36d277577bd601df0111b54ac2989dea2f9943638be854c45d0148d3a2888"}
Dec 05 21:28:35 crc kubenswrapper[4747]: I1205 21:28:35.798464 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2wzsn" event={"ID":"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f","Type":"ContainerStarted","Data":"c8406398e39ec4fca42188e12ec0ab64b1fca73ac1f45b0627196c4325622d94"}
Dec 05 21:28:35 crc kubenswrapper[4747]: I1205 21:28:35.832155 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2wzsn" podStartSLOduration=3.407378719 podStartE2EDuration="4.832127231s" podCreationTimestamp="2025-12-05 21:28:31 +0000 UTC" firstStartedPulling="2025-12-05 21:28:33.768111093 +0000 UTC m=+2784.235418581" lastFinishedPulling="2025-12-05 21:28:35.192859595 +0000 UTC m=+2785.660167093" observedRunningTime="2025-12-05 21:28:35.824706457 +0000 UTC m=+2786.292014015" watchObservedRunningTime="2025-12-05 21:28:35.832127231 +0000 UTC m=+2786.299434759"
Dec 05 21:28:42 crc kubenswrapper[4747]: I1205 21:28:42.141959 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:42 crc kubenswrapper[4747]: I1205 21:28:42.142318 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:42 crc kubenswrapper[4747]: I1205 21:28:42.196674 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:42 crc kubenswrapper[4747]: I1205 21:28:42.909767 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:45 crc kubenswrapper[4747]: I1205 21:28:45.792523 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2wzsn"]
Dec 05 21:28:45 crc kubenswrapper[4747]: I1205 21:28:45.792928 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2wzsn" podUID="be5c1f2c-b76e-45e3-8c66-a1eab3018b8f" containerName="registry-server" containerID="cri-o://c8406398e39ec4fca42188e12ec0ab64b1fca73ac1f45b0627196c4325622d94" gracePeriod=2
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.741716 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.895615 4747 generic.go:334] "Generic (PLEG): container finished" podID="be5c1f2c-b76e-45e3-8c66-a1eab3018b8f" containerID="c8406398e39ec4fca42188e12ec0ab64b1fca73ac1f45b0627196c4325622d94" exitCode=0
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.895649 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2wzsn"
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.895685 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2wzsn" event={"ID":"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f","Type":"ContainerDied","Data":"c8406398e39ec4fca42188e12ec0ab64b1fca73ac1f45b0627196c4325622d94"}
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.895809 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2wzsn" event={"ID":"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f","Type":"ContainerDied","Data":"1c806f58b4fc927c4c3d00835f3a0842cb59f3ee10b87b52b016fa3818226c99"}
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.895866 4747 scope.go:117] "RemoveContainer" containerID="c8406398e39ec4fca42188e12ec0ab64b1fca73ac1f45b0627196c4325622d94"
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.911189 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-utilities\") pod \"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f\" (UID: \"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f\") "
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.911336 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxp95\" (UniqueName: \"kubernetes.io/projected/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-kube-api-access-cxp95\") pod \"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f\" (UID: \"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f\") "
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.911474 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-catalog-content\") pod \"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f\" (UID: \"be5c1f2c-b76e-45e3-8c66-a1eab3018b8f\") "
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.912868 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-utilities" (OuterVolumeSpecName: "utilities") pod "be5c1f2c-b76e-45e3-8c66-a1eab3018b8f" (UID: "be5c1f2c-b76e-45e3-8c66-a1eab3018b8f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.917329 4747 scope.go:117] "RemoveContainer" containerID="b7b36d277577bd601df0111b54ac2989dea2f9943638be854c45d0148d3a2888"
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.923804 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-kube-api-access-cxp95" (OuterVolumeSpecName: "kube-api-access-cxp95") pod "be5c1f2c-b76e-45e3-8c66-a1eab3018b8f" (UID: "be5c1f2c-b76e-45e3-8c66-a1eab3018b8f"). InnerVolumeSpecName "kube-api-access-cxp95". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.955208 4747 scope.go:117] "RemoveContainer" containerID="390f204c98da95bd11f6b0aa4642f1ea9a2390c08e82756ebf936054c9971ae7"
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.966428 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be5c1f2c-b76e-45e3-8c66-a1eab3018b8f" (UID: "be5c1f2c-b76e-45e3-8c66-a1eab3018b8f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.976963 4747 scope.go:117] "RemoveContainer" containerID="c8406398e39ec4fca42188e12ec0ab64b1fca73ac1f45b0627196c4325622d94"
Dec 05 21:28:46 crc kubenswrapper[4747]: E1205 21:28:46.979573 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8406398e39ec4fca42188e12ec0ab64b1fca73ac1f45b0627196c4325622d94\": container with ID starting with c8406398e39ec4fca42188e12ec0ab64b1fca73ac1f45b0627196c4325622d94 not found: ID does not exist" containerID="c8406398e39ec4fca42188e12ec0ab64b1fca73ac1f45b0627196c4325622d94"
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.979645 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8406398e39ec4fca42188e12ec0ab64b1fca73ac1f45b0627196c4325622d94"} err="failed to get container status \"c8406398e39ec4fca42188e12ec0ab64b1fca73ac1f45b0627196c4325622d94\": rpc error: code = NotFound desc = could not find container \"c8406398e39ec4fca42188e12ec0ab64b1fca73ac1f45b0627196c4325622d94\": container with ID starting with c8406398e39ec4fca42188e12ec0ab64b1fca73ac1f45b0627196c4325622d94 not found: ID does not exist"
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.979684 4747 scope.go:117] "RemoveContainer" containerID="b7b36d277577bd601df0111b54ac2989dea2f9943638be854c45d0148d3a2888"
Dec 05 21:28:46 crc kubenswrapper[4747]: E1205 21:28:46.980005 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7b36d277577bd601df0111b54ac2989dea2f9943638be854c45d0148d3a2888\": container with ID starting with b7b36d277577bd601df0111b54ac2989dea2f9943638be854c45d0148d3a2888 not found: ID does not exist" containerID="b7b36d277577bd601df0111b54ac2989dea2f9943638be854c45d0148d3a2888"
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.980038 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b36d277577bd601df0111b54ac2989dea2f9943638be854c45d0148d3a2888"} err="failed to get container status \"b7b36d277577bd601df0111b54ac2989dea2f9943638be854c45d0148d3a2888\": rpc error: code = NotFound desc = could not find container \"b7b36d277577bd601df0111b54ac2989dea2f9943638be854c45d0148d3a2888\": container with ID starting with b7b36d277577bd601df0111b54ac2989dea2f9943638be854c45d0148d3a2888 not found: ID does not exist"
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.980057 4747 scope.go:117] "RemoveContainer" containerID="390f204c98da95bd11f6b0aa4642f1ea9a2390c08e82756ebf936054c9971ae7"
Dec 05 21:28:46 crc kubenswrapper[4747]: E1205 21:28:46.980339 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"390f204c98da95bd11f6b0aa4642f1ea9a2390c08e82756ebf936054c9971ae7\": container with ID starting with 390f204c98da95bd11f6b0aa4642f1ea9a2390c08e82756ebf936054c9971ae7 not found: ID does not exist" containerID="390f204c98da95bd11f6b0aa4642f1ea9a2390c08e82756ebf936054c9971ae7"
Dec 05 21:28:46 crc kubenswrapper[4747]: I1205 21:28:46.980367 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"390f204c98da95bd11f6b0aa4642f1ea9a2390c08e82756ebf936054c9971ae7"} err="failed to get container status \"390f204c98da95bd11f6b0aa4642f1ea9a2390c08e82756ebf936054c9971ae7\": rpc error: code = NotFound desc = could not find container \"390f204c98da95bd11f6b0aa4642f1ea9a2390c08e82756ebf936054c9971ae7\": container with ID starting with 390f204c98da95bd11f6b0aa4642f1ea9a2390c08e82756ebf936054c9971ae7 not found: ID does not exist"
Dec 05 21:28:47 crc kubenswrapper[4747]: I1205 21:28:47.012988 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 21:28:47 crc kubenswrapper[4747]: I1205 21:28:47.013029 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxp95\" (UniqueName: \"kubernetes.io/projected/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-kube-api-access-cxp95\") on node \"crc\" DevicePath \"\""
Dec 05 21:28:47 crc kubenswrapper[4747]: I1205 21:28:47.013042 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 21:28:47 crc kubenswrapper[4747]: I1205 21:28:47.253844 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2wzsn"]
Dec 05 21:28:47 crc kubenswrapper[4747]: I1205 21:28:47.265317 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2wzsn"]
Dec 05 21:28:47 crc kubenswrapper[4747]: I1205 21:28:47.855371 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be5c1f2c-b76e-45e3-8c66-a1eab3018b8f" path="/var/lib/kubelet/pods/be5c1f2c-b76e-45e3-8c66-a1eab3018b8f/volumes"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.140118 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"]
Dec 05 21:30:00 crc kubenswrapper[4747]: E1205 21:30:00.140944 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5c1f2c-b76e-45e3-8c66-a1eab3018b8f" containerName="extract-content"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.140960 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5c1f2c-b76e-45e3-8c66-a1eab3018b8f" containerName="extract-content"
Dec 05 21:30:00 crc kubenswrapper[4747]: E1205 21:30:00.140994 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5c1f2c-b76e-45e3-8c66-a1eab3018b8f" containerName="extract-utilities"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.141090 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5c1f2c-b76e-45e3-8c66-a1eab3018b8f" containerName="extract-utilities"
Dec 05 21:30:00 crc kubenswrapper[4747]: E1205 21:30:00.141110 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5c1f2c-b76e-45e3-8c66-a1eab3018b8f" containerName="registry-server"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.141118 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5c1f2c-b76e-45e3-8c66-a1eab3018b8f" containerName="registry-server"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.141275 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5c1f2c-b76e-45e3-8c66-a1eab3018b8f" containerName="registry-server"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.141873 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.153513 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.153553 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.162132 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"]
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.298263 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79224c99-bda2-4fad-9546-96a275fd4329-secret-volume\") pod \"collect-profiles-29416170-wm2cx\" (UID: \"79224c99-bda2-4fad-9546-96a275fd4329\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.298374 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgnw2\" (UniqueName: \"kubernetes.io/projected/79224c99-bda2-4fad-9546-96a275fd4329-kube-api-access-wgnw2\") pod \"collect-profiles-29416170-wm2cx\" (UID: \"79224c99-bda2-4fad-9546-96a275fd4329\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.298446 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79224c99-bda2-4fad-9546-96a275fd4329-config-volume\") pod \"collect-profiles-29416170-wm2cx\" (UID: \"79224c99-bda2-4fad-9546-96a275fd4329\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.400377 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79224c99-bda2-4fad-9546-96a275fd4329-secret-volume\") pod \"collect-profiles-29416170-wm2cx\" (UID: \"79224c99-bda2-4fad-9546-96a275fd4329\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.400548 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgnw2\" (UniqueName: \"kubernetes.io/projected/79224c99-bda2-4fad-9546-96a275fd4329-kube-api-access-wgnw2\") pod \"collect-profiles-29416170-wm2cx\" (UID: \"79224c99-bda2-4fad-9546-96a275fd4329\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.400877 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79224c99-bda2-4fad-9546-96a275fd4329-config-volume\") pod \"collect-profiles-29416170-wm2cx\" (UID: \"79224c99-bda2-4fad-9546-96a275fd4329\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.401745 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79224c99-bda2-4fad-9546-96a275fd4329-config-volume\") pod \"collect-profiles-29416170-wm2cx\" (UID: \"79224c99-bda2-4fad-9546-96a275fd4329\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.407416 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79224c99-bda2-4fad-9546-96a275fd4329-secret-volume\") pod \"collect-profiles-29416170-wm2cx\" (UID: \"79224c99-bda2-4fad-9546-96a275fd4329\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.421649 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgnw2\" (UniqueName: \"kubernetes.io/projected/79224c99-bda2-4fad-9546-96a275fd4329-kube-api-access-wgnw2\") pod \"collect-profiles-29416170-wm2cx\" (UID: \"79224c99-bda2-4fad-9546-96a275fd4329\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.465549 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"
Dec 05 21:30:00 crc kubenswrapper[4747]: I1205 21:30:00.939129 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"]
Dec 05 21:30:01 crc kubenswrapper[4747]: I1205 21:30:01.627987 4747 generic.go:334] "Generic (PLEG): container finished" podID="79224c99-bda2-4fad-9546-96a275fd4329" containerID="bcf788960eccd1b080ae22e4eddcc2cd0d694c3437dd04006cecefc63acf53e6" exitCode=0
Dec 05 21:30:01 crc kubenswrapper[4747]: I1205 21:30:01.628212 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx" event={"ID":"79224c99-bda2-4fad-9546-96a275fd4329","Type":"ContainerDied","Data":"bcf788960eccd1b080ae22e4eddcc2cd0d694c3437dd04006cecefc63acf53e6"}
Dec 05 21:30:01 crc kubenswrapper[4747]: I1205 21:30:01.629418 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx" event={"ID":"79224c99-bda2-4fad-9546-96a275fd4329","Type":"ContainerStarted","Data":"753fd80fbaf60c966e53e37d55613cbce812abff864cc49fb2b8c2feab8bc8cd"}
Dec 05 21:30:03 crc kubenswrapper[4747]: I1205 21:30:03.014684 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"
Dec 05 21:30:03 crc kubenswrapper[4747]: I1205 21:30:03.150009 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79224c99-bda2-4fad-9546-96a275fd4329-config-volume\") pod \"79224c99-bda2-4fad-9546-96a275fd4329\" (UID: \"79224c99-bda2-4fad-9546-96a275fd4329\") "
Dec 05 21:30:03 crc kubenswrapper[4747]: I1205 21:30:03.150070 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgnw2\" (UniqueName: \"kubernetes.io/projected/79224c99-bda2-4fad-9546-96a275fd4329-kube-api-access-wgnw2\") pod \"79224c99-bda2-4fad-9546-96a275fd4329\" (UID: \"79224c99-bda2-4fad-9546-96a275fd4329\") "
Dec 05 21:30:03 crc kubenswrapper[4747]: I1205 21:30:03.150145 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79224c99-bda2-4fad-9546-96a275fd4329-secret-volume\") pod \"79224c99-bda2-4fad-9546-96a275fd4329\" (UID: \"79224c99-bda2-4fad-9546-96a275fd4329\") "
Dec 05 21:30:03 crc kubenswrapper[4747]: I1205 21:30:03.151078 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79224c99-bda2-4fad-9546-96a275fd4329-config-volume" (OuterVolumeSpecName: "config-volume") pod "79224c99-bda2-4fad-9546-96a275fd4329" (UID: "79224c99-bda2-4fad-9546-96a275fd4329"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 21:30:03 crc kubenswrapper[4747]: I1205 21:30:03.161719 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79224c99-bda2-4fad-9546-96a275fd4329-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "79224c99-bda2-4fad-9546-96a275fd4329" (UID: "79224c99-bda2-4fad-9546-96a275fd4329"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:30:03 crc kubenswrapper[4747]: I1205 21:30:03.161916 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79224c99-bda2-4fad-9546-96a275fd4329-kube-api-access-wgnw2" (OuterVolumeSpecName: "kube-api-access-wgnw2") pod "79224c99-bda2-4fad-9546-96a275fd4329" (UID: "79224c99-bda2-4fad-9546-96a275fd4329"). InnerVolumeSpecName "kube-api-access-wgnw2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:30:03 crc kubenswrapper[4747]: I1205 21:30:03.251871 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79224c99-bda2-4fad-9546-96a275fd4329-config-volume\") on node \"crc\" DevicePath \"\""
Dec 05 21:30:03 crc kubenswrapper[4747]: I1205 21:30:03.251907 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgnw2\" (UniqueName: \"kubernetes.io/projected/79224c99-bda2-4fad-9546-96a275fd4329-kube-api-access-wgnw2\") on node \"crc\" DevicePath \"\""
Dec 05 21:30:03 crc kubenswrapper[4747]: I1205 21:30:03.251923 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79224c99-bda2-4fad-9546-96a275fd4329-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 05 21:30:03 crc kubenswrapper[4747]: I1205 21:30:03.648391 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx" event={"ID":"79224c99-bda2-4fad-9546-96a275fd4329","Type":"ContainerDied","Data":"753fd80fbaf60c966e53e37d55613cbce812abff864cc49fb2b8c2feab8bc8cd"}
Dec 05 21:30:03 crc kubenswrapper[4747]: I1205 21:30:03.648448 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="753fd80fbaf60c966e53e37d55613cbce812abff864cc49fb2b8c2feab8bc8cd"
Dec 05 21:30:03 crc kubenswrapper[4747]: I1205 21:30:03.648476 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"
Dec 05 21:30:04 crc kubenswrapper[4747]: I1205 21:30:04.126295 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth"]
Dec 05 21:30:04 crc kubenswrapper[4747]: I1205 21:30:04.135131 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-jccth"]
Dec 05 21:30:05 crc kubenswrapper[4747]: I1205 21:30:05.855149 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dfcc851-a0fe-45be-9752-82a11a1d06fa" path="/var/lib/kubelet/pods/6dfcc851-a0fe-45be-9752-82a11a1d06fa/volumes"
Dec 05 21:30:06 crc kubenswrapper[4747]: I1205 21:30:06.222514 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:30:06 crc kubenswrapper[4747]: I1205 21:30:06.222624 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:30:27 crc kubenswrapper[4747]: I1205 21:30:27.436495 4747 scope.go:117] "RemoveContainer" containerID="0f778b00b91e75dd01cfde3d64c4a954b25a568b6a159983c2f43dd16d423ea1"
Dec 05 21:30:36 crc kubenswrapper[4747]: I1205 21:30:36.222437 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:30:36 crc kubenswrapper[4747]: I1205 21:30:36.223382 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:31:06 crc kubenswrapper[4747]: I1205 21:31:06.221745 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:31:06 crc kubenswrapper[4747]: I1205 21:31:06.222623 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:31:06 crc kubenswrapper[4747]: I1205 21:31:06.222709 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw"
Dec 05 21:31:06 crc kubenswrapper[4747]: I1205 21:31:06.223741 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 21:31:06 crc kubenswrapper[4747]: I1205 21:31:06.223853 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" gracePeriod=600
Dec 05 21:31:06 crc kubenswrapper[4747]: E1205 21:31:06.862364 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:31:07 crc kubenswrapper[4747]: I1205 21:31:07.243512 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" exitCode=0
Dec 05 21:31:07 crc kubenswrapper[4747]: I1205 21:31:07.243548 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"}
Dec 05 21:31:07 crc kubenswrapper[4747]: I1205 21:31:07.243597 4747 scope.go:117] "RemoveContainer" containerID="b0efeab699d7dca75ef0d28c3598fecdc35b0d21a1da1c3f65c182d075010380"
Dec 05 21:31:07 crc kubenswrapper[4747]: I1205 21:31:07.244043 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"
Dec 05 21:31:07 crc kubenswrapper[4747]: E1205 21:31:07.244221 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:31:18 crc kubenswrapper[4747]: I1205 21:31:18.840052 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"
Dec 05 21:31:18 crc kubenswrapper[4747]: E1205 21:31:18.841110 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:31:27 crc kubenswrapper[4747]: I1205 21:31:27.498952 4747 scope.go:117] "RemoveContainer" containerID="90fdec136eb163b008d998f8e2e1476859bef9302486f2af30bc2df1eebd62b9"
Dec 05 21:31:27 crc kubenswrapper[4747]: I1205 21:31:27.523868 4747 scope.go:117] "RemoveContainer" containerID="1ad259ee91b5d23998b3558db8c701dfeaeb20a96e41a00d4c027082881a6831"
Dec 05 21:31:27 crc kubenswrapper[4747]: I1205 21:31:27.567485 4747 scope.go:117] "RemoveContainer" containerID="34d72eb1a55001da8e657d75b07e525324c18c5f5a71d1829fdd0033f9ab8c5d"
Dec 05 21:31:31 crc kubenswrapper[4747]: I1205 21:31:31.840151 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"
Dec 05 21:31:31 crc kubenswrapper[4747]: E1205 21:31:31.840872 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:31:43 crc kubenswrapper[4747]: I1205 21:31:43.840000 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"
Dec 05 21:31:43 crc kubenswrapper[4747]: E1205 21:31:43.841730 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:31:54 crc kubenswrapper[4747]: I1205 21:31:54.839809 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"
Dec 05 21:31:54 crc kubenswrapper[4747]: E1205 21:31:54.840949 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:32:06 crc kubenswrapper[4747]: I1205 21:32:06.839386 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"
Dec 05 21:32:06 crc kubenswrapper[4747]: E1205 21:32:06.840709 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:32:17 crc kubenswrapper[4747]: I1205 21:32:17.840109 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"
Dec 05 21:32:17 crc kubenswrapper[4747]: E1205 21:32:17.842421 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:32:30 crc kubenswrapper[4747]: I1205 21:32:30.840075 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"
Dec 05 21:32:30 crc kubenswrapper[4747]: E1205 21:32:30.841282 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:32:41 crc kubenswrapper[4747]: I1205 21:32:41.840891 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"
Dec 05 21:32:41 crc kubenswrapper[4747]: E1205 21:32:41.842199 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:32:56 crc kubenswrapper[4747]: I1205 21:32:56.840136 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"
Dec 05 21:32:56 crc kubenswrapper[4747]: E1205 21:32:56.841103 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\""
pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:33:10 crc kubenswrapper[4747]: I1205 21:33:10.839326 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" Dec 05 21:33:10 crc kubenswrapper[4747]: E1205 21:33:10.839974 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:33:22 crc kubenswrapper[4747]: I1205 21:33:22.840313 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" Dec 05 21:33:22 crc kubenswrapper[4747]: E1205 21:33:22.841483 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:33:33 crc kubenswrapper[4747]: I1205 21:33:33.839750 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" Dec 05 21:33:33 crc kubenswrapper[4747]: E1205 21:33:33.840820 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:33:48 crc kubenswrapper[4747]: I1205 21:33:48.840427 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" Dec 05 21:33:48 crc kubenswrapper[4747]: E1205 21:33:48.841408 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:34:00 crc kubenswrapper[4747]: I1205 21:34:00.839674 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" Dec 05 21:34:00 crc kubenswrapper[4747]: E1205 21:34:00.840316 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:34:13 crc kubenswrapper[4747]: I1205 21:34:13.840108 4747 
scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" Dec 05 21:34:13 crc kubenswrapper[4747]: E1205 21:34:13.841012 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:34:24 crc kubenswrapper[4747]: I1205 21:34:24.841561 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" Dec 05 21:34:24 crc kubenswrapper[4747]: E1205 21:34:24.842836 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:34:30 crc kubenswrapper[4747]: I1205 21:34:30.725759 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jfxcq"] Dec 05 21:34:30 crc kubenswrapper[4747]: E1205 21:34:30.728438 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79224c99-bda2-4fad-9546-96a275fd4329" containerName="collect-profiles" Dec 05 21:34:30 crc kubenswrapper[4747]: I1205 21:34:30.728658 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="79224c99-bda2-4fad-9546-96a275fd4329" containerName="collect-profiles" Dec 05 21:34:30 crc kubenswrapper[4747]: I1205 21:34:30.729089 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="79224c99-bda2-4fad-9546-96a275fd4329" containerName="collect-profiles" Dec 05 21:34:30 crc kubenswrapper[4747]: I1205 21:34:30.731351 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:30 crc kubenswrapper[4747]: I1205 21:34:30.771035 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jfxcq"] Dec 05 21:34:30 crc kubenswrapper[4747]: I1205 21:34:30.907957 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-catalog-content\") pod \"certified-operators-jfxcq\" (UID: \"ae04e766-7ad5-4fcb-b309-7e1d67d376bb\") " pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:30 crc kubenswrapper[4747]: I1205 21:34:30.908279 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-utilities\") pod \"certified-operators-jfxcq\" (UID: \"ae04e766-7ad5-4fcb-b309-7e1d67d376bb\") " pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:30 crc kubenswrapper[4747]: I1205 21:34:30.908340 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st8h7\" (UniqueName: \"kubernetes.io/projected/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-kube-api-access-st8h7\") pod \"certified-operators-jfxcq\" (UID: \"ae04e766-7ad5-4fcb-b309-7e1d67d376bb\") " pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:31 crc kubenswrapper[4747]: I1205 21:34:31.010186 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-utilities\") pod \"certified-operators-jfxcq\" (UID: \"ae04e766-7ad5-4fcb-b309-7e1d67d376bb\") " pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:31 crc kubenswrapper[4747]: I1205 21:34:31.010259 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st8h7\" (UniqueName: \"kubernetes.io/projected/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-kube-api-access-st8h7\") pod \"certified-operators-jfxcq\" (UID: \"ae04e766-7ad5-4fcb-b309-7e1d67d376bb\") " pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:31 crc kubenswrapper[4747]: I1205 21:34:31.010311 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-catalog-content\") pod \"certified-operators-jfxcq\" (UID: \"ae04e766-7ad5-4fcb-b309-7e1d67d376bb\") " pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:31 crc kubenswrapper[4747]: I1205 21:34:31.010826 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-catalog-content\") pod \"certified-operators-jfxcq\" (UID: \"ae04e766-7ad5-4fcb-b309-7e1d67d376bb\") " pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:31 crc kubenswrapper[4747]: I1205 21:34:31.011197 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-utilities\") pod \"certified-operators-jfxcq\" (UID: \"ae04e766-7ad5-4fcb-b309-7e1d67d376bb\") " pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:31 crc kubenswrapper[4747]: I1205 21:34:31.036978 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-st8h7\" (UniqueName: \"kubernetes.io/projected/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-kube-api-access-st8h7\") pod \"certified-operators-jfxcq\" (UID: \"ae04e766-7ad5-4fcb-b309-7e1d67d376bb\") " pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:31 crc kubenswrapper[4747]: I1205 21:34:31.058229 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:31 crc kubenswrapper[4747]: I1205 21:34:31.543545 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jfxcq"] Dec 05 21:34:32 crc kubenswrapper[4747]: I1205 21:34:32.184975 4747 generic.go:334] "Generic (PLEG): container finished" podID="ae04e766-7ad5-4fcb-b309-7e1d67d376bb" containerID="f8189b1555a52bd1ebd60077bdcc5f60e6e5b9c19bf2e50357f99d090dddca02" exitCode=0 Dec 05 21:34:32 crc kubenswrapper[4747]: I1205 21:34:32.185045 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfxcq" event={"ID":"ae04e766-7ad5-4fcb-b309-7e1d67d376bb","Type":"ContainerDied","Data":"f8189b1555a52bd1ebd60077bdcc5f60e6e5b9c19bf2e50357f99d090dddca02"} Dec 05 21:34:32 crc kubenswrapper[4747]: I1205 21:34:32.185095 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfxcq" event={"ID":"ae04e766-7ad5-4fcb-b309-7e1d67d376bb","Type":"ContainerStarted","Data":"c549a0ac4ddc6d797133577eaef53664caf0b38bd62136f3bb078f01dfd4aa3d"} Dec 05 21:34:32 crc kubenswrapper[4747]: I1205 21:34:32.187994 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:34:33 crc kubenswrapper[4747]: I1205 21:34:33.199024 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfxcq" event={"ID":"ae04e766-7ad5-4fcb-b309-7e1d67d376bb","Type":"ContainerStarted","Data":"d8032283312e30e34d6c9c73621575091f59ed24eb7e05711162f35572241273"} Dec 05 21:34:34 crc kubenswrapper[4747]: I1205 21:34:34.211527 4747 generic.go:334] "Generic (PLEG): container finished" podID="ae04e766-7ad5-4fcb-b309-7e1d67d376bb" containerID="d8032283312e30e34d6c9c73621575091f59ed24eb7e05711162f35572241273" exitCode=0 Dec 05 21:34:34 crc kubenswrapper[4747]: I1205 21:34:34.211567 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfxcq" event={"ID":"ae04e766-7ad5-4fcb-b309-7e1d67d376bb","Type":"ContainerDied","Data":"d8032283312e30e34d6c9c73621575091f59ed24eb7e05711162f35572241273"} Dec 05 21:34:35 crc kubenswrapper[4747]: I1205 21:34:35.237824 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfxcq" event={"ID":"ae04e766-7ad5-4fcb-b309-7e1d67d376bb","Type":"ContainerStarted","Data":"ee11ae5fb2178d6ccb4e3d41a8ec96172a430789798547886c9ac415ed876c5b"} Dec 05 21:34:35 crc kubenswrapper[4747]: I1205 21:34:35.276270 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jfxcq" podStartSLOduration=2.835138894 podStartE2EDuration="5.276241858s" podCreationTimestamp="2025-12-05 21:34:30 +0000 UTC" firstStartedPulling="2025-12-05 21:34:32.187701172 +0000 UTC m=+3142.655008670" lastFinishedPulling="2025-12-05 21:34:34.628804106 +0000 UTC m=+3145.096111634" observedRunningTime="2025-12-05 21:34:35.268511968 +0000 UTC m=+3145.735819516" watchObservedRunningTime="2025-12-05 
21:34:35.276241858 +0000 UTC m=+3145.743549386" Dec 05 21:34:35 crc kubenswrapper[4747]: I1205 21:34:35.840616 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" Dec 05 21:34:35 crc kubenswrapper[4747]: E1205 21:34:35.840996 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:34:41 crc kubenswrapper[4747]: I1205 21:34:41.058338 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:41 crc kubenswrapper[4747]: I1205 21:34:41.058747 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:41 crc kubenswrapper[4747]: I1205 21:34:41.125631 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:41 crc kubenswrapper[4747]: I1205 21:34:41.390707 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:41 crc kubenswrapper[4747]: I1205 21:34:41.437003 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jfxcq"] Dec 05 21:34:43 crc kubenswrapper[4747]: I1205 21:34:43.310756 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jfxcq" podUID="ae04e766-7ad5-4fcb-b309-7e1d67d376bb" containerName="registry-server" containerID="cri-o://ee11ae5fb2178d6ccb4e3d41a8ec96172a430789798547886c9ac415ed876c5b" gracePeriod=2 Dec 05 21:34:44 crc kubenswrapper[4747]: I1205 21:34:44.320385 4747 generic.go:334] "Generic (PLEG): container finished" podID="ae04e766-7ad5-4fcb-b309-7e1d67d376bb" containerID="ee11ae5fb2178d6ccb4e3d41a8ec96172a430789798547886c9ac415ed876c5b" exitCode=0 Dec 05 21:34:44 crc kubenswrapper[4747]: I1205 21:34:44.320452 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfxcq" event={"ID":"ae04e766-7ad5-4fcb-b309-7e1d67d376bb","Type":"ContainerDied","Data":"ee11ae5fb2178d6ccb4e3d41a8ec96172a430789798547886c9ac415ed876c5b"} Dec 05 21:34:44 crc kubenswrapper[4747]: I1205 21:34:44.795831 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:44 crc kubenswrapper[4747]: I1205 21:34:44.952517 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-catalog-content\") pod \"ae04e766-7ad5-4fcb-b309-7e1d67d376bb\" (UID: \"ae04e766-7ad5-4fcb-b309-7e1d67d376bb\") " Dec 05 21:34:44 crc kubenswrapper[4747]: I1205 21:34:44.952652 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st8h7\" (UniqueName: \"kubernetes.io/projected/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-kube-api-access-st8h7\") pod \"ae04e766-7ad5-4fcb-b309-7e1d67d376bb\" (UID: \"ae04e766-7ad5-4fcb-b309-7e1d67d376bb\") " Dec 05 21:34:44 crc kubenswrapper[4747]: I1205 21:34:44.952719 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-utilities\") pod \"ae04e766-7ad5-4fcb-b309-7e1d67d376bb\" (UID: \"ae04e766-7ad5-4fcb-b309-7e1d67d376bb\") " Dec 05 21:34:44 crc kubenswrapper[4747]: I1205 21:34:44.954883 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-utilities" (OuterVolumeSpecName: "utilities") pod "ae04e766-7ad5-4fcb-b309-7e1d67d376bb" (UID: "ae04e766-7ad5-4fcb-b309-7e1d67d376bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:34:44 crc kubenswrapper[4747]: I1205 21:34:44.974202 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-kube-api-access-st8h7" (OuterVolumeSpecName: "kube-api-access-st8h7") pod "ae04e766-7ad5-4fcb-b309-7e1d67d376bb" (UID: "ae04e766-7ad5-4fcb-b309-7e1d67d376bb"). InnerVolumeSpecName "kube-api-access-st8h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:34:45 crc kubenswrapper[4747]: I1205 21:34:45.005544 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae04e766-7ad5-4fcb-b309-7e1d67d376bb" (UID: "ae04e766-7ad5-4fcb-b309-7e1d67d376bb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:34:45 crc kubenswrapper[4747]: I1205 21:34:45.055048 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st8h7\" (UniqueName: \"kubernetes.io/projected/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-kube-api-access-st8h7\") on node \"crc\" DevicePath \"\"" Dec 05 21:34:45 crc kubenswrapper[4747]: I1205 21:34:45.055113 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:34:45 crc kubenswrapper[4747]: I1205 21:34:45.055143 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae04e766-7ad5-4fcb-b309-7e1d67d376bb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:34:45 crc kubenswrapper[4747]: I1205 21:34:45.335157 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfxcq" event={"ID":"ae04e766-7ad5-4fcb-b309-7e1d67d376bb","Type":"ContainerDied","Data":"c549a0ac4ddc6d797133577eaef53664caf0b38bd62136f3bb078f01dfd4aa3d"} Dec 05 21:34:45 crc kubenswrapper[4747]: I1205 21:34:45.335248 4747 scope.go:117] "RemoveContainer" containerID="ee11ae5fb2178d6ccb4e3d41a8ec96172a430789798547886c9ac415ed876c5b" Dec 05 21:34:45 crc kubenswrapper[4747]: I1205 21:34:45.335246 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfxcq" Dec 05 21:34:45 crc kubenswrapper[4747]: I1205 21:34:45.368276 4747 scope.go:117] "RemoveContainer" containerID="d8032283312e30e34d6c9c73621575091f59ed24eb7e05711162f35572241273" Dec 05 21:34:45 crc kubenswrapper[4747]: I1205 21:34:45.391656 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jfxcq"] Dec 05 21:34:45 crc kubenswrapper[4747]: I1205 21:34:45.406525 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jfxcq"] Dec 05 21:34:45 crc kubenswrapper[4747]: I1205 21:34:45.414925 4747 scope.go:117] "RemoveContainer" containerID="f8189b1555a52bd1ebd60077bdcc5f60e6e5b9c19bf2e50357f99d090dddca02" Dec 05 21:34:45 crc kubenswrapper[4747]: I1205 21:34:45.861762 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae04e766-7ad5-4fcb-b309-7e1d67d376bb" path="/var/lib/kubelet/pods/ae04e766-7ad5-4fcb-b309-7e1d67d376bb/volumes" Dec 05 21:34:46 crc kubenswrapper[4747]: I1205 21:34:46.840423 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" Dec 05 21:34:46 crc kubenswrapper[4747]: E1205 21:34:46.840906 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:35:01 crc kubenswrapper[4747]: I1205 21:35:01.840083 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" Dec 05 21:35:01 crc kubenswrapper[4747]: E1205 21:35:01.841548 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.545083 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nhqp2"] Dec 05 21:35:12 crc kubenswrapper[4747]: E1205 21:35:12.546165 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae04e766-7ad5-4fcb-b309-7e1d67d376bb" containerName="extract-content" Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.546190 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae04e766-7ad5-4fcb-b309-7e1d67d376bb" containerName="extract-content" Dec 05 21:35:12 crc kubenswrapper[4747]: E1205 21:35:12.546234 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae04e766-7ad5-4fcb-b309-7e1d67d376bb" containerName="registry-server" Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.546242 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae04e766-7ad5-4fcb-b309-7e1d67d376bb" containerName="registry-server" Dec 05 21:35:12 crc kubenswrapper[4747]: E1205 21:35:12.546267 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae04e766-7ad5-4fcb-b309-7e1d67d376bb" containerName="extract-utilities" Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.546275 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae04e766-7ad5-4fcb-b309-7e1d67d376bb" containerName="extract-utilities" Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.546447 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae04e766-7ad5-4fcb-b309-7e1d67d376bb" containerName="registry-server" Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.547780 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.557756 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhqp2"] Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.742516 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677bbe90-6ea0-420c-94f2-462dfa2a0268-utilities\") pod \"redhat-marketplace-nhqp2\" (UID: \"677bbe90-6ea0-420c-94f2-462dfa2a0268\") " pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.743467 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmgz7\" (UniqueName: \"kubernetes.io/projected/677bbe90-6ea0-420c-94f2-462dfa2a0268-kube-api-access-hmgz7\") pod \"redhat-marketplace-nhqp2\" (UID: \"677bbe90-6ea0-420c-94f2-462dfa2a0268\") " pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.743530 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677bbe90-6ea0-420c-94f2-462dfa2a0268-catalog-content\") pod \"redhat-marketplace-nhqp2\" (UID: \"677bbe90-6ea0-420c-94f2-462dfa2a0268\") " pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.845375 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677bbe90-6ea0-420c-94f2-462dfa2a0268-utilities\") pod \"redhat-marketplace-nhqp2\" (UID: \"677bbe90-6ea0-420c-94f2-462dfa2a0268\") " pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.845956 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677bbe90-6ea0-420c-94f2-462dfa2a0268-utilities\") pod \"redhat-marketplace-nhqp2\" (UID: \"677bbe90-6ea0-420c-94f2-462dfa2a0268\") " pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.846326 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmgz7\" (UniqueName: \"kubernetes.io/projected/677bbe90-6ea0-420c-94f2-462dfa2a0268-kube-api-access-hmgz7\") pod \"redhat-marketplace-nhqp2\" (UID: \"677bbe90-6ea0-420c-94f2-462dfa2a0268\") " pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.846392 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677bbe90-6ea0-420c-94f2-462dfa2a0268-catalog-content\") pod \"redhat-marketplace-nhqp2\" (UID: \"677bbe90-6ea0-420c-94f2-462dfa2a0268\") " pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.846925 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677bbe90-6ea0-420c-94f2-462dfa2a0268-catalog-content\") pod \"redhat-marketplace-nhqp2\" (UID: \"677bbe90-6ea0-420c-94f2-462dfa2a0268\") " pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:12 crc kubenswrapper[4747]: I1205 21:35:12.878807 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hmgz7\" (UniqueName: \"kubernetes.io/projected/677bbe90-6ea0-420c-94f2-462dfa2a0268-kube-api-access-hmgz7\") pod \"redhat-marketplace-nhqp2\" (UID: \"677bbe90-6ea0-420c-94f2-462dfa2a0268\") " pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:13 crc kubenswrapper[4747]: I1205 21:35:13.166352 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:13 crc kubenswrapper[4747]: I1205 21:35:13.600187 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhqp2"] Dec 05 21:35:14 crc kubenswrapper[4747]: I1205 21:35:14.578232 4747 generic.go:334] "Generic (PLEG): container finished" podID="677bbe90-6ea0-420c-94f2-462dfa2a0268" containerID="4ec614cce47c126b620d0ab598cc2be926af4a4a870221c4b2ce3258834155ed" exitCode=0 Dec 05 21:35:14 crc kubenswrapper[4747]: I1205 21:35:14.578277 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhqp2" event={"ID":"677bbe90-6ea0-420c-94f2-462dfa2a0268","Type":"ContainerDied","Data":"4ec614cce47c126b620d0ab598cc2be926af4a4a870221c4b2ce3258834155ed"} Dec 05 21:35:14 crc kubenswrapper[4747]: I1205 21:35:14.578308 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhqp2" event={"ID":"677bbe90-6ea0-420c-94f2-462dfa2a0268","Type":"ContainerStarted","Data":"963c7752b2700c0086649ebcfec61476c1b38b6e8410b32de80d32b4c139256f"} Dec 05 21:35:14 crc kubenswrapper[4747]: I1205 21:35:14.840228 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" Dec 05 21:35:14 crc kubenswrapper[4747]: E1205 21:35:14.840648 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:35:15 crc kubenswrapper[4747]: I1205 21:35:15.588617 4747 generic.go:334] "Generic (PLEG): container finished" podID="677bbe90-6ea0-420c-94f2-462dfa2a0268" containerID="e6078372be28009ee1141c19494342df171e48843d944a1981347330aa2bcdf0" exitCode=0 Dec 05 21:35:15 crc kubenswrapper[4747]: I1205 21:35:15.588681 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhqp2" event={"ID":"677bbe90-6ea0-420c-94f2-462dfa2a0268","Type":"ContainerDied","Data":"e6078372be28009ee1141c19494342df171e48843d944a1981347330aa2bcdf0"} Dec 05 21:35:16 crc kubenswrapper[4747]: I1205 21:35:16.599004 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhqp2" event={"ID":"677bbe90-6ea0-420c-94f2-462dfa2a0268","Type":"ContainerStarted","Data":"5c044c54c5d9941a2b8b0ebd46d2aba84d35ef5f2e91b58d744ff607736cd9fa"} Dec 05 21:35:16 crc kubenswrapper[4747]: I1205 21:35:16.622052 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nhqp2" podStartSLOduration=3.211541536 podStartE2EDuration="4.62202367s" podCreationTimestamp="2025-12-05 21:35:12 +0000 UTC" firstStartedPulling="2025-12-05 21:35:14.580446515 +0000 UTC m=+3185.047754013" lastFinishedPulling="2025-12-05 
21:35:15.990928619 +0000 UTC m=+3186.458236147" observedRunningTime="2025-12-05 21:35:16.617109469 +0000 UTC m=+3187.084416967" watchObservedRunningTime="2025-12-05 21:35:16.62202367 +0000 UTC m=+3187.089331198" Dec 05 21:35:23 crc kubenswrapper[4747]: I1205 21:35:23.167035 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:23 crc kubenswrapper[4747]: I1205 21:35:23.167715 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:23 crc kubenswrapper[4747]: I1205 21:35:23.227795 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:23 crc kubenswrapper[4747]: I1205 21:35:23.713770 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:23 crc kubenswrapper[4747]: I1205 21:35:23.769036 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhqp2"] Dec 05 21:35:25 crc kubenswrapper[4747]: I1205 21:35:25.692983 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nhqp2" podUID="677bbe90-6ea0-420c-94f2-462dfa2a0268" containerName="registry-server" containerID="cri-o://5c044c54c5d9941a2b8b0ebd46d2aba84d35ef5f2e91b58d744ff607736cd9fa" gracePeriod=2 Dec 05 21:35:25 crc kubenswrapper[4747]: I1205 21:35:25.840947 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" Dec 05 21:35:25 crc kubenswrapper[4747]: E1205 21:35:25.841900 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.652413 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.654298 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677bbe90-6ea0-420c-94f2-462dfa2a0268-utilities\") pod \"677bbe90-6ea0-420c-94f2-462dfa2a0268\" (UID: \"677bbe90-6ea0-420c-94f2-462dfa2a0268\") " Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.654383 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677bbe90-6ea0-420c-94f2-462dfa2a0268-catalog-content\") pod \"677bbe90-6ea0-420c-94f2-462dfa2a0268\" (UID: \"677bbe90-6ea0-420c-94f2-462dfa2a0268\") " Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.654421 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmgz7\" (UniqueName: \"kubernetes.io/projected/677bbe90-6ea0-420c-94f2-462dfa2a0268-kube-api-access-hmgz7\") pod \"677bbe90-6ea0-420c-94f2-462dfa2a0268\" (UID: \"677bbe90-6ea0-420c-94f2-462dfa2a0268\") " Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.659666 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677bbe90-6ea0-420c-94f2-462dfa2a0268-utilities" (OuterVolumeSpecName: "utilities") pod "677bbe90-6ea0-420c-94f2-462dfa2a0268" (UID: "677bbe90-6ea0-420c-94f2-462dfa2a0268"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.661578 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/677bbe90-6ea0-420c-94f2-462dfa2a0268-kube-api-access-hmgz7" (OuterVolumeSpecName: "kube-api-access-hmgz7") pod "677bbe90-6ea0-420c-94f2-462dfa2a0268" (UID: "677bbe90-6ea0-420c-94f2-462dfa2a0268"). InnerVolumeSpecName "kube-api-access-hmgz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.699938 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677bbe90-6ea0-420c-94f2-462dfa2a0268-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "677bbe90-6ea0-420c-94f2-462dfa2a0268" (UID: "677bbe90-6ea0-420c-94f2-462dfa2a0268"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.702828 4747 generic.go:334] "Generic (PLEG): container finished" podID="677bbe90-6ea0-420c-94f2-462dfa2a0268" containerID="5c044c54c5d9941a2b8b0ebd46d2aba84d35ef5f2e91b58d744ff607736cd9fa" exitCode=0 Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.702878 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhqp2" event={"ID":"677bbe90-6ea0-420c-94f2-462dfa2a0268","Type":"ContainerDied","Data":"5c044c54c5d9941a2b8b0ebd46d2aba84d35ef5f2e91b58d744ff607736cd9fa"} Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.702907 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhqp2" event={"ID":"677bbe90-6ea0-420c-94f2-462dfa2a0268","Type":"ContainerDied","Data":"963c7752b2700c0086649ebcfec61476c1b38b6e8410b32de80d32b4c139256f"} Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.702910 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhqp2" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.702926 4747 scope.go:117] "RemoveContainer" containerID="5c044c54c5d9941a2b8b0ebd46d2aba84d35ef5f2e91b58d744ff607736cd9fa" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.731073 4747 scope.go:117] "RemoveContainer" containerID="e6078372be28009ee1141c19494342df171e48843d944a1981347330aa2bcdf0" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.733283 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhqp2"] Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.739011 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhqp2"] Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.755421 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677bbe90-6ea0-420c-94f2-462dfa2a0268-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.755454 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677bbe90-6ea0-420c-94f2-462dfa2a0268-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.755464 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmgz7\" (UniqueName: \"kubernetes.io/projected/677bbe90-6ea0-420c-94f2-462dfa2a0268-kube-api-access-hmgz7\") on node \"crc\" DevicePath \"\"" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.758040 4747 scope.go:117] "RemoveContainer" containerID="4ec614cce47c126b620d0ab598cc2be926af4a4a870221c4b2ce3258834155ed" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.772675 4747 scope.go:117] "RemoveContainer" containerID="5c044c54c5d9941a2b8b0ebd46d2aba84d35ef5f2e91b58d744ff607736cd9fa" Dec 05 21:35:26 crc kubenswrapper[4747]: E1205 21:35:26.772979 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c044c54c5d9941a2b8b0ebd46d2aba84d35ef5f2e91b58d744ff607736cd9fa\": container with ID starting with 5c044c54c5d9941a2b8b0ebd46d2aba84d35ef5f2e91b58d744ff607736cd9fa not found: ID does not exist" containerID="5c044c54c5d9941a2b8b0ebd46d2aba84d35ef5f2e91b58d744ff607736cd9fa" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.773008 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c044c54c5d9941a2b8b0ebd46d2aba84d35ef5f2e91b58d744ff607736cd9fa"} err="failed to get container status \"5c044c54c5d9941a2b8b0ebd46d2aba84d35ef5f2e91b58d744ff607736cd9fa\": rpc error: code = NotFound desc = could not find container \"5c044c54c5d9941a2b8b0ebd46d2aba84d35ef5f2e91b58d744ff607736cd9fa\": container with ID starting with 5c044c54c5d9941a2b8b0ebd46d2aba84d35ef5f2e91b58d744ff607736cd9fa not found: ID does not exist" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.773028 4747 scope.go:117] "RemoveContainer" containerID="e6078372be28009ee1141c19494342df171e48843d944a1981347330aa2bcdf0" Dec 05 21:35:26 crc kubenswrapper[4747]: E1205 21:35:26.773246 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6078372be28009ee1141c19494342df171e48843d944a1981347330aa2bcdf0\": container with ID starting with e6078372be28009ee1141c19494342df171e48843d944a1981347330aa2bcdf0 not found: ID 
does not exist" containerID="e6078372be28009ee1141c19494342df171e48843d944a1981347330aa2bcdf0" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.773268 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6078372be28009ee1141c19494342df171e48843d944a1981347330aa2bcdf0"} err="failed to get container status \"e6078372be28009ee1141c19494342df171e48843d944a1981347330aa2bcdf0\": rpc error: code = NotFound desc = could not find container \"e6078372be28009ee1141c19494342df171e48843d944a1981347330aa2bcdf0\": container with ID starting with e6078372be28009ee1141c19494342df171e48843d944a1981347330aa2bcdf0 not found: ID does not exist" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.773281 4747 scope.go:117] "RemoveContainer" containerID="4ec614cce47c126b620d0ab598cc2be926af4a4a870221c4b2ce3258834155ed" Dec 05 21:35:26 crc kubenswrapper[4747]: E1205 21:35:26.773563 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ec614cce47c126b620d0ab598cc2be926af4a4a870221c4b2ce3258834155ed\": container with ID starting with 4ec614cce47c126b620d0ab598cc2be926af4a4a870221c4b2ce3258834155ed not found: ID does not exist" containerID="4ec614cce47c126b620d0ab598cc2be926af4a4a870221c4b2ce3258834155ed" Dec 05 21:35:26 crc kubenswrapper[4747]: I1205 21:35:26.773601 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec614cce47c126b620d0ab598cc2be926af4a4a870221c4b2ce3258834155ed"} err="failed to get container status \"4ec614cce47c126b620d0ab598cc2be926af4a4a870221c4b2ce3258834155ed\": rpc error: code = NotFound desc = could not find container \"4ec614cce47c126b620d0ab598cc2be926af4a4a870221c4b2ce3258834155ed\": container with ID starting with 4ec614cce47c126b620d0ab598cc2be926af4a4a870221c4b2ce3258834155ed not found: ID does not exist" Dec 05 21:35:27 crc kubenswrapper[4747]: I1205 21:35:27.858208 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="677bbe90-6ea0-420c-94f2-462dfa2a0268" path="/var/lib/kubelet/pods/677bbe90-6ea0-420c-94f2-462dfa2a0268/volumes" Dec 05 21:35:37 crc kubenswrapper[4747]: I1205 21:35:37.839881 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" Dec 05 21:35:37 crc kubenswrapper[4747]: E1205 21:35:37.840823 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:35:50 crc kubenswrapper[4747]: I1205 21:35:50.840303 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228" Dec 05 21:35:50 crc kubenswrapper[4747]: E1205 21:35:50.841431 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:36:04 crc 
kubenswrapper[4747]: I1205 21:36:04.840524 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"
Dec 05 21:36:04 crc kubenswrapper[4747]: E1205 21:36:04.842025 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:36:17 crc kubenswrapper[4747]: I1205 21:36:17.840357 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"
Dec 05 21:36:18 crc kubenswrapper[4747]: I1205 21:36:18.225723 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"d942e8f20f4c086294c518d1fea7b5831ef33c35976c2a9c4cfa97a8cb810ba0"}
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.229847 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hgz94"]
Dec 05 21:37:19 crc kubenswrapper[4747]: E1205 21:37:19.230992 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677bbe90-6ea0-420c-94f2-462dfa2a0268" containerName="registry-server"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.231011 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="677bbe90-6ea0-420c-94f2-462dfa2a0268" containerName="registry-server"
Dec 05 21:37:19 crc kubenswrapper[4747]: E1205 21:37:19.231052 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677bbe90-6ea0-420c-94f2-462dfa2a0268" containerName="extract-utilities"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.231061 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="677bbe90-6ea0-420c-94f2-462dfa2a0268" containerName="extract-utilities"
Dec 05 21:37:19 crc kubenswrapper[4747]: E1205 21:37:19.231078 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677bbe90-6ea0-420c-94f2-462dfa2a0268" containerName="extract-content"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.231085 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="677bbe90-6ea0-420c-94f2-462dfa2a0268" containerName="extract-content"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.231404 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="677bbe90-6ea0-420c-94f2-462dfa2a0268" containerName="registry-server"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.234145 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.261016 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hgz94"]
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.435287 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-catalog-content\") pod \"redhat-operators-hgz94\" (UID: \"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44\") " pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.435331 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65pxr\" (UniqueName: \"kubernetes.io/projected/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-kube-api-access-65pxr\") pod \"redhat-operators-hgz94\" (UID: \"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44\") " pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.435654 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-utilities\") pod \"redhat-operators-hgz94\" (UID: \"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44\") " pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.536940 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65pxr\" (UniqueName: \"kubernetes.io/projected/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-kube-api-access-65pxr\") pod \"redhat-operators-hgz94\" (UID: \"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44\") " pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.537069 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-utilities\") pod \"redhat-operators-hgz94\" (UID: \"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44\") " pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.537122 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-catalog-content\") pod \"redhat-operators-hgz94\" (UID: \"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44\") " pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.537689 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-catalog-content\") pod \"redhat-operators-hgz94\" (UID: \"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44\") " pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.537864 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-utilities\") pod \"redhat-operators-hgz94\" (UID: \"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44\") " pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.559197 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65pxr\" (UniqueName: \"kubernetes.io/projected/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-kube-api-access-65pxr\") pod \"redhat-operators-hgz94\" (UID: \"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44\") " pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.563796 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:19 crc kubenswrapper[4747]: I1205 21:37:19.986087 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hgz94"]
Dec 05 21:37:20 crc kubenswrapper[4747]: I1205 21:37:20.776003 4747 generic.go:334] "Generic (PLEG): container finished" podID="8e1cc0d3-85ff-43ee-a470-1ceb994e0c44" containerID="0cc4d589237635d4055e94154cc25e6875004b6f928393010ff503d48fcb5318" exitCode=0
Dec 05 21:37:20 crc kubenswrapper[4747]: I1205 21:37:20.776100 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgz94" event={"ID":"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44","Type":"ContainerDied","Data":"0cc4d589237635d4055e94154cc25e6875004b6f928393010ff503d48fcb5318"}
Dec 05 21:37:20 crc kubenswrapper[4747]: I1205 21:37:20.776263 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgz94" event={"ID":"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44","Type":"ContainerStarted","Data":"37e6a6da7615113c1a483f3945632346ca9ed526a3ede5c0e23602e6e8ac3a76"}
Dec 05 21:37:21 crc kubenswrapper[4747]: I1205 21:37:21.784040 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgz94" event={"ID":"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44","Type":"ContainerStarted","Data":"b773e1db7ecbff6dc368ad3746bac137e26b80de8d5d14fa3b53226ae43c3af3"}
Dec 05 21:37:22 crc kubenswrapper[4747]: I1205 21:37:22.794573 4747 generic.go:334] "Generic (PLEG): container finished" podID="8e1cc0d3-85ff-43ee-a470-1ceb994e0c44" containerID="b773e1db7ecbff6dc368ad3746bac137e26b80de8d5d14fa3b53226ae43c3af3" exitCode=0
Dec 05 21:37:22 crc kubenswrapper[4747]: I1205 21:37:22.794636 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgz94" event={"ID":"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44","Type":"ContainerDied","Data":"b773e1db7ecbff6dc368ad3746bac137e26b80de8d5d14fa3b53226ae43c3af3"}
Dec 05 21:37:23 crc kubenswrapper[4747]: I1205 21:37:23.802865 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgz94" event={"ID":"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44","Type":"ContainerStarted","Data":"7f04f8e90238ed8818ad02ef790fd79aa5a60a2c17b401c9772a64d838f2bb1c"}
Dec 05 21:37:23 crc kubenswrapper[4747]: I1205 21:37:23.819747 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hgz94" podStartSLOduration=2.362992246 podStartE2EDuration="4.819728274s" podCreationTimestamp="2025-12-05 21:37:19 +0000 UTC" firstStartedPulling="2025-12-05 21:37:20.77889114 +0000 UTC m=+3311.246198638" lastFinishedPulling="2025-12-05 21:37:23.235627178 +0000 UTC m=+3313.702934666" observedRunningTime="2025-12-05 21:37:23.817439708 +0000 UTC m=+3314.284747216" watchObservedRunningTime="2025-12-05 21:37:23.819728274 +0000 UTC m=+3314.287035762"
Dec 05 21:37:29 crc kubenswrapper[4747]: I1205 21:37:29.564808 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:29 crc kubenswrapper[4747]: I1205 21:37:29.565292 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:29 crc kubenswrapper[4747]: I1205 21:37:29.614792 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:29 crc kubenswrapper[4747]: I1205 21:37:29.899891 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:29 crc kubenswrapper[4747]: I1205 21:37:29.945341 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hgz94"]
Dec 05 21:37:31 crc kubenswrapper[4747]: I1205 21:37:31.865778 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hgz94" podUID="8e1cc0d3-85ff-43ee-a470-1ceb994e0c44" containerName="registry-server" containerID="cri-o://7f04f8e90238ed8818ad02ef790fd79aa5a60a2c17b401c9772a64d838f2bb1c" gracePeriod=2
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.257079 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.427186 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-utilities\") pod \"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44\" (UID: \"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44\") "
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.427377 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-catalog-content\") pod \"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44\" (UID: \"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44\") "
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.427465 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65pxr\" (UniqueName: \"kubernetes.io/projected/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-kube-api-access-65pxr\") pod \"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44\" (UID: \"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44\") "
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.428933 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-utilities" (OuterVolumeSpecName: "utilities") pod "8e1cc0d3-85ff-43ee-a470-1ceb994e0c44" (UID: "8e1cc0d3-85ff-43ee-a470-1ceb994e0c44"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.434013 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-kube-api-access-65pxr" (OuterVolumeSpecName: "kube-api-access-65pxr") pod "8e1cc0d3-85ff-43ee-a470-1ceb994e0c44" (UID: "8e1cc0d3-85ff-43ee-a470-1ceb994e0c44"). InnerVolumeSpecName "kube-api-access-65pxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.529682 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65pxr\" (UniqueName: \"kubernetes.io/projected/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-kube-api-access-65pxr\") on node \"crc\" DevicePath \"\""
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.529715 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.874348 4747 generic.go:334] "Generic (PLEG): container finished" podID="8e1cc0d3-85ff-43ee-a470-1ceb994e0c44" containerID="7f04f8e90238ed8818ad02ef790fd79aa5a60a2c17b401c9772a64d838f2bb1c" exitCode=0
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.874432 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hgz94"
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.874467 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgz94" event={"ID":"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44","Type":"ContainerDied","Data":"7f04f8e90238ed8818ad02ef790fd79aa5a60a2c17b401c9772a64d838f2bb1c"}
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.874934 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgz94" event={"ID":"8e1cc0d3-85ff-43ee-a470-1ceb994e0c44","Type":"ContainerDied","Data":"37e6a6da7615113c1a483f3945632346ca9ed526a3ede5c0e23602e6e8ac3a76"}
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.874967 4747 scope.go:117] "RemoveContainer" containerID="7f04f8e90238ed8818ad02ef790fd79aa5a60a2c17b401c9772a64d838f2bb1c"
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.907885 4747 scope.go:117] "RemoveContainer" containerID="b773e1db7ecbff6dc368ad3746bac137e26b80de8d5d14fa3b53226ae43c3af3"
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.941653 4747 scope.go:117] "RemoveContainer" containerID="0cc4d589237635d4055e94154cc25e6875004b6f928393010ff503d48fcb5318"
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.978515 4747 scope.go:117] "RemoveContainer" containerID="7f04f8e90238ed8818ad02ef790fd79aa5a60a2c17b401c9772a64d838f2bb1c"
Dec 05 21:37:32 crc kubenswrapper[4747]: E1205 21:37:32.979142 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f04f8e90238ed8818ad02ef790fd79aa5a60a2c17b401c9772a64d838f2bb1c\": container with ID starting with 7f04f8e90238ed8818ad02ef790fd79aa5a60a2c17b401c9772a64d838f2bb1c not found: ID does not exist" containerID="7f04f8e90238ed8818ad02ef790fd79aa5a60a2c17b401c9772a64d838f2bb1c"
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.979207 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f04f8e90238ed8818ad02ef790fd79aa5a60a2c17b401c9772a64d838f2bb1c"} err="failed to get container status \"7f04f8e90238ed8818ad02ef790fd79aa5a60a2c17b401c9772a64d838f2bb1c\": rpc error: code = NotFound desc = could not find container \"7f04f8e90238ed8818ad02ef790fd79aa5a60a2c17b401c9772a64d838f2bb1c\": container with ID starting with 7f04f8e90238ed8818ad02ef790fd79aa5a60a2c17b401c9772a64d838f2bb1c not found: ID does not exist"
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.979246 4747 scope.go:117] "RemoveContainer" containerID="b773e1db7ecbff6dc368ad3746bac137e26b80de8d5d14fa3b53226ae43c3af3"
Dec 05 21:37:32 crc kubenswrapper[4747]: E1205 21:37:32.980048 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b773e1db7ecbff6dc368ad3746bac137e26b80de8d5d14fa3b53226ae43c3af3\": container with ID starting with b773e1db7ecbff6dc368ad3746bac137e26b80de8d5d14fa3b53226ae43c3af3 not found: ID does not exist" containerID="b773e1db7ecbff6dc368ad3746bac137e26b80de8d5d14fa3b53226ae43c3af3"
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.980093 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b773e1db7ecbff6dc368ad3746bac137e26b80de8d5d14fa3b53226ae43c3af3"} err="failed to get container status \"b773e1db7ecbff6dc368ad3746bac137e26b80de8d5d14fa3b53226ae43c3af3\": rpc error: code = NotFound desc = could not find container \"b773e1db7ecbff6dc368ad3746bac137e26b80de8d5d14fa3b53226ae43c3af3\": container with ID starting with b773e1db7ecbff6dc368ad3746bac137e26b80de8d5d14fa3b53226ae43c3af3 not found: ID does not exist"
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.980118 4747 scope.go:117] "RemoveContainer" containerID="0cc4d589237635d4055e94154cc25e6875004b6f928393010ff503d48fcb5318"
Dec 05 21:37:32 crc kubenswrapper[4747]: E1205 21:37:32.981083 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc4d589237635d4055e94154cc25e6875004b6f928393010ff503d48fcb5318\": container with ID starting with 0cc4d589237635d4055e94154cc25e6875004b6f928393010ff503d48fcb5318 not found: ID does not exist" containerID="0cc4d589237635d4055e94154cc25e6875004b6f928393010ff503d48fcb5318"
Dec 05 21:37:32 crc kubenswrapper[4747]: I1205 21:37:32.981116 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc4d589237635d4055e94154cc25e6875004b6f928393010ff503d48fcb5318"} err="failed to get container status \"0cc4d589237635d4055e94154cc25e6875004b6f928393010ff503d48fcb5318\": rpc error: code = NotFound desc = could not find container \"0cc4d589237635d4055e94154cc25e6875004b6f928393010ff503d48fcb5318\": container with ID starting with 0cc4d589237635d4055e94154cc25e6875004b6f928393010ff503d48fcb5318 not found: ID does not exist"
Dec 05 21:37:33 crc kubenswrapper[4747]: I1205 21:37:33.459219 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e1cc0d3-85ff-43ee-a470-1ceb994e0c44" (UID: "8e1cc0d3-85ff-43ee-a470-1ceb994e0c44"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:37:33 crc kubenswrapper[4747]: I1205 21:37:33.513246 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hgz94"]
Dec 05 21:37:33 crc kubenswrapper[4747]: I1205 21:37:33.526270 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hgz94"]
Dec 05 21:37:33 crc kubenswrapper[4747]: I1205 21:37:33.542313 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 21:37:33 crc kubenswrapper[4747]: I1205 21:37:33.851107 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e1cc0d3-85ff-43ee-a470-1ceb994e0c44" path="/var/lib/kubelet/pods/8e1cc0d3-85ff-43ee-a470-1ceb994e0c44/volumes"
Dec 05 21:38:36 crc kubenswrapper[4747]: I1205 21:38:36.222246 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:38:36 crc kubenswrapper[4747]: I1205 21:38:36.223026 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:38:55 crc kubenswrapper[4747]: I1205 21:38:55.784754 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kw9qw"]
Dec 05 21:38:55 crc kubenswrapper[4747]: E1205 21:38:55.786926 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1cc0d3-85ff-43ee-a470-1ceb994e0c44" containerName="registry-server"
Dec 05 21:38:55 crc kubenswrapper[4747]: I1205 21:38:55.786959 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1cc0d3-85ff-43ee-a470-1ceb994e0c44" containerName="registry-server"
Dec 05 21:38:55 crc kubenswrapper[4747]: E1205 21:38:55.787041 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1cc0d3-85ff-43ee-a470-1ceb994e0c44" containerName="extract-content"
Dec 05 21:38:55 crc kubenswrapper[4747]: I1205 21:38:55.787054 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1cc0d3-85ff-43ee-a470-1ceb994e0c44" containerName="extract-content"
Dec 05 21:38:55 crc kubenswrapper[4747]: E1205 21:38:55.787086 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1cc0d3-85ff-43ee-a470-1ceb994e0c44" containerName="extract-utilities"
Dec 05 21:38:55 crc kubenswrapper[4747]: I1205 21:38:55.787099 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1cc0d3-85ff-43ee-a470-1ceb994e0c44" containerName="extract-utilities"
Dec 05 21:38:55 crc kubenswrapper[4747]: I1205 21:38:55.787364 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e1cc0d3-85ff-43ee-a470-1ceb994e0c44" containerName="registry-server"
Dec 05 21:38:55 crc kubenswrapper[4747]: I1205 21:38:55.789700 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:38:55 crc kubenswrapper[4747]: I1205 21:38:55.806621 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kw9qw"]
Dec 05 21:38:55 crc kubenswrapper[4747]: I1205 21:38:55.891394 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-catalog-content\") pod \"community-operators-kw9qw\" (UID: \"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f\") " pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:38:55 crc kubenswrapper[4747]: I1205 21:38:55.891708 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbbv8\" (UniqueName: \"kubernetes.io/projected/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-kube-api-access-bbbv8\") pod \"community-operators-kw9qw\" (UID: \"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f\") " pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:38:55 crc kubenswrapper[4747]: I1205 21:38:55.892031 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-utilities\") pod \"community-operators-kw9qw\" (UID: \"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f\") " pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:38:55 crc kubenswrapper[4747]: I1205 21:38:55.993620 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-catalog-content\") pod \"community-operators-kw9qw\" (UID: \"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f\") " pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:38:55 crc kubenswrapper[4747]: I1205 21:38:55.993705 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbbv8\" (UniqueName: \"kubernetes.io/projected/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-kube-api-access-bbbv8\") pod \"community-operators-kw9qw\" (UID: \"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f\") " pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:38:55 crc kubenswrapper[4747]: I1205 21:38:55.993796 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-utilities\") pod \"community-operators-kw9qw\" (UID: \"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f\") " pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:38:55 crc kubenswrapper[4747]: I1205 21:38:55.994185 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-catalog-content\") pod \"community-operators-kw9qw\" (UID: \"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f\") " pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:38:55 crc kubenswrapper[4747]: I1205 21:38:55.994345 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-utilities\") pod \"community-operators-kw9qw\" (UID: \"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f\") " pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:38:56 crc kubenswrapper[4747]: I1205 21:38:56.012386 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbbv8\" (UniqueName: \"kubernetes.io/projected/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-kube-api-access-bbbv8\") pod \"community-operators-kw9qw\" (UID: \"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f\") " pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:38:56 crc kubenswrapper[4747]: I1205 21:38:56.124368 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:38:56 crc kubenswrapper[4747]: I1205 21:38:56.408551 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kw9qw"]
Dec 05 21:38:56 crc kubenswrapper[4747]: I1205 21:38:56.656435 4747 generic.go:334] "Generic (PLEG): container finished" podID="3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f" containerID="62e1a93e15d13503330a8dda91872b6e42bab0ae29e631e13c1c55e5487119ae" exitCode=0
Dec 05 21:38:56 crc kubenswrapper[4747]: I1205 21:38:56.656484 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw9qw" event={"ID":"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f","Type":"ContainerDied","Data":"62e1a93e15d13503330a8dda91872b6e42bab0ae29e631e13c1c55e5487119ae"}
Dec 05 21:38:56 crc kubenswrapper[4747]: I1205 21:38:56.656527 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw9qw" event={"ID":"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f","Type":"ContainerStarted","Data":"ec033c2ee12c61bf84db2132278fa177f4b2b39ea3f9f1607ac7391c7c35c2ae"}
Dec 05 21:38:57 crc kubenswrapper[4747]: I1205 21:38:57.668728 4747 generic.go:334] "Generic (PLEG): container finished" podID="3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f" containerID="831c5cc0fa93decc9a4113b3f548504c4da81c447ffdf9598876fc87f3dea214" exitCode=0
Dec 05 21:38:57 crc kubenswrapper[4747]: I1205 21:38:57.668843 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw9qw" event={"ID":"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f","Type":"ContainerDied","Data":"831c5cc0fa93decc9a4113b3f548504c4da81c447ffdf9598876fc87f3dea214"}
Dec 05 21:38:58 crc kubenswrapper[4747]: I1205 21:38:58.678675 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw9qw" event={"ID":"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f","Type":"ContainerStarted","Data":"8493c2019fd8b45a4e697fa17e950ab1a66912ae84b120e27f2974b58c0afbe0"}
Dec 05 21:38:58 crc kubenswrapper[4747]: I1205 21:38:58.707743 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kw9qw" podStartSLOduration=2.31991738 podStartE2EDuration="3.707719217s" podCreationTimestamp="2025-12-05 21:38:55 +0000 UTC" firstStartedPulling="2025-12-05 21:38:56.658147936 +0000 UTC m=+3407.125455424" lastFinishedPulling="2025-12-05 21:38:58.045949733 +0000 UTC m=+3408.513257261" observedRunningTime="2025-12-05 21:38:58.697947657 +0000 UTC m=+3409.165255185" watchObservedRunningTime="2025-12-05 21:38:58.707719217 +0000 UTC m=+3409.175026725"
Dec 05 21:39:06 crc kubenswrapper[4747]: I1205 21:39:06.125456 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:39:06 crc kubenswrapper[4747]: I1205 21:39:06.126441 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:39:06 crc kubenswrapper[4747]: I1205 21:39:06.193140 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:39:06 crc kubenswrapper[4747]: I1205 21:39:06.227598 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:39:06 crc kubenswrapper[4747]: I1205 21:39:06.227666 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:39:06 crc kubenswrapper[4747]: I1205 21:39:06.804245 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:39:06 crc kubenswrapper[4747]: I1205 21:39:06.874897 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kw9qw"]
Dec 05 21:39:08 crc kubenswrapper[4747]: I1205 21:39:08.769068 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kw9qw" podUID="3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f" containerName="registry-server" containerID="cri-o://8493c2019fd8b45a4e697fa17e950ab1a66912ae84b120e27f2974b58c0afbe0" gracePeriod=2
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.747391 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.816423 4747 generic.go:334] "Generic (PLEG): container finished" podID="3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f" containerID="8493c2019fd8b45a4e697fa17e950ab1a66912ae84b120e27f2974b58c0afbe0" exitCode=0
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.816490 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw9qw" event={"ID":"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f","Type":"ContainerDied","Data":"8493c2019fd8b45a4e697fa17e950ab1a66912ae84b120e27f2974b58c0afbe0"}
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.816530 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw9qw" event={"ID":"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f","Type":"ContainerDied","Data":"ec033c2ee12c61bf84db2132278fa177f4b2b39ea3f9f1607ac7391c7c35c2ae"}
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.816558 4747 scope.go:117] "RemoveContainer" containerID="8493c2019fd8b45a4e697fa17e950ab1a66912ae84b120e27f2974b58c0afbe0"
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.816823 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kw9qw"
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.819075 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-catalog-content\") pod \"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f\" (UID: \"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f\") "
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.819133 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-utilities\") pod \"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f\" (UID: \"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f\") "
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.819194 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbbv8\" (UniqueName: \"kubernetes.io/projected/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-kube-api-access-bbbv8\") pod \"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f\" (UID: \"3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f\") "
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.824329 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-utilities" (OuterVolumeSpecName: "utilities") pod "3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f" (UID: "3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.828876 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-kube-api-access-bbbv8" (OuterVolumeSpecName: "kube-api-access-bbbv8") pod "3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f" (UID: "3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f"). InnerVolumeSpecName "kube-api-access-bbbv8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.864995 4747 scope.go:117] "RemoveContainer" containerID="831c5cc0fa93decc9a4113b3f548504c4da81c447ffdf9598876fc87f3dea214"
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.889821 4747 scope.go:117] "RemoveContainer" containerID="62e1a93e15d13503330a8dda91872b6e42bab0ae29e631e13c1c55e5487119ae"
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.897145 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f" (UID: "3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.909091 4747 scope.go:117] "RemoveContainer" containerID="8493c2019fd8b45a4e697fa17e950ab1a66912ae84b120e27f2974b58c0afbe0"
Dec 05 21:39:09 crc kubenswrapper[4747]: E1205 21:39:09.909522 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8493c2019fd8b45a4e697fa17e950ab1a66912ae84b120e27f2974b58c0afbe0\": container with ID starting with 8493c2019fd8b45a4e697fa17e950ab1a66912ae84b120e27f2974b58c0afbe0 not found: ID does not exist" containerID="8493c2019fd8b45a4e697fa17e950ab1a66912ae84b120e27f2974b58c0afbe0"
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.909566 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8493c2019fd8b45a4e697fa17e950ab1a66912ae84b120e27f2974b58c0afbe0"} err="failed to get container status \"8493c2019fd8b45a4e697fa17e950ab1a66912ae84b120e27f2974b58c0afbe0\": rpc error: code = NotFound desc = could not find container \"8493c2019fd8b45a4e697fa17e950ab1a66912ae84b120e27f2974b58c0afbe0\": container with ID starting with 8493c2019fd8b45a4e697fa17e950ab1a66912ae84b120e27f2974b58c0afbe0 not found: ID does not exist"
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.909635 4747 scope.go:117] "RemoveContainer" containerID="831c5cc0fa93decc9a4113b3f548504c4da81c447ffdf9598876fc87f3dea214"
Dec 05 21:39:09 crc kubenswrapper[4747]: E1205 21:39:09.910001 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"831c5cc0fa93decc9a4113b3f548504c4da81c447ffdf9598876fc87f3dea214\": container with ID starting with 831c5cc0fa93decc9a4113b3f548504c4da81c447ffdf9598876fc87f3dea214 not found: ID does not exist" containerID="831c5cc0fa93decc9a4113b3f548504c4da81c447ffdf9598876fc87f3dea214"
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.910031 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"831c5cc0fa93decc9a4113b3f548504c4da81c447ffdf9598876fc87f3dea214"} err="failed to get container status \"831c5cc0fa93decc9a4113b3f548504c4da81c447ffdf9598876fc87f3dea214\": rpc error: code = NotFound desc = could not find container \"831c5cc0fa93decc9a4113b3f548504c4da81c447ffdf9598876fc87f3dea214\": container with ID starting with 831c5cc0fa93decc9a4113b3f548504c4da81c447ffdf9598876fc87f3dea214 not found: ID does not exist"
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.910046 4747 scope.go:117] "RemoveContainer" containerID="62e1a93e15d13503330a8dda91872b6e42bab0ae29e631e13c1c55e5487119ae"
Dec 05 21:39:09 crc kubenswrapper[4747]: E1205 21:39:09.910276 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62e1a93e15d13503330a8dda91872b6e42bab0ae29e631e13c1c55e5487119ae\": container with ID starting with 62e1a93e15d13503330a8dda91872b6e42bab0ae29e631e13c1c55e5487119ae not found: ID does not exist" containerID="62e1a93e15d13503330a8dda91872b6e42bab0ae29e631e13c1c55e5487119ae"
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.910302 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62e1a93e15d13503330a8dda91872b6e42bab0ae29e631e13c1c55e5487119ae"} err="failed to get container status \"62e1a93e15d13503330a8dda91872b6e42bab0ae29e631e13c1c55e5487119ae\": rpc error: code = NotFound desc = could not find container \"62e1a93e15d13503330a8dda91872b6e42bab0ae29e631e13c1c55e5487119ae\": container with ID starting with 62e1a93e15d13503330a8dda91872b6e42bab0ae29e631e13c1c55e5487119ae not found: ID does not exist"
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.921036 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.921080 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbbv8\" (UniqueName: \"kubernetes.io/projected/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-kube-api-access-bbbv8\") on node \"crc\" DevicePath \"\""
Dec 05 21:39:09 crc kubenswrapper[4747]: I1205 21:39:09.921127 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 21:39:10 crc kubenswrapper[4747]: I1205 21:39:10.173386 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kw9qw"]
Dec 05 21:39:10 crc kubenswrapper[4747]: I1205 21:39:10.179914 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kw9qw"]
Dec 05 21:39:11 crc kubenswrapper[4747]: I1205 21:39:11.856068 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f" path="/var/lib/kubelet/pods/3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f/volumes"
Dec 05 21:39:36 crc kubenswrapper[4747]: I1205 21:39:36.221920 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:39:36 crc kubenswrapper[4747]: I1205 21:39:36.222663 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:39:36 crc kubenswrapper[4747]: I1205 21:39:36.222742 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw"
Dec 05 21:39:36 crc kubenswrapper[4747]: I1205 21:39:36.223789 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d942e8f20f4c086294c518d1fea7b5831ef33c35976c2a9c4cfa97a8cb810ba0"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 21:39:36 crc kubenswrapper[4747]: I1205 21:39:36.223909 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://d942e8f20f4c086294c518d1fea7b5831ef33c35976c2a9c4cfa97a8cb810ba0" gracePeriod=600
Dec 05 21:39:37 crc kubenswrapper[4747]: I1205 21:39:37.085750 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="d942e8f20f4c086294c518d1fea7b5831ef33c35976c2a9c4cfa97a8cb810ba0" exitCode=0
Dec 05 21:39:37 crc kubenswrapper[4747]: I1205 21:39:37.085881 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"d942e8f20f4c086294c518d1fea7b5831ef33c35976c2a9c4cfa97a8cb810ba0"}
Dec 05 21:39:37 crc kubenswrapper[4747]: I1205 21:39:37.086273 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"}
Dec 05 21:39:37 crc kubenswrapper[4747]: I1205 21:39:37.086294 4747 scope.go:117] "RemoveContainer" containerID="a77d3708903fc37712e54a0903d308f993516fe17c78f25a417f334e4907a228"
Dec 05 21:41:36 crc kubenswrapper[4747]: I1205 21:41:36.222693 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:41:36 crc kubenswrapper[4747]: I1205 21:41:36.223354 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:42:06 crc kubenswrapper[4747]: I1205 21:42:06.221707 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:42:06 crc kubenswrapper[4747]: I1205 21:42:06.222417 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:42:36 crc kubenswrapper[4747]: I1205 21:42:36.222102 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 21:42:36 crc kubenswrapper[4747]: I1205 21:42:36.224723 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 21:42:36 crc kubenswrapper[4747]: I1205 21:42:36.225149 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw"
Dec 05 21:42:36 crc kubenswrapper[4747]: I1205 21:42:36.226384 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 21:42:36 crc kubenswrapper[4747]: I1205 21:42:36.226774 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1" gracePeriod=600
Dec 05 21:42:36 crc kubenswrapper[4747]: E1205 21:42:36.363681 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:42:36 crc kubenswrapper[4747]: I1205 21:42:36.798672 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1" exitCode=0
Dec 05 21:42:36 crc kubenswrapper[4747]: I1205 21:42:36.798743 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"}
Dec 05 21:42:36 crc kubenswrapper[4747]: I1205 21:42:36.799100 4747 scope.go:117] "RemoveContainer" containerID="d942e8f20f4c086294c518d1fea7b5831ef33c35976c2a9c4cfa97a8cb810ba0"
Dec 05 21:42:36 crc kubenswrapper[4747]: I1205 21:42:36.799603 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"
Dec 05 21:42:36 crc kubenswrapper[4747]: E1205 21:42:36.799866 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:42:48 crc kubenswrapper[4747]: I1205 21:42:48.840091 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"
Dec 05 21:42:48 crc kubenswrapper[4747]: E1205 21:42:48.840765 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:43:03 crc kubenswrapper[4747]: I1205 21:43:03.840890 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"
Dec 05 21:43:03 crc kubenswrapper[4747]: E1205 21:43:03.841932 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:43:15 crc kubenswrapper[4747]: I1205 21:43:15.840071 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"
Dec 05 21:43:15 crc kubenswrapper[4747]: E1205 21:43:15.841344 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:43:28 crc kubenswrapper[4747]: I1205 21:43:28.840900 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"
Dec 05 21:43:28 crc kubenswrapper[4747]: E1205 21:43:28.842323 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:43:43 crc kubenswrapper[4747]: I1205 21:43:43.840094 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"
Dec 05 21:43:43 crc kubenswrapper[4747]: E1205 21:43:43.841242 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:43:55 crc kubenswrapper[4747]: I1205 21:43:55.840100 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"
Dec 05 21:43:55 crc kubenswrapper[4747]: E1205 21:43:55.841229 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:44:10 crc kubenswrapper[4747]: I1205 21:44:10.839773 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"
Dec 05 21:44:10 crc kubenswrapper[4747]: E1205 21:44:10.840604 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:44:25 crc kubenswrapper[4747]: I1205 21:44:25.839546 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"
Dec 05 21:44:25 crc kubenswrapper[4747]: E1205 21:44:25.840147 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:44:36 crc kubenswrapper[4747]: I1205 21:44:36.839751 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"
Dec 05 21:44:36 crc kubenswrapper[4747]: E1205 21:44:36.840787 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:44:48 crc kubenswrapper[4747]: I1205 21:44:48.839964 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"
Dec 05 21:44:48 crc kubenswrapper[4747]: E1205 21:44:48.840841 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.200388 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"]
Dec 05 21:45:00 crc kubenswrapper[4747]: E1205 21:45:00.201309 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f" containerName="extract-content"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.201327 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f" containerName="extract-content"
Dec 05 21:45:00 crc kubenswrapper[4747]: E1205 21:45:00.201343 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f" containerName="extract-utilities"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.201351 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f" containerName="extract-utilities"
Dec 05 21:45:00 crc kubenswrapper[4747]: E1205 21:45:00.201365 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f" containerName="registry-server"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.201374 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f" containerName="registry-server"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.201543 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f99a5c2-640a-4f96-9e4b-59cc4c1bb48f" containerName="registry-server"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.202239 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.206929 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.208416 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.209993 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"]
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.299736 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40576beb-7745-4b88-9260-d1a3ba62e574-config-volume\") pod \"collect-profiles-29416185-hpg46\" (UID: \"40576beb-7745-4b88-9260-d1a3ba62e574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.299816 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40576beb-7745-4b88-9260-d1a3ba62e574-secret-volume\") pod \"collect-profiles-29416185-hpg46\" (UID: \"40576beb-7745-4b88-9260-d1a3ba62e574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.299847 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc8s4\" (UniqueName: \"kubernetes.io/projected/40576beb-7745-4b88-9260-d1a3ba62e574-kube-api-access-zc8s4\") pod \"collect-profiles-29416185-hpg46\" (UID: \"40576beb-7745-4b88-9260-d1a3ba62e574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.401500 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40576beb-7745-4b88-9260-d1a3ba62e574-config-volume\") pod \"collect-profiles-29416185-hpg46\" (UID: \"40576beb-7745-4b88-9260-d1a3ba62e574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.401559 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40576beb-7745-4b88-9260-d1a3ba62e574-secret-volume\") pod \"collect-profiles-29416185-hpg46\" (UID: \"40576beb-7745-4b88-9260-d1a3ba62e574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.401600 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc8s4\" (UniqueName: \"kubernetes.io/projected/40576beb-7745-4b88-9260-d1a3ba62e574-kube-api-access-zc8s4\") pod \"collect-profiles-29416185-hpg46\" (UID: \"40576beb-7745-4b88-9260-d1a3ba62e574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.402469 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40576beb-7745-4b88-9260-d1a3ba62e574-config-volume\") pod \"collect-profiles-29416185-hpg46\" (UID: \"40576beb-7745-4b88-9260-d1a3ba62e574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.410400 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40576beb-7745-4b88-9260-d1a3ba62e574-secret-volume\") pod \"collect-profiles-29416185-hpg46\" (UID: \"40576beb-7745-4b88-9260-d1a3ba62e574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.420935 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc8s4\" (UniqueName: \"kubernetes.io/projected/40576beb-7745-4b88-9260-d1a3ba62e574-kube-api-access-zc8s4\") pod \"collect-profiles-29416185-hpg46\" (UID: \"40576beb-7745-4b88-9260-d1a3ba62e574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.545331 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.839760 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"
Dec 05 21:45:00 crc kubenswrapper[4747]: E1205 21:45:00.840429 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:45:00 crc kubenswrapper[4747]: I1205 21:45:00.844861 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"]
Dec 05 21:45:01 crc kubenswrapper[4747]: I1205 21:45:01.088028 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46" event={"ID":"40576beb-7745-4b88-9260-d1a3ba62e574","Type":"ContainerStarted","Data":"5905b7f242b923e7681a4a4241ce1548d4a97f588ac07a78b632068fe5ca50c1"}
Dec 05 21:45:01 crc kubenswrapper[4747]: I1205 21:45:01.088075 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46" event={"ID":"40576beb-7745-4b88-9260-d1a3ba62e574","Type":"ContainerStarted","Data":"9a97c33f7d7a1e003e329497a0eeaa22494b5d2edb37495a68a8491607997338"}
Dec 05 21:45:01 crc kubenswrapper[4747]: I1205 21:45:01.111125 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46" podStartSLOduration=1.111064363 podStartE2EDuration="1.111064363s" podCreationTimestamp="2025-12-05 21:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 21:45:01.104238846 +0000 UTC m=+3771.571546374" watchObservedRunningTime="2025-12-05 21:45:01.111064363 +0000 UTC m=+3771.578371891"
Dec 05 21:45:02 crc kubenswrapper[4747]: I1205 21:45:02.094295 4747 generic.go:334] "Generic (PLEG): container finished" podID="40576beb-7745-4b88-9260-d1a3ba62e574" containerID="5905b7f242b923e7681a4a4241ce1548d4a97f588ac07a78b632068fe5ca50c1" exitCode=0
Dec 05 21:45:02 crc kubenswrapper[4747]: I1205 21:45:02.094347 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46" event={"ID":"40576beb-7745-4b88-9260-d1a3ba62e574","Type":"ContainerDied","Data":"5905b7f242b923e7681a4a4241ce1548d4a97f588ac07a78b632068fe5ca50c1"}
Dec 05 21:45:03 crc kubenswrapper[4747]: I1205 21:45:03.395238 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"
Dec 05 21:45:03 crc kubenswrapper[4747]: I1205 21:45:03.546733 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc8s4\" (UniqueName: \"kubernetes.io/projected/40576beb-7745-4b88-9260-d1a3ba62e574-kube-api-access-zc8s4\") pod \"40576beb-7745-4b88-9260-d1a3ba62e574\" (UID: \"40576beb-7745-4b88-9260-d1a3ba62e574\") "
Dec 05 21:45:03 crc kubenswrapper[4747]: I1205 21:45:03.546783 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40576beb-7745-4b88-9260-d1a3ba62e574-secret-volume\") pod \"40576beb-7745-4b88-9260-d1a3ba62e574\" (UID: \"40576beb-7745-4b88-9260-d1a3ba62e574\") "
Dec 05 21:45:03 crc kubenswrapper[4747]: I1205 21:45:03.546864 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40576beb-7745-4b88-9260-d1a3ba62e574-config-volume\") pod \"40576beb-7745-4b88-9260-d1a3ba62e574\" (UID: \"40576beb-7745-4b88-9260-d1a3ba62e574\") "
Dec 05 21:45:03 crc kubenswrapper[4747]: I1205 21:45:03.548164 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40576beb-7745-4b88-9260-d1a3ba62e574-config-volume" (OuterVolumeSpecName: "config-volume") pod "40576beb-7745-4b88-9260-d1a3ba62e574" (UID: "40576beb-7745-4b88-9260-d1a3ba62e574"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 21:45:03 crc kubenswrapper[4747]: I1205 21:45:03.553567 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40576beb-7745-4b88-9260-d1a3ba62e574-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "40576beb-7745-4b88-9260-d1a3ba62e574" (UID: "40576beb-7745-4b88-9260-d1a3ba62e574"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 21:45:03 crc kubenswrapper[4747]: I1205 21:45:03.555802 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40576beb-7745-4b88-9260-d1a3ba62e574-kube-api-access-zc8s4" (OuterVolumeSpecName: "kube-api-access-zc8s4") pod "40576beb-7745-4b88-9260-d1a3ba62e574" (UID: "40576beb-7745-4b88-9260-d1a3ba62e574"). InnerVolumeSpecName "kube-api-access-zc8s4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 21:45:03 crc kubenswrapper[4747]: I1205 21:45:03.648175 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc8s4\" (UniqueName: \"kubernetes.io/projected/40576beb-7745-4b88-9260-d1a3ba62e574-kube-api-access-zc8s4\") on node \"crc\" DevicePath \"\""
Dec 05 21:45:03 crc kubenswrapper[4747]: I1205 21:45:03.648219 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40576beb-7745-4b88-9260-d1a3ba62e574-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 05 21:45:03 crc kubenswrapper[4747]: I1205 21:45:03.648239 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40576beb-7745-4b88-9260-d1a3ba62e574-config-volume\") on node \"crc\" DevicePath \"\""
Dec 05 21:45:04 crc kubenswrapper[4747]: I1205 21:45:04.113084 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46" event={"ID":"40576beb-7745-4b88-9260-d1a3ba62e574","Type":"ContainerDied","Data":"9a97c33f7d7a1e003e329497a0eeaa22494b5d2edb37495a68a8491607997338"}
Dec 05 21:45:04 crc kubenswrapper[4747]: I1205 21:45:04.113432 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a97c33f7d7a1e003e329497a0eeaa22494b5d2edb37495a68a8491607997338"
Dec 05 21:45:04 crc kubenswrapper[4747]: I1205 21:45:04.113131 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"
Dec 05 21:45:04 crc kubenswrapper[4747]: I1205 21:45:04.492463 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4"]
Dec 05 21:45:04 crc kubenswrapper[4747]: I1205 21:45:04.503151 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416140-bksd4"]
Dec 05 21:45:05 crc kubenswrapper[4747]: I1205 21:45:05.855402 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a040d587-9981-428f-baed-4ad3d3b2ee55" path="/var/lib/kubelet/pods/a040d587-9981-428f-baed-4ad3d3b2ee55/volumes"
Dec 05 21:45:11 crc kubenswrapper[4747]: I1205 21:45:11.839547 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"
Dec 05 21:45:11 crc kubenswrapper[4747]: E1205 21:45:11.840368 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 21:45:22 crc kubenswrapper[4747]: I1205 21:45:22.839746 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1"
Dec 05 21:45:22 crc kubenswrapper[4747]: E1205 21:45:22.840686 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\""
pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:45:27 crc kubenswrapper[4747]: I1205 21:45:27.925956 4747 scope.go:117] "RemoveContainer" containerID="1b5b4f35f9829a20a7db089517598a56b6654ddd04f79022b9d16e99739c9335" Dec 05 21:45:28 crc kubenswrapper[4747]: I1205 21:45:28.972770 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ln58b"] Dec 05 21:45:28 crc kubenswrapper[4747]: E1205 21:45:28.973407 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40576beb-7745-4b88-9260-d1a3ba62e574" containerName="collect-profiles" Dec 05 21:45:28 crc kubenswrapper[4747]: I1205 21:45:28.973439 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="40576beb-7745-4b88-9260-d1a3ba62e574" containerName="collect-profiles" Dec 05 21:45:28 crc kubenswrapper[4747]: I1205 21:45:28.973817 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="40576beb-7745-4b88-9260-d1a3ba62e574" containerName="collect-profiles" Dec 05 21:45:28 crc kubenswrapper[4747]: I1205 21:45:28.977529 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:29 crc kubenswrapper[4747]: I1205 21:45:29.025784 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ln58b"] Dec 05 21:45:29 crc kubenswrapper[4747]: I1205 21:45:29.149359 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d598a617-d7ff-433a-9efb-f0d844f6ad27-catalog-content\") pod \"redhat-marketplace-ln58b\" (UID: \"d598a617-d7ff-433a-9efb-f0d844f6ad27\") " pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:29 crc kubenswrapper[4747]: I1205 21:45:29.149495 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d598a617-d7ff-433a-9efb-f0d844f6ad27-utilities\") pod \"redhat-marketplace-ln58b\" (UID: \"d598a617-d7ff-433a-9efb-f0d844f6ad27\") " pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:29 crc kubenswrapper[4747]: I1205 21:45:29.149526 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crbnk\" (UniqueName: \"kubernetes.io/projected/d598a617-d7ff-433a-9efb-f0d844f6ad27-kube-api-access-crbnk\") pod \"redhat-marketplace-ln58b\" (UID: \"d598a617-d7ff-433a-9efb-f0d844f6ad27\") " pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:29 crc kubenswrapper[4747]: I1205 21:45:29.250465 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d598a617-d7ff-433a-9efb-f0d844f6ad27-catalog-content\") pod \"redhat-marketplace-ln58b\" (UID: \"d598a617-d7ff-433a-9efb-f0d844f6ad27\") " pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:29 crc kubenswrapper[4747]: I1205 21:45:29.250551 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d598a617-d7ff-433a-9efb-f0d844f6ad27-utilities\") pod \"redhat-marketplace-ln58b\" (UID: \"d598a617-d7ff-433a-9efb-f0d844f6ad27\") " pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:29 crc kubenswrapper[4747]: I1205 21:45:29.250574 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-crbnk\" (UniqueName: \"kubernetes.io/projected/d598a617-d7ff-433a-9efb-f0d844f6ad27-kube-api-access-crbnk\") pod \"redhat-marketplace-ln58b\" (UID: \"d598a617-d7ff-433a-9efb-f0d844f6ad27\") " pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:29 crc kubenswrapper[4747]: I1205 21:45:29.251379 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d598a617-d7ff-433a-9efb-f0d844f6ad27-utilities\") pod \"redhat-marketplace-ln58b\" (UID: \"d598a617-d7ff-433a-9efb-f0d844f6ad27\") " pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:29 crc kubenswrapper[4747]: I1205 21:45:29.251448 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d598a617-d7ff-433a-9efb-f0d844f6ad27-catalog-content\") pod \"redhat-marketplace-ln58b\" (UID: \"d598a617-d7ff-433a-9efb-f0d844f6ad27\") " pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:29 crc kubenswrapper[4747]: I1205 21:45:29.270889 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crbnk\" (UniqueName: \"kubernetes.io/projected/d598a617-d7ff-433a-9efb-f0d844f6ad27-kube-api-access-crbnk\") pod \"redhat-marketplace-ln58b\" (UID: \"d598a617-d7ff-433a-9efb-f0d844f6ad27\") " pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:29 crc kubenswrapper[4747]: I1205 21:45:29.302907 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:29 crc kubenswrapper[4747]: I1205 21:45:29.762255 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ln58b"] Dec 05 21:45:30 crc kubenswrapper[4747]: I1205 21:45:30.336116 4747 generic.go:334] "Generic (PLEG): container finished" podID="d598a617-d7ff-433a-9efb-f0d844f6ad27" containerID="82df4511b3843cdf06c1fd4cc3f1a48c7cb58f5d9a465e12d639dcd50f78e0fd" exitCode=0 Dec 05 21:45:30 crc kubenswrapper[4747]: I1205 21:45:30.336513 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ln58b" event={"ID":"d598a617-d7ff-433a-9efb-f0d844f6ad27","Type":"ContainerDied","Data":"82df4511b3843cdf06c1fd4cc3f1a48c7cb58f5d9a465e12d639dcd50f78e0fd"} Dec 05 21:45:30 crc kubenswrapper[4747]: I1205 21:45:30.336553 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ln58b" event={"ID":"d598a617-d7ff-433a-9efb-f0d844f6ad27","Type":"ContainerStarted","Data":"48f228eed09a9d083a8bd19e92d090471f4cc2388c2ec9157df3085a31416835"} Dec 05 21:45:30 crc kubenswrapper[4747]: I1205 21:45:30.341103 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:45:31 crc kubenswrapper[4747]: I1205 21:45:31.355376 4747 generic.go:334] "Generic (PLEG): container finished" podID="d598a617-d7ff-433a-9efb-f0d844f6ad27" containerID="c78559ff846c7fb7d2404ba6f4ae5887ebb650203114a2943db25eea4ea51a60" exitCode=0 Dec 05 21:45:31 crc kubenswrapper[4747]: I1205 21:45:31.355434 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ln58b" event={"ID":"d598a617-d7ff-433a-9efb-f0d844f6ad27","Type":"ContainerDied","Data":"c78559ff846c7fb7d2404ba6f4ae5887ebb650203114a2943db25eea4ea51a60"} Dec 05 21:45:31 crc kubenswrapper[4747]: I1205 21:45:31.373541 4747 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4sbm2"] Dec 05 21:45:31 crc kubenswrapper[4747]: I1205 21:45:31.378743 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:31 crc kubenswrapper[4747]: I1205 21:45:31.387914 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbj9c\" (UniqueName: \"kubernetes.io/projected/33915fdb-23b0-4e02-ab20-19d5452e09ba-kube-api-access-vbj9c\") pod \"certified-operators-4sbm2\" (UID: \"33915fdb-23b0-4e02-ab20-19d5452e09ba\") " pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:31 crc kubenswrapper[4747]: I1205 21:45:31.387986 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33915fdb-23b0-4e02-ab20-19d5452e09ba-catalog-content\") pod \"certified-operators-4sbm2\" (UID: \"33915fdb-23b0-4e02-ab20-19d5452e09ba\") " pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:31 crc kubenswrapper[4747]: I1205 21:45:31.388021 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33915fdb-23b0-4e02-ab20-19d5452e09ba-utilities\") pod \"certified-operators-4sbm2\" (UID: \"33915fdb-23b0-4e02-ab20-19d5452e09ba\") " pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:31 crc kubenswrapper[4747]: I1205 21:45:31.394297 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4sbm2"] Dec 05 21:45:31 crc kubenswrapper[4747]: I1205 21:45:31.489133 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbj9c\" (UniqueName: \"kubernetes.io/projected/33915fdb-23b0-4e02-ab20-19d5452e09ba-kube-api-access-vbj9c\") pod \"certified-operators-4sbm2\" (UID: \"33915fdb-23b0-4e02-ab20-19d5452e09ba\") " pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:31 crc kubenswrapper[4747]: I1205 21:45:31.489178 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33915fdb-23b0-4e02-ab20-19d5452e09ba-catalog-content\") pod \"certified-operators-4sbm2\" (UID: \"33915fdb-23b0-4e02-ab20-19d5452e09ba\") " pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:31 crc kubenswrapper[4747]: I1205 21:45:31.489197 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33915fdb-23b0-4e02-ab20-19d5452e09ba-utilities\") pod \"certified-operators-4sbm2\" (UID: \"33915fdb-23b0-4e02-ab20-19d5452e09ba\") " pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:31 crc kubenswrapper[4747]: I1205 21:45:31.489657 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33915fdb-23b0-4e02-ab20-19d5452e09ba-utilities\") pod \"certified-operators-4sbm2\" (UID: \"33915fdb-23b0-4e02-ab20-19d5452e09ba\") " pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:31 crc kubenswrapper[4747]: I1205 21:45:31.490381 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33915fdb-23b0-4e02-ab20-19d5452e09ba-catalog-content\") pod 
\"certified-operators-4sbm2\" (UID: \"33915fdb-23b0-4e02-ab20-19d5452e09ba\") " pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:31 crc kubenswrapper[4747]: I1205 21:45:31.515748 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbj9c\" (UniqueName: \"kubernetes.io/projected/33915fdb-23b0-4e02-ab20-19d5452e09ba-kube-api-access-vbj9c\") pod \"certified-operators-4sbm2\" (UID: \"33915fdb-23b0-4e02-ab20-19d5452e09ba\") " pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:31 crc kubenswrapper[4747]: I1205 21:45:31.723328 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:32 crc kubenswrapper[4747]: I1205 21:45:32.183469 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4sbm2"] Dec 05 21:45:32 crc kubenswrapper[4747]: W1205 21:45:32.191013 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33915fdb_23b0_4e02_ab20_19d5452e09ba.slice/crio-f870e06af4ec864402e3f0cba3f2d6a52b3e57d5ed71232e5e550759057817e4 WatchSource:0}: Error finding container f870e06af4ec864402e3f0cba3f2d6a52b3e57d5ed71232e5e550759057817e4: Status 404 returned error can't find the container with id f870e06af4ec864402e3f0cba3f2d6a52b3e57d5ed71232e5e550759057817e4 Dec 05 21:45:32 crc kubenswrapper[4747]: I1205 21:45:32.364144 4747 generic.go:334] "Generic (PLEG): container finished" podID="33915fdb-23b0-4e02-ab20-19d5452e09ba" containerID="37ea742b45b792fc6922b438dfdfd41f6565ff5c3b47d6ed719e42d6845c3681" exitCode=0 Dec 05 21:45:32 crc kubenswrapper[4747]: I1205 21:45:32.364515 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sbm2" event={"ID":"33915fdb-23b0-4e02-ab20-19d5452e09ba","Type":"ContainerDied","Data":"37ea742b45b792fc6922b438dfdfd41f6565ff5c3b47d6ed719e42d6845c3681"} Dec 05 21:45:32 crc kubenswrapper[4747]: I1205 21:45:32.364559 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sbm2" event={"ID":"33915fdb-23b0-4e02-ab20-19d5452e09ba","Type":"ContainerStarted","Data":"f870e06af4ec864402e3f0cba3f2d6a52b3e57d5ed71232e5e550759057817e4"} Dec 05 21:45:32 crc kubenswrapper[4747]: I1205 21:45:32.368922 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ln58b" event={"ID":"d598a617-d7ff-433a-9efb-f0d844f6ad27","Type":"ContainerStarted","Data":"90c82e248b302ad8e1cd7579d05d7508d21387187be1fa1129b911587afa3ab7"} Dec 05 21:45:32 crc kubenswrapper[4747]: I1205 21:45:32.408697 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ln58b" podStartSLOduration=2.966843908 podStartE2EDuration="4.408676581s" podCreationTimestamp="2025-12-05 21:45:28 +0000 UTC" firstStartedPulling="2025-12-05 21:45:30.340565979 +0000 UTC m=+3800.807873497" lastFinishedPulling="2025-12-05 21:45:31.782398682 +0000 UTC m=+3802.249706170" observedRunningTime="2025-12-05 21:45:32.401105575 +0000 UTC m=+3802.868413083" watchObservedRunningTime="2025-12-05 21:45:32.408676581 +0000 UTC m=+3802.875984079" Dec 05 21:45:33 crc kubenswrapper[4747]: I1205 21:45:33.377498 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sbm2" 
event={"ID":"33915fdb-23b0-4e02-ab20-19d5452e09ba","Type":"ContainerStarted","Data":"d88aab0349643e3955a1d79a66a09e81d9a0704f5606ce55965ee70a9369ee09"} Dec 05 21:45:34 crc kubenswrapper[4747]: I1205 21:45:34.391006 4747 generic.go:334] "Generic (PLEG): container finished" podID="33915fdb-23b0-4e02-ab20-19d5452e09ba" containerID="d88aab0349643e3955a1d79a66a09e81d9a0704f5606ce55965ee70a9369ee09" exitCode=0 Dec 05 21:45:34 crc kubenswrapper[4747]: I1205 21:45:34.391084 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sbm2" event={"ID":"33915fdb-23b0-4e02-ab20-19d5452e09ba","Type":"ContainerDied","Data":"d88aab0349643e3955a1d79a66a09e81d9a0704f5606ce55965ee70a9369ee09"} Dec 05 21:45:34 crc kubenswrapper[4747]: I1205 21:45:34.840021 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1" Dec 05 21:45:34 crc kubenswrapper[4747]: E1205 21:45:34.840298 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:45:35 crc kubenswrapper[4747]: I1205 21:45:35.403079 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sbm2" event={"ID":"33915fdb-23b0-4e02-ab20-19d5452e09ba","Type":"ContainerStarted","Data":"ff08ed95d857ee6cfc26d8060149ccee38e88861c4a84ed1eea9f9ae6addfe92"} Dec 05 21:45:35 crc kubenswrapper[4747]: I1205 21:45:35.433328 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4sbm2" podStartSLOduration=1.985770474 podStartE2EDuration="4.43330795s" podCreationTimestamp="2025-12-05 21:45:31 +0000 UTC" firstStartedPulling="2025-12-05 21:45:32.365860652 +0000 UTC m=+3802.833168140" lastFinishedPulling="2025-12-05 21:45:34.813398128 +0000 UTC m=+3805.280705616" observedRunningTime="2025-12-05 21:45:35.424355881 +0000 UTC m=+3805.891663389" watchObservedRunningTime="2025-12-05 21:45:35.43330795 +0000 UTC m=+3805.900615458" Dec 05 21:45:39 crc kubenswrapper[4747]: I1205 21:45:39.303415 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:39 crc kubenswrapper[4747]: I1205 21:45:39.304282 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:39 crc kubenswrapper[4747]: I1205 21:45:39.360166 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:39 crc kubenswrapper[4747]: I1205 21:45:39.476162 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:39 crc kubenswrapper[4747]: I1205 21:45:39.600186 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ln58b"] Dec 05 21:45:41 crc kubenswrapper[4747]: I1205 21:45:41.450816 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ln58b" podUID="d598a617-d7ff-433a-9efb-f0d844f6ad27" 
containerName="registry-server" containerID="cri-o://90c82e248b302ad8e1cd7579d05d7508d21387187be1fa1129b911587afa3ab7" gracePeriod=2 Dec 05 21:45:41 crc kubenswrapper[4747]: I1205 21:45:41.723482 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:41 crc kubenswrapper[4747]: I1205 21:45:41.723828 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:41 crc kubenswrapper[4747]: I1205 21:45:41.778889 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.429010 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.459426 4747 generic.go:334] "Generic (PLEG): container finished" podID="d598a617-d7ff-433a-9efb-f0d844f6ad27" containerID="90c82e248b302ad8e1cd7579d05d7508d21387187be1fa1129b911587afa3ab7" exitCode=0 Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.459505 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ln58b" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.459531 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ln58b" event={"ID":"d598a617-d7ff-433a-9efb-f0d844f6ad27","Type":"ContainerDied","Data":"90c82e248b302ad8e1cd7579d05d7508d21387187be1fa1129b911587afa3ab7"} Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.459589 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ln58b" event={"ID":"d598a617-d7ff-433a-9efb-f0d844f6ad27","Type":"ContainerDied","Data":"48f228eed09a9d083a8bd19e92d090471f4cc2388c2ec9157df3085a31416835"} Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.459609 4747 scope.go:117] "RemoveContainer" containerID="90c82e248b302ad8e1cd7579d05d7508d21387187be1fa1129b911587afa3ab7" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.482225 4747 scope.go:117] "RemoveContainer" containerID="c78559ff846c7fb7d2404ba6f4ae5887ebb650203114a2943db25eea4ea51a60" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.504181 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.511575 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crbnk\" (UniqueName: \"kubernetes.io/projected/d598a617-d7ff-433a-9efb-f0d844f6ad27-kube-api-access-crbnk\") pod \"d598a617-d7ff-433a-9efb-f0d844f6ad27\" (UID: \"d598a617-d7ff-433a-9efb-f0d844f6ad27\") " Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.511762 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d598a617-d7ff-433a-9efb-f0d844f6ad27-catalog-content\") pod \"d598a617-d7ff-433a-9efb-f0d844f6ad27\" (UID: \"d598a617-d7ff-433a-9efb-f0d844f6ad27\") " Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.511851 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d598a617-d7ff-433a-9efb-f0d844f6ad27-utilities\") pod 
\"d598a617-d7ff-433a-9efb-f0d844f6ad27\" (UID: \"d598a617-d7ff-433a-9efb-f0d844f6ad27\") " Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.512782 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d598a617-d7ff-433a-9efb-f0d844f6ad27-utilities" (OuterVolumeSpecName: "utilities") pod "d598a617-d7ff-433a-9efb-f0d844f6ad27" (UID: "d598a617-d7ff-433a-9efb-f0d844f6ad27"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.516601 4747 scope.go:117] "RemoveContainer" containerID="82df4511b3843cdf06c1fd4cc3f1a48c7cb58f5d9a465e12d639dcd50f78e0fd" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.517232 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d598a617-d7ff-433a-9efb-f0d844f6ad27-kube-api-access-crbnk" (OuterVolumeSpecName: "kube-api-access-crbnk") pod "d598a617-d7ff-433a-9efb-f0d844f6ad27" (UID: "d598a617-d7ff-433a-9efb-f0d844f6ad27"). InnerVolumeSpecName "kube-api-access-crbnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.550099 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d598a617-d7ff-433a-9efb-f0d844f6ad27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d598a617-d7ff-433a-9efb-f0d844f6ad27" (UID: "d598a617-d7ff-433a-9efb-f0d844f6ad27"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.555962 4747 scope.go:117] "RemoveContainer" containerID="90c82e248b302ad8e1cd7579d05d7508d21387187be1fa1129b911587afa3ab7" Dec 05 21:45:42 crc kubenswrapper[4747]: E1205 21:45:42.556425 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90c82e248b302ad8e1cd7579d05d7508d21387187be1fa1129b911587afa3ab7\": container with ID starting with 90c82e248b302ad8e1cd7579d05d7508d21387187be1fa1129b911587afa3ab7 not found: ID does not exist" containerID="90c82e248b302ad8e1cd7579d05d7508d21387187be1fa1129b911587afa3ab7" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.556466 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c82e248b302ad8e1cd7579d05d7508d21387187be1fa1129b911587afa3ab7"} err="failed to get container status \"90c82e248b302ad8e1cd7579d05d7508d21387187be1fa1129b911587afa3ab7\": rpc error: code = NotFound desc = could not find container \"90c82e248b302ad8e1cd7579d05d7508d21387187be1fa1129b911587afa3ab7\": container with ID starting with 90c82e248b302ad8e1cd7579d05d7508d21387187be1fa1129b911587afa3ab7 not found: ID does not exist" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.556491 4747 scope.go:117] "RemoveContainer" containerID="c78559ff846c7fb7d2404ba6f4ae5887ebb650203114a2943db25eea4ea51a60" Dec 05 21:45:42 crc kubenswrapper[4747]: E1205 21:45:42.556766 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c78559ff846c7fb7d2404ba6f4ae5887ebb650203114a2943db25eea4ea51a60\": container with ID starting with c78559ff846c7fb7d2404ba6f4ae5887ebb650203114a2943db25eea4ea51a60 not found: ID does not exist" containerID="c78559ff846c7fb7d2404ba6f4ae5887ebb650203114a2943db25eea4ea51a60" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.556813 4747 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c78559ff846c7fb7d2404ba6f4ae5887ebb650203114a2943db25eea4ea51a60"} err="failed to get container status \"c78559ff846c7fb7d2404ba6f4ae5887ebb650203114a2943db25eea4ea51a60\": rpc error: code = NotFound desc = could not find container \"c78559ff846c7fb7d2404ba6f4ae5887ebb650203114a2943db25eea4ea51a60\": container with ID starting with c78559ff846c7fb7d2404ba6f4ae5887ebb650203114a2943db25eea4ea51a60 not found: ID does not exist" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.556846 4747 scope.go:117] "RemoveContainer" containerID="82df4511b3843cdf06c1fd4cc3f1a48c7cb58f5d9a465e12d639dcd50f78e0fd" Dec 05 21:45:42 crc kubenswrapper[4747]: E1205 21:45:42.557263 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82df4511b3843cdf06c1fd4cc3f1a48c7cb58f5d9a465e12d639dcd50f78e0fd\": container with ID starting with 82df4511b3843cdf06c1fd4cc3f1a48c7cb58f5d9a465e12d639dcd50f78e0fd not found: ID does not exist" containerID="82df4511b3843cdf06c1fd4cc3f1a48c7cb58f5d9a465e12d639dcd50f78e0fd" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.557303 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82df4511b3843cdf06c1fd4cc3f1a48c7cb58f5d9a465e12d639dcd50f78e0fd"} err="failed to get container status \"82df4511b3843cdf06c1fd4cc3f1a48c7cb58f5d9a465e12d639dcd50f78e0fd\": rpc error: code = NotFound desc = could not find container \"82df4511b3843cdf06c1fd4cc3f1a48c7cb58f5d9a465e12d639dcd50f78e0fd\": container with ID starting with 82df4511b3843cdf06c1fd4cc3f1a48c7cb58f5d9a465e12d639dcd50f78e0fd not found: ID does not exist" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.613483 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crbnk\" (UniqueName: \"kubernetes.io/projected/d598a617-d7ff-433a-9efb-f0d844f6ad27-kube-api-access-crbnk\") on node \"crc\" DevicePath \"\"" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.613517 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d598a617-d7ff-433a-9efb-f0d844f6ad27-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.613526 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d598a617-d7ff-433a-9efb-f0d844f6ad27-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.819278 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ln58b"] Dec 05 21:45:42 crc kubenswrapper[4747]: I1205 21:45:42.827067 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ln58b"] Dec 05 21:45:43 crc kubenswrapper[4747]: I1205 21:45:43.850627 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d598a617-d7ff-433a-9efb-f0d844f6ad27" path="/var/lib/kubelet/pods/d598a617-d7ff-433a-9efb-f0d844f6ad27/volumes" Dec 05 21:45:44 crc kubenswrapper[4747]: I1205 21:45:44.800453 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4sbm2"] Dec 05 21:45:44 crc kubenswrapper[4747]: I1205 21:45:44.800769 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4sbm2" 
podUID="33915fdb-23b0-4e02-ab20-19d5452e09ba" containerName="registry-server" containerID="cri-o://ff08ed95d857ee6cfc26d8060149ccee38e88861c4a84ed1eea9f9ae6addfe92" gracePeriod=2 Dec 05 21:45:45 crc kubenswrapper[4747]: I1205 21:45:45.493694 4747 generic.go:334] "Generic (PLEG): container finished" podID="33915fdb-23b0-4e02-ab20-19d5452e09ba" containerID="ff08ed95d857ee6cfc26d8060149ccee38e88861c4a84ed1eea9f9ae6addfe92" exitCode=0 Dec 05 21:45:45 crc kubenswrapper[4747]: I1205 21:45:45.493733 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sbm2" event={"ID":"33915fdb-23b0-4e02-ab20-19d5452e09ba","Type":"ContainerDied","Data":"ff08ed95d857ee6cfc26d8060149ccee38e88861c4a84ed1eea9f9ae6addfe92"} Dec 05 21:45:45 crc kubenswrapper[4747]: I1205 21:45:45.742203 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:45 crc kubenswrapper[4747]: I1205 21:45:45.767402 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33915fdb-23b0-4e02-ab20-19d5452e09ba-catalog-content\") pod \"33915fdb-23b0-4e02-ab20-19d5452e09ba\" (UID: \"33915fdb-23b0-4e02-ab20-19d5452e09ba\") " Dec 05 21:45:45 crc kubenswrapper[4747]: I1205 21:45:45.767495 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbj9c\" (UniqueName: \"kubernetes.io/projected/33915fdb-23b0-4e02-ab20-19d5452e09ba-kube-api-access-vbj9c\") pod \"33915fdb-23b0-4e02-ab20-19d5452e09ba\" (UID: \"33915fdb-23b0-4e02-ab20-19d5452e09ba\") " Dec 05 21:45:45 crc kubenswrapper[4747]: I1205 21:45:45.767624 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33915fdb-23b0-4e02-ab20-19d5452e09ba-utilities\") pod \"33915fdb-23b0-4e02-ab20-19d5452e09ba\" (UID: \"33915fdb-23b0-4e02-ab20-19d5452e09ba\") " Dec 05 21:45:45 crc kubenswrapper[4747]: I1205 21:45:45.768866 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33915fdb-23b0-4e02-ab20-19d5452e09ba-utilities" (OuterVolumeSpecName: "utilities") pod "33915fdb-23b0-4e02-ab20-19d5452e09ba" (UID: "33915fdb-23b0-4e02-ab20-19d5452e09ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:45:45 crc kubenswrapper[4747]: I1205 21:45:45.773234 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33915fdb-23b0-4e02-ab20-19d5452e09ba-kube-api-access-vbj9c" (OuterVolumeSpecName: "kube-api-access-vbj9c") pod "33915fdb-23b0-4e02-ab20-19d5452e09ba" (UID: "33915fdb-23b0-4e02-ab20-19d5452e09ba"). InnerVolumeSpecName "kube-api-access-vbj9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:45:45 crc kubenswrapper[4747]: I1205 21:45:45.846658 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33915fdb-23b0-4e02-ab20-19d5452e09ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33915fdb-23b0-4e02-ab20-19d5452e09ba" (UID: "33915fdb-23b0-4e02-ab20-19d5452e09ba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:45:45 crc kubenswrapper[4747]: I1205 21:45:45.868926 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33915fdb-23b0-4e02-ab20-19d5452e09ba-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:45:45 crc kubenswrapper[4747]: I1205 21:45:45.868957 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33915fdb-23b0-4e02-ab20-19d5452e09ba-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:45:45 crc kubenswrapper[4747]: I1205 21:45:45.868968 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbj9c\" (UniqueName: \"kubernetes.io/projected/33915fdb-23b0-4e02-ab20-19d5452e09ba-kube-api-access-vbj9c\") on node \"crc\" DevicePath \"\"" Dec 05 21:45:46 crc kubenswrapper[4747]: I1205 21:45:46.502906 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sbm2" event={"ID":"33915fdb-23b0-4e02-ab20-19d5452e09ba","Type":"ContainerDied","Data":"f870e06af4ec864402e3f0cba3f2d6a52b3e57d5ed71232e5e550759057817e4"} Dec 05 21:45:46 crc kubenswrapper[4747]: I1205 21:45:46.502999 4747 scope.go:117] "RemoveContainer" containerID="ff08ed95d857ee6cfc26d8060149ccee38e88861c4a84ed1eea9f9ae6addfe92" Dec 05 21:45:46 crc kubenswrapper[4747]: I1205 21:45:46.503240 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4sbm2" Dec 05 21:45:46 crc kubenswrapper[4747]: I1205 21:45:46.528129 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4sbm2"] Dec 05 21:45:46 crc kubenswrapper[4747]: I1205 21:45:46.528250 4747 scope.go:117] "RemoveContainer" containerID="d88aab0349643e3955a1d79a66a09e81d9a0704f5606ce55965ee70a9369ee09" Dec 05 21:45:46 crc kubenswrapper[4747]: I1205 21:45:46.538364 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4sbm2"] Dec 05 21:45:46 crc kubenswrapper[4747]: I1205 21:45:46.547699 4747 scope.go:117] "RemoveContainer" containerID="37ea742b45b792fc6922b438dfdfd41f6565ff5c3b47d6ed719e42d6845c3681" Dec 05 21:45:46 crc kubenswrapper[4747]: I1205 21:45:46.840905 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1" Dec 05 21:45:46 crc kubenswrapper[4747]: E1205 21:45:46.841114 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:45:47 crc kubenswrapper[4747]: I1205 21:45:47.865517 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33915fdb-23b0-4e02-ab20-19d5452e09ba" path="/var/lib/kubelet/pods/33915fdb-23b0-4e02-ab20-19d5452e09ba/volumes" Dec 05 21:45:57 crc kubenswrapper[4747]: I1205 21:45:57.840228 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1" Dec 05 21:45:57 crc kubenswrapper[4747]: E1205 21:45:57.841645 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:46:10 crc kubenswrapper[4747]: I1205 21:46:10.839895 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1" Dec 05 21:46:10 crc kubenswrapper[4747]: E1205 21:46:10.841070 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:46:25 crc kubenswrapper[4747]: I1205 21:46:25.840532 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1" Dec 05 21:46:25 crc kubenswrapper[4747]: E1205 21:46:25.841803 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:46:38 crc kubenswrapper[4747]: I1205 21:46:38.840059 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1" Dec 05 21:46:38 crc kubenswrapper[4747]: E1205 21:46:38.841110 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:46:52 crc kubenswrapper[4747]: I1205 21:46:52.840282 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1" Dec 05 21:46:52 crc kubenswrapper[4747]: E1205 21:46:52.842284 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:47:04 crc kubenswrapper[4747]: I1205 21:47:04.839943 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1" Dec 05 21:47:04 crc kubenswrapper[4747]: E1205 21:47:04.842723 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:47:15 crc kubenswrapper[4747]: I1205 21:47:15.840826 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1" Dec 05 21:47:15 crc kubenswrapper[4747]: E1205 21:47:15.841621 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:47:30 crc kubenswrapper[4747]: I1205 21:47:30.840531 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1" Dec 05 21:47:30 crc kubenswrapper[4747]: E1205 21:47:30.841448 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:47:45 crc kubenswrapper[4747]: I1205 21:47:45.839932 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1" Dec 05 21:47:46 crc kubenswrapper[4747]: I1205 21:47:46.564161 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"8f1faf92325343c4891cdfd005cd84821af706e9b387417c7aa528f901ddce94"} Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.669474 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-btsmg"] Dec 05 21:50:05 crc kubenswrapper[4747]: E1205 21:50:05.670880 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33915fdb-23b0-4e02-ab20-19d5452e09ba" containerName="registry-server" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.670927 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="33915fdb-23b0-4e02-ab20-19d5452e09ba" containerName="registry-server" Dec 05 21:50:05 crc kubenswrapper[4747]: E1205 21:50:05.670972 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d598a617-d7ff-433a-9efb-f0d844f6ad27" containerName="registry-server" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.670985 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d598a617-d7ff-433a-9efb-f0d844f6ad27" containerName="registry-server" Dec 05 21:50:05 crc kubenswrapper[4747]: E1205 21:50:05.671005 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d598a617-d7ff-433a-9efb-f0d844f6ad27" containerName="extract-content" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.671017 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d598a617-d7ff-433a-9efb-f0d844f6ad27" containerName="extract-content" Dec 05 21:50:05 crc kubenswrapper[4747]: E1205 21:50:05.671040 4747 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="33915fdb-23b0-4e02-ab20-19d5452e09ba" containerName="extract-utilities" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.671052 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="33915fdb-23b0-4e02-ab20-19d5452e09ba" containerName="extract-utilities" Dec 05 21:50:05 crc kubenswrapper[4747]: E1205 21:50:05.671079 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d598a617-d7ff-433a-9efb-f0d844f6ad27" containerName="extract-utilities" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.671091 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d598a617-d7ff-433a-9efb-f0d844f6ad27" containerName="extract-utilities" Dec 05 21:50:05 crc kubenswrapper[4747]: E1205 21:50:05.671117 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33915fdb-23b0-4e02-ab20-19d5452e09ba" containerName="extract-content" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.671129 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="33915fdb-23b0-4e02-ab20-19d5452e09ba" containerName="extract-content" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.671397 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d598a617-d7ff-433a-9efb-f0d844f6ad27" containerName="registry-server" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.671457 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="33915fdb-23b0-4e02-ab20-19d5452e09ba" containerName="registry-server" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.673185 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.693498 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-btsmg"] Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.843422 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q8m2\" (UniqueName: \"kubernetes.io/projected/e667353a-03d7-41ba-9719-eb6b0dc1abe9-kube-api-access-2q8m2\") pod \"community-operators-btsmg\" (UID: \"e667353a-03d7-41ba-9719-eb6b0dc1abe9\") " pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.843518 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e667353a-03d7-41ba-9719-eb6b0dc1abe9-catalog-content\") pod \"community-operators-btsmg\" (UID: \"e667353a-03d7-41ba-9719-eb6b0dc1abe9\") " pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.843554 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e667353a-03d7-41ba-9719-eb6b0dc1abe9-utilities\") pod \"community-operators-btsmg\" (UID: \"e667353a-03d7-41ba-9719-eb6b0dc1abe9\") " pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.944593 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q8m2\" (UniqueName: \"kubernetes.io/projected/e667353a-03d7-41ba-9719-eb6b0dc1abe9-kube-api-access-2q8m2\") pod \"community-operators-btsmg\" (UID: \"e667353a-03d7-41ba-9719-eb6b0dc1abe9\") " pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:05 crc 
kubenswrapper[4747]: I1205 21:50:05.944660 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e667353a-03d7-41ba-9719-eb6b0dc1abe9-catalog-content\") pod \"community-operators-btsmg\" (UID: \"e667353a-03d7-41ba-9719-eb6b0dc1abe9\") " pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.944681 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e667353a-03d7-41ba-9719-eb6b0dc1abe9-utilities\") pod \"community-operators-btsmg\" (UID: \"e667353a-03d7-41ba-9719-eb6b0dc1abe9\") " pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.945167 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e667353a-03d7-41ba-9719-eb6b0dc1abe9-utilities\") pod \"community-operators-btsmg\" (UID: \"e667353a-03d7-41ba-9719-eb6b0dc1abe9\") " pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.945571 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e667353a-03d7-41ba-9719-eb6b0dc1abe9-catalog-content\") pod \"community-operators-btsmg\" (UID: \"e667353a-03d7-41ba-9719-eb6b0dc1abe9\") " pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.972160 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q8m2\" (UniqueName: \"kubernetes.io/projected/e667353a-03d7-41ba-9719-eb6b0dc1abe9-kube-api-access-2q8m2\") pod \"community-operators-btsmg\" (UID: \"e667353a-03d7-41ba-9719-eb6b0dc1abe9\") " pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:05 crc kubenswrapper[4747]: I1205 21:50:05.997604 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:06 crc kubenswrapper[4747]: I1205 21:50:06.222004 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:50:06 crc kubenswrapper[4747]: I1205 21:50:06.222082 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:50:06 crc kubenswrapper[4747]: I1205 21:50:06.615390 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-btsmg"] Dec 05 21:50:06 crc kubenswrapper[4747]: I1205 21:50:06.833204 4747 generic.go:334] "Generic (PLEG): container finished" podID="e667353a-03d7-41ba-9719-eb6b0dc1abe9" containerID="9dd00782b228dbe3a0178ad5b5fe617ab6eeb11723ae1d29c767c57cce67a52c" exitCode=0 Dec 05 21:50:06 crc kubenswrapper[4747]: I1205 21:50:06.833251 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btsmg" event={"ID":"e667353a-03d7-41ba-9719-eb6b0dc1abe9","Type":"ContainerDied","Data":"9dd00782b228dbe3a0178ad5b5fe617ab6eeb11723ae1d29c767c57cce67a52c"} Dec 05 21:50:06 crc kubenswrapper[4747]: I1205 21:50:06.833277 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btsmg" event={"ID":"e667353a-03d7-41ba-9719-eb6b0dc1abe9","Type":"ContainerStarted","Data":"d81823893c42d9f1de15437db2fb51600f18f388a821d9e30cb42952e9640385"} Dec 05 21:50:08 crc kubenswrapper[4747]: I1205 21:50:08.905334 4747 generic.go:334] "Generic (PLEG): container finished" podID="e667353a-03d7-41ba-9719-eb6b0dc1abe9" containerID="1fe7b80937239ccf03e9e4a49e7317b9a93f439fd3d55a946c0ef13ad81ad061" exitCode=0 Dec 05 21:50:08 crc kubenswrapper[4747]: I1205 21:50:08.905402 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btsmg" event={"ID":"e667353a-03d7-41ba-9719-eb6b0dc1abe9","Type":"ContainerDied","Data":"1fe7b80937239ccf03e9e4a49e7317b9a93f439fd3d55a946c0ef13ad81ad061"} Dec 05 21:50:09 crc kubenswrapper[4747]: I1205 21:50:09.916918 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btsmg" event={"ID":"e667353a-03d7-41ba-9719-eb6b0dc1abe9","Type":"ContainerStarted","Data":"59dc949486d58d3356da3e477d059bbfef727b09e47ea393f6528107f4451629"} Dec 05 21:50:09 crc kubenswrapper[4747]: I1205 21:50:09.950636 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-btsmg" podStartSLOduration=2.475654374 podStartE2EDuration="4.950570125s" podCreationTimestamp="2025-12-05 21:50:05 +0000 UTC" firstStartedPulling="2025-12-05 21:50:06.834515306 +0000 UTC m=+4077.301822794" lastFinishedPulling="2025-12-05 21:50:09.309431027 +0000 UTC m=+4079.776738545" observedRunningTime="2025-12-05 21:50:09.945366805 +0000 UTC m=+4080.412674303" watchObservedRunningTime="2025-12-05 21:50:09.950570125 +0000 UTC m=+4080.417877653" Dec 05 21:50:16 crc kubenswrapper[4747]: I1205 21:50:15.998092 4747 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:16 crc kubenswrapper[4747]: I1205 21:50:15.998774 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:16 crc kubenswrapper[4747]: I1205 21:50:16.095691 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:17 crc kubenswrapper[4747]: I1205 21:50:17.031623 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:17 crc kubenswrapper[4747]: I1205 21:50:17.086904 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-btsmg"] Dec 05 21:50:19 crc kubenswrapper[4747]: I1205 21:50:19.010363 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-btsmg" podUID="e667353a-03d7-41ba-9719-eb6b0dc1abe9" containerName="registry-server" containerID="cri-o://59dc949486d58d3356da3e477d059bbfef727b09e47ea393f6528107f4451629" gracePeriod=2 Dec 05 21:50:19 crc kubenswrapper[4747]: I1205 21:50:19.966620 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.021548 4747 generic.go:334] "Generic (PLEG): container finished" podID="e667353a-03d7-41ba-9719-eb6b0dc1abe9" containerID="59dc949486d58d3356da3e477d059bbfef727b09e47ea393f6528107f4451629" exitCode=0 Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.021655 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-btsmg" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.021650 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btsmg" event={"ID":"e667353a-03d7-41ba-9719-eb6b0dc1abe9","Type":"ContainerDied","Data":"59dc949486d58d3356da3e477d059bbfef727b09e47ea393f6528107f4451629"} Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.021844 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btsmg" event={"ID":"e667353a-03d7-41ba-9719-eb6b0dc1abe9","Type":"ContainerDied","Data":"d81823893c42d9f1de15437db2fb51600f18f388a821d9e30cb42952e9640385"} Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.021878 4747 scope.go:117] "RemoveContainer" containerID="59dc949486d58d3356da3e477d059bbfef727b09e47ea393f6528107f4451629" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.048332 4747 scope.go:117] "RemoveContainer" containerID="1fe7b80937239ccf03e9e4a49e7317b9a93f439fd3d55a946c0ef13ad81ad061" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.064696 4747 scope.go:117] "RemoveContainer" containerID="9dd00782b228dbe3a0178ad5b5fe617ab6eeb11723ae1d29c767c57cce67a52c" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.112458 4747 scope.go:117] "RemoveContainer" containerID="59dc949486d58d3356da3e477d059bbfef727b09e47ea393f6528107f4451629" Dec 05 21:50:20 crc kubenswrapper[4747]: E1205 21:50:20.113346 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59dc949486d58d3356da3e477d059bbfef727b09e47ea393f6528107f4451629\": container with ID starting with 
59dc949486d58d3356da3e477d059bbfef727b09e47ea393f6528107f4451629 not found: ID does not exist" containerID="59dc949486d58d3356da3e477d059bbfef727b09e47ea393f6528107f4451629" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.113446 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59dc949486d58d3356da3e477d059bbfef727b09e47ea393f6528107f4451629"} err="failed to get container status \"59dc949486d58d3356da3e477d059bbfef727b09e47ea393f6528107f4451629\": rpc error: code = NotFound desc = could not find container \"59dc949486d58d3356da3e477d059bbfef727b09e47ea393f6528107f4451629\": container with ID starting with 59dc949486d58d3356da3e477d059bbfef727b09e47ea393f6528107f4451629 not found: ID does not exist" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.113529 4747 scope.go:117] "RemoveContainer" containerID="1fe7b80937239ccf03e9e4a49e7317b9a93f439fd3d55a946c0ef13ad81ad061" Dec 05 21:50:20 crc kubenswrapper[4747]: E1205 21:50:20.114224 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fe7b80937239ccf03e9e4a49e7317b9a93f439fd3d55a946c0ef13ad81ad061\": container with ID starting with 1fe7b80937239ccf03e9e4a49e7317b9a93f439fd3d55a946c0ef13ad81ad061 not found: ID does not exist" containerID="1fe7b80937239ccf03e9e4a49e7317b9a93f439fd3d55a946c0ef13ad81ad061" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.114290 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe7b80937239ccf03e9e4a49e7317b9a93f439fd3d55a946c0ef13ad81ad061"} err="failed to get container status \"1fe7b80937239ccf03e9e4a49e7317b9a93f439fd3d55a946c0ef13ad81ad061\": rpc error: code = NotFound desc = could not find container \"1fe7b80937239ccf03e9e4a49e7317b9a93f439fd3d55a946c0ef13ad81ad061\": container with ID starting with 1fe7b80937239ccf03e9e4a49e7317b9a93f439fd3d55a946c0ef13ad81ad061 not found: ID does not exist" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.114332 4747 scope.go:117] "RemoveContainer" containerID="9dd00782b228dbe3a0178ad5b5fe617ab6eeb11723ae1d29c767c57cce67a52c" Dec 05 21:50:20 crc kubenswrapper[4747]: E1205 21:50:20.114762 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dd00782b228dbe3a0178ad5b5fe617ab6eeb11723ae1d29c767c57cce67a52c\": container with ID starting with 9dd00782b228dbe3a0178ad5b5fe617ab6eeb11723ae1d29c767c57cce67a52c not found: ID does not exist" containerID="9dd00782b228dbe3a0178ad5b5fe617ab6eeb11723ae1d29c767c57cce67a52c" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.114845 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd00782b228dbe3a0178ad5b5fe617ab6eeb11723ae1d29c767c57cce67a52c"} err="failed to get container status \"9dd00782b228dbe3a0178ad5b5fe617ab6eeb11723ae1d29c767c57cce67a52c\": rpc error: code = NotFound desc = could not find container \"9dd00782b228dbe3a0178ad5b5fe617ab6eeb11723ae1d29c767c57cce67a52c\": container with ID starting with 9dd00782b228dbe3a0178ad5b5fe617ab6eeb11723ae1d29c767c57cce67a52c not found: ID does not exist" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.166740 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q8m2\" (UniqueName: \"kubernetes.io/projected/e667353a-03d7-41ba-9719-eb6b0dc1abe9-kube-api-access-2q8m2\") pod 
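[Editor's note: the RemoveContainer / NotFound exchange above is benign: the kubelet asks the runtime for the status of a container it is deleting, gets NotFound because the container is already gone, and treats the deletion as complete. A minimal Go sketch of that tolerance; the runtime call is simulated with a canned gRPC status error, so this is illustrative, not kubelet source.]

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// alreadyGone reports whether an error from a container-status or remove call
// just means the container no longer exists, as in the log lines above.
func alreadyGone(err error) bool {
	return status.Code(err) == codes.NotFound
}

func main() {
	// stand-in for the runtime response seen above ("rpc error: code = NotFound ...")
	err := status.Error(codes.NotFound, "could not find container")
	if err != nil && !alreadyGone(err) {
		fmt.Println("real failure:", err)
		return
	}
	fmt.Println("container already removed; cleanup continues")
}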
\"e667353a-03d7-41ba-9719-eb6b0dc1abe9\" (UID: \"e667353a-03d7-41ba-9719-eb6b0dc1abe9\") " Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.167229 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e667353a-03d7-41ba-9719-eb6b0dc1abe9-utilities\") pod \"e667353a-03d7-41ba-9719-eb6b0dc1abe9\" (UID: \"e667353a-03d7-41ba-9719-eb6b0dc1abe9\") " Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.167280 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e667353a-03d7-41ba-9719-eb6b0dc1abe9-catalog-content\") pod \"e667353a-03d7-41ba-9719-eb6b0dc1abe9\" (UID: \"e667353a-03d7-41ba-9719-eb6b0dc1abe9\") " Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.169266 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e667353a-03d7-41ba-9719-eb6b0dc1abe9-utilities" (OuterVolumeSpecName: "utilities") pod "e667353a-03d7-41ba-9719-eb6b0dc1abe9" (UID: "e667353a-03d7-41ba-9719-eb6b0dc1abe9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.174092 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e667353a-03d7-41ba-9719-eb6b0dc1abe9-kube-api-access-2q8m2" (OuterVolumeSpecName: "kube-api-access-2q8m2") pod "e667353a-03d7-41ba-9719-eb6b0dc1abe9" (UID: "e667353a-03d7-41ba-9719-eb6b0dc1abe9"). InnerVolumeSpecName "kube-api-access-2q8m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.236319 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e667353a-03d7-41ba-9719-eb6b0dc1abe9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e667353a-03d7-41ba-9719-eb6b0dc1abe9" (UID: "e667353a-03d7-41ba-9719-eb6b0dc1abe9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.269745 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q8m2\" (UniqueName: \"kubernetes.io/projected/e667353a-03d7-41ba-9719-eb6b0dc1abe9-kube-api-access-2q8m2\") on node \"crc\" DevicePath \"\"" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.269777 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e667353a-03d7-41ba-9719-eb6b0dc1abe9-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.269789 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e667353a-03d7-41ba-9719-eb6b0dc1abe9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.368544 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-btsmg"] Dec 05 21:50:20 crc kubenswrapper[4747]: I1205 21:50:20.375533 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-btsmg"] Dec 05 21:50:21 crc kubenswrapper[4747]: I1205 21:50:21.852916 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e667353a-03d7-41ba-9719-eb6b0dc1abe9" path="/var/lib/kubelet/pods/e667353a-03d7-41ba-9719-eb6b0dc1abe9/volumes" Dec 05 21:50:36 crc kubenswrapper[4747]: I1205 21:50:36.221675 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:50:36 crc kubenswrapper[4747]: I1205 21:50:36.222370 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:51:06 crc kubenswrapper[4747]: I1205 21:51:06.221688 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:51:06 crc kubenswrapper[4747]: I1205 21:51:06.222335 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:51:06 crc kubenswrapper[4747]: I1205 21:51:06.222399 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 21:51:06 crc kubenswrapper[4747]: I1205 21:51:06.223195 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8f1faf92325343c4891cdfd005cd84821af706e9b387417c7aa528f901ddce94"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon 
failed liveness probe, will be restarted" Dec 05 21:51:06 crc kubenswrapper[4747]: I1205 21:51:06.223299 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://8f1faf92325343c4891cdfd005cd84821af706e9b387417c7aa528f901ddce94" gracePeriod=600 Dec 05 21:51:06 crc kubenswrapper[4747]: I1205 21:51:06.435157 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="8f1faf92325343c4891cdfd005cd84821af706e9b387417c7aa528f901ddce94" exitCode=0 Dec 05 21:51:06 crc kubenswrapper[4747]: I1205 21:51:06.435274 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"8f1faf92325343c4891cdfd005cd84821af706e9b387417c7aa528f901ddce94"} Dec 05 21:51:06 crc kubenswrapper[4747]: I1205 21:51:06.435524 4747 scope.go:117] "RemoveContainer" containerID="c63970aef806a3223274baa95126252ea2fa114a126b6c7c2dfe4def8b023fa1" Dec 05 21:51:07 crc kubenswrapper[4747]: I1205 21:51:07.447165 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150"} Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.011899 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2czt4"] Dec 05 21:52:42 crc kubenswrapper[4747]: E1205 21:52:42.012917 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e667353a-03d7-41ba-9719-eb6b0dc1abe9" containerName="extract-utilities" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.012939 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e667353a-03d7-41ba-9719-eb6b0dc1abe9" containerName="extract-utilities" Dec 05 21:52:42 crc kubenswrapper[4747]: E1205 21:52:42.012968 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e667353a-03d7-41ba-9719-eb6b0dc1abe9" containerName="extract-content" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.012980 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e667353a-03d7-41ba-9719-eb6b0dc1abe9" containerName="extract-content" Dec 05 21:52:42 crc kubenswrapper[4747]: E1205 21:52:42.012997 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e667353a-03d7-41ba-9719-eb6b0dc1abe9" containerName="registry-server" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.013008 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e667353a-03d7-41ba-9719-eb6b0dc1abe9" containerName="registry-server" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.013211 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e667353a-03d7-41ba-9719-eb6b0dc1abe9" containerName="registry-server" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.015645 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.023536 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2czt4"] Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.071423 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-catalog-content\") pod \"redhat-operators-2czt4\" (UID: \"e5f57ebe-5abf-4224-b459-d5bc2059cf1e\") " pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.071657 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-utilities\") pod \"redhat-operators-2czt4\" (UID: \"e5f57ebe-5abf-4224-b459-d5bc2059cf1e\") " pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.071698 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csdrg\" (UniqueName: \"kubernetes.io/projected/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-kube-api-access-csdrg\") pod \"redhat-operators-2czt4\" (UID: \"e5f57ebe-5abf-4224-b459-d5bc2059cf1e\") " pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.172614 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csdrg\" (UniqueName: \"kubernetes.io/projected/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-kube-api-access-csdrg\") pod \"redhat-operators-2czt4\" (UID: \"e5f57ebe-5abf-4224-b459-d5bc2059cf1e\") " pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.175559 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-utilities\") pod \"redhat-operators-2czt4\" (UID: \"e5f57ebe-5abf-4224-b459-d5bc2059cf1e\") " pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.175819 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-catalog-content\") pod \"redhat-operators-2czt4\" (UID: \"e5f57ebe-5abf-4224-b459-d5bc2059cf1e\") " pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.176162 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-utilities\") pod \"redhat-operators-2czt4\" (UID: \"e5f57ebe-5abf-4224-b459-d5bc2059cf1e\") " pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.176445 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-catalog-content\") pod \"redhat-operators-2czt4\" (UID: \"e5f57ebe-5abf-4224-b459-d5bc2059cf1e\") " pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.199022 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-csdrg\" (UniqueName: \"kubernetes.io/projected/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-kube-api-access-csdrg\") pod \"redhat-operators-2czt4\" (UID: \"e5f57ebe-5abf-4224-b459-d5bc2059cf1e\") " pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.355895 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:52:42 crc kubenswrapper[4747]: I1205 21:52:42.798236 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2czt4"] Dec 05 21:52:43 crc kubenswrapper[4747]: I1205 21:52:43.488487 4747 generic.go:334] "Generic (PLEG): container finished" podID="e5f57ebe-5abf-4224-b459-d5bc2059cf1e" containerID="6b6e572e92a25ac2d7a71fcc550577b2a0fafbe24549f09fc2a0a8ae7d75dcea" exitCode=0 Dec 05 21:52:43 crc kubenswrapper[4747]: I1205 21:52:43.488569 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2czt4" event={"ID":"e5f57ebe-5abf-4224-b459-d5bc2059cf1e","Type":"ContainerDied","Data":"6b6e572e92a25ac2d7a71fcc550577b2a0fafbe24549f09fc2a0a8ae7d75dcea"} Dec 05 21:52:43 crc kubenswrapper[4747]: I1205 21:52:43.488674 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2czt4" event={"ID":"e5f57ebe-5abf-4224-b459-d5bc2059cf1e","Type":"ContainerStarted","Data":"29e59c837806d30ad3b1c0d51e2d6a8fe2dd5a2ccdfc81b0b78dcd7e9f4f0e4b"} Dec 05 21:52:43 crc kubenswrapper[4747]: I1205 21:52:43.493182 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 21:52:44 crc kubenswrapper[4747]: I1205 21:52:44.500742 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2czt4" event={"ID":"e5f57ebe-5abf-4224-b459-d5bc2059cf1e","Type":"ContainerStarted","Data":"9880d6d938e061831dbb7e52b6b7177ee1dbc20fad161dd042516935a9f5a2c1"} Dec 05 21:52:45 crc kubenswrapper[4747]: I1205 21:52:45.511985 4747 generic.go:334] "Generic (PLEG): container finished" podID="e5f57ebe-5abf-4224-b459-d5bc2059cf1e" containerID="9880d6d938e061831dbb7e52b6b7177ee1dbc20fad161dd042516935a9f5a2c1" exitCode=0 Dec 05 21:52:45 crc kubenswrapper[4747]: I1205 21:52:45.512081 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2czt4" event={"ID":"e5f57ebe-5abf-4224-b459-d5bc2059cf1e","Type":"ContainerDied","Data":"9880d6d938e061831dbb7e52b6b7177ee1dbc20fad161dd042516935a9f5a2c1"} Dec 05 21:52:46 crc kubenswrapper[4747]: I1205 21:52:46.525451 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2czt4" event={"ID":"e5f57ebe-5abf-4224-b459-d5bc2059cf1e","Type":"ContainerStarted","Data":"d2ef933a2624eaa12b2a6dcb503b343be0111f206907041ff1d3c3561d19b76a"} Dec 05 21:52:46 crc kubenswrapper[4747]: I1205 21:52:46.563522 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2czt4" podStartSLOduration=3.073505014 podStartE2EDuration="5.563494662s" podCreationTimestamp="2025-12-05 21:52:41 +0000 UTC" firstStartedPulling="2025-12-05 21:52:43.492679514 +0000 UTC m=+4233.959987042" lastFinishedPulling="2025-12-05 21:52:45.982669162 +0000 UTC m=+4236.449976690" observedRunningTime="2025-12-05 21:52:46.553398169 +0000 UTC m=+4237.020705677" watchObservedRunningTime="2025-12-05 21:52:46.563494662 +0000 UTC m=+4237.030802190" Dec 05 21:52:52 crc 
kubenswrapper[4747]: I1205 21:52:52.356640 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:52:52 crc kubenswrapper[4747]: I1205 21:52:52.358130 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:52:53 crc kubenswrapper[4747]: I1205 21:52:53.416894 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2czt4" podUID="e5f57ebe-5abf-4224-b459-d5bc2059cf1e" containerName="registry-server" probeResult="failure" output=< Dec 05 21:52:53 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Dec 05 21:52:53 crc kubenswrapper[4747]: > Dec 05 21:53:02 crc kubenswrapper[4747]: I1205 21:53:02.419262 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:53:02 crc kubenswrapper[4747]: I1205 21:53:02.495470 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:53:02 crc kubenswrapper[4747]: I1205 21:53:02.686167 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2czt4"] Dec 05 21:53:03 crc kubenswrapper[4747]: I1205 21:53:03.701870 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2czt4" podUID="e5f57ebe-5abf-4224-b459-d5bc2059cf1e" containerName="registry-server" containerID="cri-o://d2ef933a2624eaa12b2a6dcb503b343be0111f206907041ff1d3c3561d19b76a" gracePeriod=2 Dec 05 21:53:04 crc kubenswrapper[4747]: I1205 21:53:04.713061 4747 generic.go:334] "Generic (PLEG): container finished" podID="e5f57ebe-5abf-4224-b459-d5bc2059cf1e" containerID="d2ef933a2624eaa12b2a6dcb503b343be0111f206907041ff1d3c3561d19b76a" exitCode=0 Dec 05 21:53:04 crc kubenswrapper[4747]: I1205 21:53:04.713175 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2czt4" event={"ID":"e5f57ebe-5abf-4224-b459-d5bc2059cf1e","Type":"ContainerDied","Data":"d2ef933a2624eaa12b2a6dcb503b343be0111f206907041ff1d3c3561d19b76a"} Dec 05 21:53:04 crc kubenswrapper[4747]: I1205 21:53:04.713487 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2czt4" event={"ID":"e5f57ebe-5abf-4224-b459-d5bc2059cf1e","Type":"ContainerDied","Data":"29e59c837806d30ad3b1c0d51e2d6a8fe2dd5a2ccdfc81b0b78dcd7e9f4f0e4b"} Dec 05 21:53:04 crc kubenswrapper[4747]: I1205 21:53:04.713512 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29e59c837806d30ad3b1c0d51e2d6a8fe2dd5a2ccdfc81b0b78dcd7e9f4f0e4b" Dec 05 21:53:04 crc kubenswrapper[4747]: I1205 21:53:04.766623 4747 util.go:48] "No ready sandbox for pod can be found. 
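[Editor's note: the startup probe output above ("failed to connect service \":50051\" within 1s") is the signature of a gRPC health check against the registry-server. A Go sketch of an equivalent check, assuming the pod exposes the standard grpc.health.v1 service on :50051; this is illustrative, not the actual probe binary.]

package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

func main() {
	// 1-second budget, matching the "within 1s" in the probe output above
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()

	conn, err := grpc.DialContext(ctx, "localhost:50051",
		grpc.WithTransportCredentials(insecure.NewCredentials()), grpc.WithBlock())
	if err != nil {
		fmt.Println("probe failure:", err) // analogous to the logged timeout
		return
	}
	defer conn.Close()

	resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
	if err != nil {
		fmt.Println("probe failure:", err)
		return
	}
	fmt.Println("status:", resp.GetStatus()) // SERVING once the catalog is ready
}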
Need to start a new one" pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:53:04 crc kubenswrapper[4747]: I1205 21:53:04.831867 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-utilities\") pod \"e5f57ebe-5abf-4224-b459-d5bc2059cf1e\" (UID: \"e5f57ebe-5abf-4224-b459-d5bc2059cf1e\") " Dec 05 21:53:04 crc kubenswrapper[4747]: I1205 21:53:04.831920 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csdrg\" (UniqueName: \"kubernetes.io/projected/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-kube-api-access-csdrg\") pod \"e5f57ebe-5abf-4224-b459-d5bc2059cf1e\" (UID: \"e5f57ebe-5abf-4224-b459-d5bc2059cf1e\") " Dec 05 21:53:04 crc kubenswrapper[4747]: I1205 21:53:04.831984 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-catalog-content\") pod \"e5f57ebe-5abf-4224-b459-d5bc2059cf1e\" (UID: \"e5f57ebe-5abf-4224-b459-d5bc2059cf1e\") " Dec 05 21:53:04 crc kubenswrapper[4747]: I1205 21:53:04.833121 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-utilities" (OuterVolumeSpecName: "utilities") pod "e5f57ebe-5abf-4224-b459-d5bc2059cf1e" (UID: "e5f57ebe-5abf-4224-b459-d5bc2059cf1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:53:04 crc kubenswrapper[4747]: I1205 21:53:04.839140 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-kube-api-access-csdrg" (OuterVolumeSpecName: "kube-api-access-csdrg") pod "e5f57ebe-5abf-4224-b459-d5bc2059cf1e" (UID: "e5f57ebe-5abf-4224-b459-d5bc2059cf1e"). InnerVolumeSpecName "kube-api-access-csdrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:53:04 crc kubenswrapper[4747]: I1205 21:53:04.932753 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csdrg\" (UniqueName: \"kubernetes.io/projected/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-kube-api-access-csdrg\") on node \"crc\" DevicePath \"\"" Dec 05 21:53:04 crc kubenswrapper[4747]: I1205 21:53:04.932789 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:53:04 crc kubenswrapper[4747]: I1205 21:53:04.977273 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5f57ebe-5abf-4224-b459-d5bc2059cf1e" (UID: "e5f57ebe-5abf-4224-b459-d5bc2059cf1e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:53:05 crc kubenswrapper[4747]: I1205 21:53:05.033788 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f57ebe-5abf-4224-b459-d5bc2059cf1e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:53:05 crc kubenswrapper[4747]: I1205 21:53:05.725423 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2czt4" Dec 05 21:53:05 crc kubenswrapper[4747]: I1205 21:53:05.788176 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2czt4"] Dec 05 21:53:05 crc kubenswrapper[4747]: I1205 21:53:05.799429 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2czt4"] Dec 05 21:53:05 crc kubenswrapper[4747]: I1205 21:53:05.850130 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f57ebe-5abf-4224-b459-d5bc2059cf1e" path="/var/lib/kubelet/pods/e5f57ebe-5abf-4224-b459-d5bc2059cf1e/volumes" Dec 05 21:53:06 crc kubenswrapper[4747]: I1205 21:53:06.222484 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:53:06 crc kubenswrapper[4747]: I1205 21:53:06.222571 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:53:36 crc kubenswrapper[4747]: I1205 21:53:36.222323 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:53:36 crc kubenswrapper[4747]: I1205 21:53:36.223002 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:54:06 crc kubenswrapper[4747]: I1205 21:54:06.222266 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 21:54:06 crc kubenswrapper[4747]: I1205 21:54:06.223050 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 21:54:06 crc kubenswrapper[4747]: I1205 21:54:06.223135 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 21:54:06 crc kubenswrapper[4747]: I1205 21:54:06.224038 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 
21:54:06 crc kubenswrapper[4747]: I1205 21:54:06.224141 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" gracePeriod=600 Dec 05 21:54:06 crc kubenswrapper[4747]: E1205 21:54:06.376309 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:54:07 crc kubenswrapper[4747]: I1205 21:54:07.349860 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" exitCode=0 Dec 05 21:54:07 crc kubenswrapper[4747]: I1205 21:54:07.349872 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150"} Dec 05 21:54:07 crc kubenswrapper[4747]: I1205 21:54:07.350017 4747 scope.go:117] "RemoveContainer" containerID="8f1faf92325343c4891cdfd005cd84821af706e9b387417c7aa528f901ddce94" Dec 05 21:54:07 crc kubenswrapper[4747]: I1205 21:54:07.350947 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:54:07 crc kubenswrapper[4747]: E1205 21:54:07.351534 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:54:17 crc kubenswrapper[4747]: I1205 21:54:17.840267 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:54:17 crc kubenswrapper[4747]: E1205 21:54:17.841350 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:54:32 crc kubenswrapper[4747]: I1205 21:54:32.840115 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:54:32 crc kubenswrapper[4747]: E1205 21:54:32.841213 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:54:43 crc kubenswrapper[4747]: I1205 21:54:43.840540 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:54:43 crc kubenswrapper[4747]: E1205 21:54:43.841678 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:54:56 crc kubenswrapper[4747]: I1205 21:54:56.840003 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:54:56 crc kubenswrapper[4747]: E1205 21:54:56.840798 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:55:07 crc kubenswrapper[4747]: I1205 21:55:07.839277 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:55:07 crc kubenswrapper[4747]: E1205 21:55:07.841143 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:55:19 crc kubenswrapper[4747]: I1205 21:55:19.847402 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:55:19 crc kubenswrapper[4747]: E1205 21:55:19.849252 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:55:31 crc kubenswrapper[4747]: I1205 21:55:31.840311 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:55:31 crc kubenswrapper[4747]: E1205 21:55:31.841289 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:55:43 crc kubenswrapper[4747]: I1205 21:55:43.839962 4747 
scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:55:43 crc kubenswrapper[4747]: E1205 21:55:43.841788 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.370926 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tv85j"] Dec 05 21:55:53 crc kubenswrapper[4747]: E1205 21:55:53.372315 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f57ebe-5abf-4224-b459-d5bc2059cf1e" containerName="registry-server" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.372352 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f57ebe-5abf-4224-b459-d5bc2059cf1e" containerName="registry-server" Dec 05 21:55:53 crc kubenswrapper[4747]: E1205 21:55:53.372389 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f57ebe-5abf-4224-b459-d5bc2059cf1e" containerName="extract-utilities" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.372407 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f57ebe-5abf-4224-b459-d5bc2059cf1e" containerName="extract-utilities" Dec 05 21:55:53 crc kubenswrapper[4747]: E1205 21:55:53.372445 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f57ebe-5abf-4224-b459-d5bc2059cf1e" containerName="extract-content" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.372464 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f57ebe-5abf-4224-b459-d5bc2059cf1e" containerName="extract-content" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.373025 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f57ebe-5abf-4224-b459-d5bc2059cf1e" containerName="registry-server" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.378282 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.381045 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tv85j"] Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.558223 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-catalog-content\") pod \"certified-operators-tv85j\" (UID: \"1f916ec8-7f47-402a-8a3f-fb774e03c9b1\") " pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.558678 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-utilities\") pod \"certified-operators-tv85j\" (UID: \"1f916ec8-7f47-402a-8a3f-fb774e03c9b1\") " pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.558861 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ncnw\" (UniqueName: \"kubernetes.io/projected/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-kube-api-access-7ncnw\") pod \"certified-operators-tv85j\" (UID: \"1f916ec8-7f47-402a-8a3f-fb774e03c9b1\") " pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.660370 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-catalog-content\") pod \"certified-operators-tv85j\" (UID: \"1f916ec8-7f47-402a-8a3f-fb774e03c9b1\") " pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.660742 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-utilities\") pod \"certified-operators-tv85j\" (UID: \"1f916ec8-7f47-402a-8a3f-fb774e03c9b1\") " pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.660883 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ncnw\" (UniqueName: \"kubernetes.io/projected/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-kube-api-access-7ncnw\") pod \"certified-operators-tv85j\" (UID: \"1f916ec8-7f47-402a-8a3f-fb774e03c9b1\") " pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.661085 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-catalog-content\") pod \"certified-operators-tv85j\" (UID: \"1f916ec8-7f47-402a-8a3f-fb774e03c9b1\") " pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.661370 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-utilities\") pod \"certified-operators-tv85j\" (UID: \"1f916ec8-7f47-402a-8a3f-fb774e03c9b1\") " pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.698458 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7ncnw\" (UniqueName: \"kubernetes.io/projected/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-kube-api-access-7ncnw\") pod \"certified-operators-tv85j\" (UID: \"1f916ec8-7f47-402a-8a3f-fb774e03c9b1\") " pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:55:53 crc kubenswrapper[4747]: I1205 21:55:53.711865 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:55:54 crc kubenswrapper[4747]: I1205 21:55:54.167846 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tv85j"] Dec 05 21:55:54 crc kubenswrapper[4747]: I1205 21:55:54.252830 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tv85j" event={"ID":"1f916ec8-7f47-402a-8a3f-fb774e03c9b1","Type":"ContainerStarted","Data":"fedf405d189bbf66796c9e275e6e36d183db5e8ed991645af0b86b8dab3e8af6"} Dec 05 21:55:55 crc kubenswrapper[4747]: I1205 21:55:55.265146 4747 generic.go:334] "Generic (PLEG): container finished" podID="1f916ec8-7f47-402a-8a3f-fb774e03c9b1" containerID="29ae2fa993ee78577c1d55d34d5a50c203d0ae6366aa21ffcc6f2f4d809f5b35" exitCode=0 Dec 05 21:55:55 crc kubenswrapper[4747]: I1205 21:55:55.265229 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tv85j" event={"ID":"1f916ec8-7f47-402a-8a3f-fb774e03c9b1","Type":"ContainerDied","Data":"29ae2fa993ee78577c1d55d34d5a50c203d0ae6366aa21ffcc6f2f4d809f5b35"} Dec 05 21:55:56 crc kubenswrapper[4747]: I1205 21:55:56.740775 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fp629"] Dec 05 21:55:56 crc kubenswrapper[4747]: I1205 21:55:56.743188 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:55:56 crc kubenswrapper[4747]: I1205 21:55:56.751766 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp629"] Dec 05 21:55:56 crc kubenswrapper[4747]: I1205 21:55:56.918735 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d30f66-dd41-47b9-9697-daad9f731b37-catalog-content\") pod \"redhat-marketplace-fp629\" (UID: \"d2d30f66-dd41-47b9-9697-daad9f731b37\") " pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:55:56 crc kubenswrapper[4747]: I1205 21:55:56.919093 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d30f66-dd41-47b9-9697-daad9f731b37-utilities\") pod \"redhat-marketplace-fp629\" (UID: \"d2d30f66-dd41-47b9-9697-daad9f731b37\") " pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:55:56 crc kubenswrapper[4747]: I1205 21:55:56.919295 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm4ld\" (UniqueName: \"kubernetes.io/projected/d2d30f66-dd41-47b9-9697-daad9f731b37-kube-api-access-vm4ld\") pod \"redhat-marketplace-fp629\" (UID: \"d2d30f66-dd41-47b9-9697-daad9f731b37\") " pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:55:57 crc kubenswrapper[4747]: I1205 21:55:57.020937 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d30f66-dd41-47b9-9697-daad9f731b37-catalog-content\") pod \"redhat-marketplace-fp629\" (UID: \"d2d30f66-dd41-47b9-9697-daad9f731b37\") " pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:55:57 crc kubenswrapper[4747]: I1205 21:55:57.021053 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d30f66-dd41-47b9-9697-daad9f731b37-utilities\") pod \"redhat-marketplace-fp629\" (UID: \"d2d30f66-dd41-47b9-9697-daad9f731b37\") " pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:55:57 crc kubenswrapper[4747]: I1205 21:55:57.021197 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm4ld\" (UniqueName: \"kubernetes.io/projected/d2d30f66-dd41-47b9-9697-daad9f731b37-kube-api-access-vm4ld\") pod \"redhat-marketplace-fp629\" (UID: \"d2d30f66-dd41-47b9-9697-daad9f731b37\") " pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:55:57 crc kubenswrapper[4747]: I1205 21:55:57.021551 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d30f66-dd41-47b9-9697-daad9f731b37-utilities\") pod \"redhat-marketplace-fp629\" (UID: \"d2d30f66-dd41-47b9-9697-daad9f731b37\") " pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:55:57 crc kubenswrapper[4747]: I1205 21:55:57.021763 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d30f66-dd41-47b9-9697-daad9f731b37-catalog-content\") pod \"redhat-marketplace-fp629\" (UID: \"d2d30f66-dd41-47b9-9697-daad9f731b37\") " pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:55:57 crc kubenswrapper[4747]: I1205 21:55:57.059108 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vm4ld\" (UniqueName: \"kubernetes.io/projected/d2d30f66-dd41-47b9-9697-daad9f731b37-kube-api-access-vm4ld\") pod \"redhat-marketplace-fp629\" (UID: \"d2d30f66-dd41-47b9-9697-daad9f731b37\") " pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:55:57 crc kubenswrapper[4747]: I1205 21:55:57.129998 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:55:57 crc kubenswrapper[4747]: I1205 21:55:57.291908 4747 generic.go:334] "Generic (PLEG): container finished" podID="1f916ec8-7f47-402a-8a3f-fb774e03c9b1" containerID="eaabe47fc01af01b261666a292874de1ffa62f7735d045ab0776dafbdee48e48" exitCode=0 Dec 05 21:55:57 crc kubenswrapper[4747]: I1205 21:55:57.292031 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tv85j" event={"ID":"1f916ec8-7f47-402a-8a3f-fb774e03c9b1","Type":"ContainerDied","Data":"eaabe47fc01af01b261666a292874de1ffa62f7735d045ab0776dafbdee48e48"} Dec 05 21:55:57 crc kubenswrapper[4747]: I1205 21:55:57.602174 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp629"] Dec 05 21:55:57 crc kubenswrapper[4747]: I1205 21:55:57.839210 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:55:57 crc kubenswrapper[4747]: E1205 21:55:57.839689 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:55:58 crc kubenswrapper[4747]: I1205 21:55:58.306816 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tv85j" event={"ID":"1f916ec8-7f47-402a-8a3f-fb774e03c9b1","Type":"ContainerStarted","Data":"25b9e22d501ccc6d42a1304c7bd59729f70a89e61cf6af74738595250f0f2e34"} Dec 05 21:55:58 crc kubenswrapper[4747]: I1205 21:55:58.309058 4747 generic.go:334] "Generic (PLEG): container finished" podID="d2d30f66-dd41-47b9-9697-daad9f731b37" containerID="b5ea48b2eb4330e30f4288a8842ea4ad7ceb58fab9b89bb02adf9b7415bd88a4" exitCode=0 Dec 05 21:55:58 crc kubenswrapper[4747]: I1205 21:55:58.309102 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp629" event={"ID":"d2d30f66-dd41-47b9-9697-daad9f731b37","Type":"ContainerDied","Data":"b5ea48b2eb4330e30f4288a8842ea4ad7ceb58fab9b89bb02adf9b7415bd88a4"} Dec 05 21:55:58 crc kubenswrapper[4747]: I1205 21:55:58.309150 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp629" event={"ID":"d2d30f66-dd41-47b9-9697-daad9f731b37","Type":"ContainerStarted","Data":"673e3979581fb43e0909341aa296fa1c8c656f081e67a329cc5deeea0c650abc"} Dec 05 21:55:58 crc kubenswrapper[4747]: I1205 21:55:58.345786 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tv85j" podStartSLOduration=2.876766819 podStartE2EDuration="5.345756273s" podCreationTimestamp="2025-12-05 21:55:53 +0000 UTC" firstStartedPulling="2025-12-05 21:55:55.267273253 +0000 UTC m=+4425.734580781" lastFinishedPulling="2025-12-05 
21:55:57.736262747 +0000 UTC m=+4428.203570235" observedRunningTime="2025-12-05 21:55:58.338407959 +0000 UTC m=+4428.805715477" watchObservedRunningTime="2025-12-05 21:55:58.345756273 +0000 UTC m=+4428.813063791" Dec 05 21:55:59 crc kubenswrapper[4747]: I1205 21:55:59.317308 4747 generic.go:334] "Generic (PLEG): container finished" podID="d2d30f66-dd41-47b9-9697-daad9f731b37" containerID="70ed6997e5b142607a19d301c096891d09df9198ff069d299c79176520a03b3b" exitCode=0 Dec 05 21:55:59 crc kubenswrapper[4747]: I1205 21:55:59.317400 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp629" event={"ID":"d2d30f66-dd41-47b9-9697-daad9f731b37","Type":"ContainerDied","Data":"70ed6997e5b142607a19d301c096891d09df9198ff069d299c79176520a03b3b"} Dec 05 21:56:00 crc kubenswrapper[4747]: I1205 21:56:00.331056 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp629" event={"ID":"d2d30f66-dd41-47b9-9697-daad9f731b37","Type":"ContainerStarted","Data":"ee198c69519e2f2fca492b1eecc06c380bd9ac5e1b2948b87e01425c29d3481d"} Dec 05 21:56:00 crc kubenswrapper[4747]: I1205 21:56:00.354969 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fp629" podStartSLOduration=2.937422013 podStartE2EDuration="4.354939524s" podCreationTimestamp="2025-12-05 21:55:56 +0000 UTC" firstStartedPulling="2025-12-05 21:55:58.310758207 +0000 UTC m=+4428.778065705" lastFinishedPulling="2025-12-05 21:55:59.728275718 +0000 UTC m=+4430.195583216" observedRunningTime="2025-12-05 21:56:00.349321414 +0000 UTC m=+4430.816628912" watchObservedRunningTime="2025-12-05 21:56:00.354939524 +0000 UTC m=+4430.822247022" Dec 05 21:56:03 crc kubenswrapper[4747]: I1205 21:56:03.712364 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:56:03 crc kubenswrapper[4747]: I1205 21:56:03.712612 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:56:03 crc kubenswrapper[4747]: I1205 21:56:03.791928 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:56:04 crc kubenswrapper[4747]: I1205 21:56:04.439063 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:56:05 crc kubenswrapper[4747]: I1205 21:56:05.938855 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tv85j"] Dec 05 21:56:06 crc kubenswrapper[4747]: I1205 21:56:06.403909 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tv85j" podUID="1f916ec8-7f47-402a-8a3f-fb774e03c9b1" containerName="registry-server" containerID="cri-o://25b9e22d501ccc6d42a1304c7bd59729f70a89e61cf6af74738595250f0f2e34" gracePeriod=2 Dec 05 21:56:07 crc kubenswrapper[4747]: I1205 21:56:07.130892 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:56:07 crc kubenswrapper[4747]: I1205 21:56:07.131201 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:56:07 crc kubenswrapper[4747]: I1205 21:56:07.177614 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
Dec 05 21:56:07 crc kubenswrapper[4747]: I1205 21:56:07.483449 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:56:08 crc kubenswrapper[4747]: I1205 21:56:08.424703 4747 generic.go:334] "Generic (PLEG): container finished" podID="1f916ec8-7f47-402a-8a3f-fb774e03c9b1" containerID="25b9e22d501ccc6d42a1304c7bd59729f70a89e61cf6af74738595250f0f2e34" exitCode=0 Dec 05 21:56:08 crc kubenswrapper[4747]: I1205 21:56:08.424808 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tv85j" event={"ID":"1f916ec8-7f47-402a-8a3f-fb774e03c9b1","Type":"ContainerDied","Data":"25b9e22d501ccc6d42a1304c7bd59729f70a89e61cf6af74738595250f0f2e34"} Dec 05 21:56:08 crc kubenswrapper[4747]: I1205 21:56:08.547135 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:56:08 crc kubenswrapper[4747]: I1205 21:56:08.698856 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ncnw\" (UniqueName: \"kubernetes.io/projected/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-kube-api-access-7ncnw\") pod \"1f916ec8-7f47-402a-8a3f-fb774e03c9b1\" (UID: \"1f916ec8-7f47-402a-8a3f-fb774e03c9b1\") " Dec 05 21:56:08 crc kubenswrapper[4747]: I1205 21:56:08.699407 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-utilities\") pod \"1f916ec8-7f47-402a-8a3f-fb774e03c9b1\" (UID: \"1f916ec8-7f47-402a-8a3f-fb774e03c9b1\") " Dec 05 21:56:08 crc kubenswrapper[4747]: I1205 21:56:08.699505 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-catalog-content\") pod \"1f916ec8-7f47-402a-8a3f-fb774e03c9b1\" (UID: \"1f916ec8-7f47-402a-8a3f-fb774e03c9b1\") " Dec 05 21:56:08 crc kubenswrapper[4747]: I1205 21:56:08.701301 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-utilities" (OuterVolumeSpecName: "utilities") pod "1f916ec8-7f47-402a-8a3f-fb774e03c9b1" (UID: "1f916ec8-7f47-402a-8a3f-fb774e03c9b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:56:08 crc kubenswrapper[4747]: I1205 21:56:08.712944 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-kube-api-access-7ncnw" (OuterVolumeSpecName: "kube-api-access-7ncnw") pod "1f916ec8-7f47-402a-8a3f-fb774e03c9b1" (UID: "1f916ec8-7f47-402a-8a3f-fb774e03c9b1"). InnerVolumeSpecName "kube-api-access-7ncnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:56:08 crc kubenswrapper[4747]: I1205 21:56:08.745784 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f916ec8-7f47-402a-8a3f-fb774e03c9b1" (UID: "1f916ec8-7f47-402a-8a3f-fb774e03c9b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
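
Note: the UnmountVolume / TearDown sequence above is the volume manager's reconciler at work: once the API delete removes certified-operators-tv85j from the desired state of the world, each of its still-mounted volumes is torn down, and only after a successful TearDown is the volume reported as detached (the reconciler_common.go:293 entries that follow). A rough sketch of that control loop, assuming a simplified Unmounter interface rather than kubelet's real operation executor types:

package main

import "fmt"

// Unmounter tears down one volume mount; loosely modeled on kubelet's
// operation executor. Illustrative stand-in, not the real interface.
type Unmounter interface {
	TearDown() error
}

type emptyDir struct{ name string }

func (e emptyDir) TearDown() error {
	fmt.Printf("UnmountVolume.TearDown succeeded for volume %q\n", e.name)
	return nil
}

// reconcile unmounts every volume that is still mounted (actual state)
// but no longer wanted (desired state), then marks it detached.
func reconcile(desired map[string]bool, actual map[string]Unmounter) {
	for name, vol := range actual {
		if desired[name] {
			continue // still in use by some pod
		}
		if err := vol.TearDown(); err != nil {
			fmt.Println("teardown failed, will retry:", err)
			continue
		}
		delete(actual, name)
		fmt.Printf("Volume detached for volume %q\n", name)
	}
}

func main() {
	actual := map[string]Unmounter{
		"utilities":       emptyDir{"utilities"},
		"catalog-content": emptyDir{"catalog-content"},
	}
	reconcile(map[string]bool{}, actual) // pod deleted: desired state is empty
}
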
Dec 05 21:56:08 crc kubenswrapper[4747]: I1205 21:56:08.801955 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:56:08 crc kubenswrapper[4747]: I1205 21:56:08.802021 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ncnw\" (UniqueName: \"kubernetes.io/projected/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-kube-api-access-7ncnw\") on node \"crc\" DevicePath \"\"" Dec 05 21:56:08 crc kubenswrapper[4747]: I1205 21:56:08.802043 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f916ec8-7f47-402a-8a3f-fb774e03c9b1-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:56:09 crc kubenswrapper[4747]: I1205 21:56:09.438688 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tv85j" event={"ID":"1f916ec8-7f47-402a-8a3f-fb774e03c9b1","Type":"ContainerDied","Data":"fedf405d189bbf66796c9e275e6e36d183db5e8ed991645af0b86b8dab3e8af6"} Dec 05 21:56:09 crc kubenswrapper[4747]: I1205 21:56:09.438743 4747 scope.go:117] "RemoveContainer" containerID="25b9e22d501ccc6d42a1304c7bd59729f70a89e61cf6af74738595250f0f2e34" Dec 05 21:56:09 crc kubenswrapper[4747]: I1205 21:56:09.438791 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tv85j" Dec 05 21:56:09 crc kubenswrapper[4747]: I1205 21:56:09.485573 4747 scope.go:117] "RemoveContainer" containerID="eaabe47fc01af01b261666a292874de1ffa62f7735d045ab0776dafbdee48e48" Dec 05 21:56:09 crc kubenswrapper[4747]: I1205 21:56:09.485608 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tv85j"] Dec 05 21:56:09 crc kubenswrapper[4747]: I1205 21:56:09.499036 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tv85j"] Dec 05 21:56:09 crc kubenswrapper[4747]: I1205 21:56:09.504086 4747 scope.go:117] "RemoveContainer" containerID="29ae2fa993ee78577c1d55d34d5a50c203d0ae6366aa21ffcc6f2f4d809f5b35" Dec 05 21:56:09 crc kubenswrapper[4747]: I1205 21:56:09.857907 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f916ec8-7f47-402a-8a3f-fb774e03c9b1" path="/var/lib/kubelet/pods/1f916ec8-7f47-402a-8a3f-fb774e03c9b1/volumes" Dec 05 21:56:10 crc kubenswrapper[4747]: I1205 21:56:10.840448 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:56:10 crc kubenswrapper[4747]: E1205 21:56:10.841220 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:56:11 crc kubenswrapper[4747]: I1205 21:56:11.940930 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp629"] Dec 05 21:56:11 crc kubenswrapper[4747]: I1205 21:56:11.941216 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fp629"
podUID="d2d30f66-dd41-47b9-9697-daad9f731b37" containerName="registry-server" containerID="cri-o://ee198c69519e2f2fca492b1eecc06c380bd9ac5e1b2948b87e01425c29d3481d" gracePeriod=2 Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.320359 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.458785 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d30f66-dd41-47b9-9697-daad9f731b37-utilities\") pod \"d2d30f66-dd41-47b9-9697-daad9f731b37\" (UID: \"d2d30f66-dd41-47b9-9697-daad9f731b37\") " Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.459918 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2d30f66-dd41-47b9-9697-daad9f731b37-utilities" (OuterVolumeSpecName: "utilities") pod "d2d30f66-dd41-47b9-9697-daad9f731b37" (UID: "d2d30f66-dd41-47b9-9697-daad9f731b37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.459999 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d30f66-dd41-47b9-9697-daad9f731b37-catalog-content\") pod \"d2d30f66-dd41-47b9-9697-daad9f731b37\" (UID: \"d2d30f66-dd41-47b9-9697-daad9f731b37\") " Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.460275 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm4ld\" (UniqueName: \"kubernetes.io/projected/d2d30f66-dd41-47b9-9697-daad9f731b37-kube-api-access-vm4ld\") pod \"d2d30f66-dd41-47b9-9697-daad9f731b37\" (UID: \"d2d30f66-dd41-47b9-9697-daad9f731b37\") " Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.461071 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d30f66-dd41-47b9-9697-daad9f731b37-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.472276 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d30f66-dd41-47b9-9697-daad9f731b37-kube-api-access-vm4ld" (OuterVolumeSpecName: "kube-api-access-vm4ld") pod "d2d30f66-dd41-47b9-9697-daad9f731b37" (UID: "d2d30f66-dd41-47b9-9697-daad9f731b37"). InnerVolumeSpecName "kube-api-access-vm4ld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.474758 4747 generic.go:334] "Generic (PLEG): container finished" podID="d2d30f66-dd41-47b9-9697-daad9f731b37" containerID="ee198c69519e2f2fca492b1eecc06c380bd9ac5e1b2948b87e01425c29d3481d" exitCode=0 Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.474816 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp629" event={"ID":"d2d30f66-dd41-47b9-9697-daad9f731b37","Type":"ContainerDied","Data":"ee198c69519e2f2fca492b1eecc06c380bd9ac5e1b2948b87e01425c29d3481d"} Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.474858 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp629" event={"ID":"d2d30f66-dd41-47b9-9697-daad9f731b37","Type":"ContainerDied","Data":"673e3979581fb43e0909341aa296fa1c8c656f081e67a329cc5deeea0c650abc"} Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.474889 4747 scope.go:117] "RemoveContainer" containerID="ee198c69519e2f2fca492b1eecc06c380bd9ac5e1b2948b87e01425c29d3481d" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.474958 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp629" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.486343 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2d30f66-dd41-47b9-9697-daad9f731b37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2d30f66-dd41-47b9-9697-daad9f731b37" (UID: "d2d30f66-dd41-47b9-9697-daad9f731b37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.511000 4747 scope.go:117] "RemoveContainer" containerID="70ed6997e5b142607a19d301c096891d09df9198ff069d299c79176520a03b3b" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.526706 4747 scope.go:117] "RemoveContainer" containerID="b5ea48b2eb4330e30f4288a8842ea4ad7ceb58fab9b89bb02adf9b7415bd88a4" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.548453 4747 scope.go:117] "RemoveContainer" containerID="ee198c69519e2f2fca492b1eecc06c380bd9ac5e1b2948b87e01425c29d3481d" Dec 05 21:56:12 crc kubenswrapper[4747]: E1205 21:56:12.549018 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee198c69519e2f2fca492b1eecc06c380bd9ac5e1b2948b87e01425c29d3481d\": container with ID starting with ee198c69519e2f2fca492b1eecc06c380bd9ac5e1b2948b87e01425c29d3481d not found: ID does not exist" containerID="ee198c69519e2f2fca492b1eecc06c380bd9ac5e1b2948b87e01425c29d3481d" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.549067 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee198c69519e2f2fca492b1eecc06c380bd9ac5e1b2948b87e01425c29d3481d"} err="failed to get container status \"ee198c69519e2f2fca492b1eecc06c380bd9ac5e1b2948b87e01425c29d3481d\": rpc error: code = NotFound desc = could not find container \"ee198c69519e2f2fca492b1eecc06c380bd9ac5e1b2948b87e01425c29d3481d\": container with ID starting with ee198c69519e2f2fca492b1eecc06c380bd9ac5e1b2948b87e01425c29d3481d not found: ID does not exist" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.549095 4747 scope.go:117] "RemoveContainer" containerID="70ed6997e5b142607a19d301c096891d09df9198ff069d299c79176520a03b3b" Dec 05 
21:56:12 crc kubenswrapper[4747]: E1205 21:56:12.549495 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70ed6997e5b142607a19d301c096891d09df9198ff069d299c79176520a03b3b\": container with ID starting with 70ed6997e5b142607a19d301c096891d09df9198ff069d299c79176520a03b3b not found: ID does not exist" containerID="70ed6997e5b142607a19d301c096891d09df9198ff069d299c79176520a03b3b" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.549522 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ed6997e5b142607a19d301c096891d09df9198ff069d299c79176520a03b3b"} err="failed to get container status \"70ed6997e5b142607a19d301c096891d09df9198ff069d299c79176520a03b3b\": rpc error: code = NotFound desc = could not find container \"70ed6997e5b142607a19d301c096891d09df9198ff069d299c79176520a03b3b\": container with ID starting with 70ed6997e5b142607a19d301c096891d09df9198ff069d299c79176520a03b3b not found: ID does not exist" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.549535 4747 scope.go:117] "RemoveContainer" containerID="b5ea48b2eb4330e30f4288a8842ea4ad7ceb58fab9b89bb02adf9b7415bd88a4" Dec 05 21:56:12 crc kubenswrapper[4747]: E1205 21:56:12.550095 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5ea48b2eb4330e30f4288a8842ea4ad7ceb58fab9b89bb02adf9b7415bd88a4\": container with ID starting with b5ea48b2eb4330e30f4288a8842ea4ad7ceb58fab9b89bb02adf9b7415bd88a4 not found: ID does not exist" containerID="b5ea48b2eb4330e30f4288a8842ea4ad7ceb58fab9b89bb02adf9b7415bd88a4" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.550139 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ea48b2eb4330e30f4288a8842ea4ad7ceb58fab9b89bb02adf9b7415bd88a4"} err="failed to get container status \"b5ea48b2eb4330e30f4288a8842ea4ad7ceb58fab9b89bb02adf9b7415bd88a4\": rpc error: code = NotFound desc = could not find container \"b5ea48b2eb4330e30f4288a8842ea4ad7ceb58fab9b89bb02adf9b7415bd88a4\": container with ID starting with b5ea48b2eb4330e30f4288a8842ea4ad7ceb58fab9b89bb02adf9b7415bd88a4 not found: ID does not exist" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.562218 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d30f66-dd41-47b9-9697-daad9f731b37-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.562368 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm4ld\" (UniqueName: \"kubernetes.io/projected/d2d30f66-dd41-47b9-9697-daad9f731b37-kube-api-access-vm4ld\") on node \"crc\" DevicePath \"\"" Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.817471 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp629"] Dec 05 21:56:12 crc kubenswrapper[4747]: I1205 21:56:12.824773 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp629"] Dec 05 21:56:13 crc kubenswrapper[4747]: I1205 21:56:13.855746 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2d30f66-dd41-47b9-9697-daad9f731b37" path="/var/lib/kubelet/pods/d2d30f66-dd41-47b9-9697-daad9f731b37/volumes" Dec 05 21:56:21 crc kubenswrapper[4747]: I1205 21:56:21.840439 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150"
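
Note: the E-level "ContainerStatus from runtime service failed ... NotFound" entries above are not real failures. RemoveContainer re-checks container status via CRI before deleting; a gRPC NotFound means CRI-O has already removed the container, so deletion is effectively idempotent. A sketch of that pattern (statusOf below stands in for the CRI ContainerStatus call, it is not the real client API):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer sketches the idempotent-delete pattern: a NotFound
// status from the runtime means the container is already gone, which
// counts as success for a removal.
func removeContainer(id string, statusOf func(string) error) error {
	if err := statusOf(id); err != nil {
		if status.Code(err) == codes.NotFound {
			return nil // already removed; nothing left to do
		}
		return fmt.Errorf("failed to get container status %q: %w", id, err)
	}
	// a real implementation would stop and remove the container here
	return nil
}

func main() {
	gone := func(id string) error {
		return status.Error(codes.NotFound, "could not find container "+id)
	}
	fmt.Println(removeContainer("ee198c695", gone)) // <nil>: treated as success
}
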
Dec 05 21:56:21 crc kubenswrapper[4747]: E1205 21:56:21.841748 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:56:35 crc kubenswrapper[4747]: I1205 21:56:35.840931 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:56:35 crc kubenswrapper[4747]: E1205 21:56:35.842464 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:56:47 crc kubenswrapper[4747]: I1205 21:56:47.840457 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:56:47 crc kubenswrapper[4747]: E1205 21:56:47.841634 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:57:00 crc kubenswrapper[4747]: I1205 21:57:00.840982 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:57:00 crc kubenswrapper[4747]: E1205 21:57:00.842082 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:57:13 crc kubenswrapper[4747]: I1205 21:57:13.841409 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:57:13 crc kubenswrapper[4747]: E1205 21:57:13.842406 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:57:27 crc kubenswrapper[4747]: I1205 21:57:27.839834 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:57:27 crc kubenswrapper[4747]: E1205 21:57:27.842927 4747 pod_workers.go:1301] "Error syncing pod,
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:57:42 crc kubenswrapper[4747]: I1205 21:57:42.839185 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:57:42 crc kubenswrapper[4747]: E1205 21:57:42.840107 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:57:55 crc kubenswrapper[4747]: I1205 21:57:55.840786 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:57:55 crc kubenswrapper[4747]: E1205 21:57:55.841937 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:58:06 crc kubenswrapper[4747]: I1205 21:58:06.840252 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:58:06 crc kubenswrapper[4747]: E1205 21:58:06.841304 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:58:21 crc kubenswrapper[4747]: I1205 21:58:21.839867 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:58:21 crc kubenswrapper[4747]: E1205 21:58:21.840494 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:58:36 crc kubenswrapper[4747]: I1205 21:58:36.839999 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:58:36 crc kubenswrapper[4747]: E1205 21:58:36.840940 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:58:50 crc kubenswrapper[4747]: I1205 21:58:50.839903 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:58:50 crc kubenswrapper[4747]: E1205 21:58:50.840866 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:59:01 crc kubenswrapper[4747]: I1205 21:59:01.839995 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:59:01 crc kubenswrapper[4747]: E1205 21:59:01.842004 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 21:59:14 crc kubenswrapper[4747]: I1205 21:59:14.839956 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 21:59:15 crc kubenswrapper[4747]: I1205 21:59:15.121078 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"1919542fc50eb5e519160ffa50e5548bea4940791a48a8736c96f1ec5c6138da"} Dec 05 21:59:28 crc kubenswrapper[4747]: I1205 21:59:28.313181 4747 scope.go:117] "RemoveContainer" containerID="d2ef933a2624eaa12b2a6dcb503b343be0111f206907041ff1d3c3561d19b76a" Dec 05 21:59:28 crc kubenswrapper[4747]: I1205 21:59:28.339701 4747 scope.go:117] "RemoveContainer" containerID="6b6e572e92a25ac2d7a71fcc550577b2a0fafbe24549f09fc2a0a8ae7d75dcea" Dec 05 21:59:28 crc kubenswrapper[4747]: I1205 21:59:28.354092 4747 scope.go:117] "RemoveContainer" containerID="9880d6d938e061831dbb7e52b6b7177ee1dbc20fad161dd042516935a9f5a2c1" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.188058 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp"] Dec 05 22:00:00 crc kubenswrapper[4747]: E1205 22:00:00.188974 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d30f66-dd41-47b9-9697-daad9f731b37" containerName="registry-server" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.188994 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d30f66-dd41-47b9-9697-daad9f731b37" containerName="registry-server" Dec 05 22:00:00 crc kubenswrapper[4747]: E1205 22:00:00.189008 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f916ec8-7f47-402a-8a3f-fb774e03c9b1" containerName="registry-server" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.189016 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f916ec8-7f47-402a-8a3f-fb774e03c9b1" containerName="registry-server"
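
Note: the RemoveContainer / "Error syncing pod, skipping ... CrashLoopBackOff" pairs from 21:56:21 through 21:59:01 repeat because every sync attempt lands inside the restart back-off window; only at 21:59:14 does the restart go through (the ContainerStarted event at 21:59:15). Kubelet's crash-loop back-off doubles per restart up to the 5m0s cap quoted in the messages. A sketch of such a capped exponential back-off; the 10s initial delay and doubling are the commonly documented defaults and should be read as an assumption, not as something this log states:

package main

import (
	"fmt"
	"time"
)

// backoff returns a capped exponential restart delay in the style of
// kubelet's CrashLoopBackOff. Initial delay and doubling are assumed
// defaults; the 5m cap is the "back-off 5m0s" visible in the log.
func backoff(restarts int) time.Duration {
	const (
		initial  = 10 * time.Second
		maxDelay = 5 * time.Minute
	)
	d := initial << uint(restarts) // 10s, 20s, 40s, ...
	if d <= 0 || d > maxDelay {    // d <= 0 guards against shift overflow
		return maxDelay
	}
	return d
}

func main() {
	for i := 0; i < 7; i++ {
		fmt.Println(i, backoff(i)) // reaches the 5m0s cap at the fifth restart
	}
}
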
Dec 05 22:00:00 crc kubenswrapper[4747]: E1205 22:00:00.189037 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f916ec8-7f47-402a-8a3f-fb774e03c9b1" containerName="extract-content" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.189046 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f916ec8-7f47-402a-8a3f-fb774e03c9b1" containerName="extract-content" Dec 05 22:00:00 crc kubenswrapper[4747]: E1205 22:00:00.189059 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f916ec8-7f47-402a-8a3f-fb774e03c9b1" containerName="extract-utilities" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.189069 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f916ec8-7f47-402a-8a3f-fb774e03c9b1" containerName="extract-utilities" Dec 05 22:00:00 crc kubenswrapper[4747]: E1205 22:00:00.189103 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d30f66-dd41-47b9-9697-daad9f731b37" containerName="extract-utilities" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.189111 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d30f66-dd41-47b9-9697-daad9f731b37" containerName="extract-utilities" Dec 05 22:00:00 crc kubenswrapper[4747]: E1205 22:00:00.189128 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d30f66-dd41-47b9-9697-daad9f731b37" containerName="extract-content" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.189135 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d30f66-dd41-47b9-9697-daad9f731b37" containerName="extract-content" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.189319 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f916ec8-7f47-402a-8a3f-fb774e03c9b1" containerName="registry-server" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.189344 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d30f66-dd41-47b9-9697-daad9f731b37" containerName="registry-server" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.189994 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp"
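
Note: the cpu_manager/memory_manager "RemoveStaleState" burst above is triggered by admission of the new collect-profiles pod: before computing new assignments, both managers drop checkpointed per-container state belonging to pods that no longer exist (here the two marketplace pods deleted earlier); the E-level lines log each removal, they are not failures. A sketch of the idea, with illustrative stand-in types rather than kubelet's real state package:

package main

import "fmt"

// containerKey identifies checkpointed per-container state.
type containerKey struct{ podUID, containerName string }

// removeStaleState drops resource assignments whose pod is no longer
// active; loosely modeled on the cpu_manager/state_mem entries above.
func removeStaleState(assignments map[containerKey][]int, active map[string]bool) {
	for key := range assignments {
		if active[key.podUID] {
			continue
		}
		fmt.Printf("RemoveStaleState: removing container pod=%s name=%s\n",
			key.podUID, key.containerName)
		delete(assignments, key) // "Deleted CPUSet assignment"
	}
}

func main() {
	assignments := map[containerKey][]int{
		{"1f916ec8", "registry-server"}: {2, 3},
		{"d2d30f66", "registry-server"}: {4, 5},
	}
	removeStaleState(assignments, map[string]bool{"d3754eb2": true})
	fmt.Println(len(assignments)) // 0: both stale entries removed
}
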
Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.192652 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.193209 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.209453 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp"] Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.299165 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3754eb2-0273-45eb-8121-cf06e629ea9b-config-volume\") pod \"collect-profiles-29416200-55mpp\" (UID: \"d3754eb2-0273-45eb-8121-cf06e629ea9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.299245 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3754eb2-0273-45eb-8121-cf06e629ea9b-secret-volume\") pod \"collect-profiles-29416200-55mpp\" (UID: \"d3754eb2-0273-45eb-8121-cf06e629ea9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.299417 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9whw7\" (UniqueName: \"kubernetes.io/projected/d3754eb2-0273-45eb-8121-cf06e629ea9b-kube-api-access-9whw7\") pod \"collect-profiles-29416200-55mpp\" (UID: \"d3754eb2-0273-45eb-8121-cf06e629ea9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.401116 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9whw7\" (UniqueName: \"kubernetes.io/projected/d3754eb2-0273-45eb-8121-cf06e629ea9b-kube-api-access-9whw7\") pod \"collect-profiles-29416200-55mpp\" (UID: \"d3754eb2-0273-45eb-8121-cf06e629ea9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.401199 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3754eb2-0273-45eb-8121-cf06e629ea9b-config-volume\") pod \"collect-profiles-29416200-55mpp\" (UID: \"d3754eb2-0273-45eb-8121-cf06e629ea9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.401231 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3754eb2-0273-45eb-8121-cf06e629ea9b-secret-volume\") pod \"collect-profiles-29416200-55mpp\" (UID: \"d3754eb2-0273-45eb-8121-cf06e629ea9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.402257 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3754eb2-0273-45eb-8121-cf06e629ea9b-config-volume\") pod
\"collect-profiles-29416200-55mpp\" (UID: \"d3754eb2-0273-45eb-8121-cf06e629ea9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.409780 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3754eb2-0273-45eb-8121-cf06e629ea9b-secret-volume\") pod \"collect-profiles-29416200-55mpp\" (UID: \"d3754eb2-0273-45eb-8121-cf06e629ea9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.417564 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9whw7\" (UniqueName: \"kubernetes.io/projected/d3754eb2-0273-45eb-8121-cf06e629ea9b-kube-api-access-9whw7\") pod \"collect-profiles-29416200-55mpp\" (UID: \"d3754eb2-0273-45eb-8121-cf06e629ea9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp" Dec 05 22:00:00 crc kubenswrapper[4747]: I1205 22:00:00.534146 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp" Dec 05 22:00:01 crc kubenswrapper[4747]: I1205 22:00:01.010656 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp"] Dec 05 22:00:01 crc kubenswrapper[4747]: I1205 22:00:01.572935 4747 generic.go:334] "Generic (PLEG): container finished" podID="d3754eb2-0273-45eb-8121-cf06e629ea9b" containerID="0cd9464809c8414dbd6295011a20d4651ba0666075fe742473aef8fc9962dc38" exitCode=0 Dec 05 22:00:01 crc kubenswrapper[4747]: I1205 22:00:01.573065 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp" event={"ID":"d3754eb2-0273-45eb-8121-cf06e629ea9b","Type":"ContainerDied","Data":"0cd9464809c8414dbd6295011a20d4651ba0666075fe742473aef8fc9962dc38"} Dec 05 22:00:01 crc kubenswrapper[4747]: I1205 22:00:01.575194 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp" event={"ID":"d3754eb2-0273-45eb-8121-cf06e629ea9b","Type":"ContainerStarted","Data":"4c77c5538debe13411b13d54428a7d12b0ead3ea4a56e6a44a851340ee2a62ed"} Dec 05 22:00:03 crc kubenswrapper[4747]: I1205 22:00:03.466490 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp" Dec 05 22:00:03 crc kubenswrapper[4747]: I1205 22:00:03.555654 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3754eb2-0273-45eb-8121-cf06e629ea9b-secret-volume\") pod \"d3754eb2-0273-45eb-8121-cf06e629ea9b\" (UID: \"d3754eb2-0273-45eb-8121-cf06e629ea9b\") " Dec 05 22:00:03 crc kubenswrapper[4747]: I1205 22:00:03.555756 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9whw7\" (UniqueName: \"kubernetes.io/projected/d3754eb2-0273-45eb-8121-cf06e629ea9b-kube-api-access-9whw7\") pod \"d3754eb2-0273-45eb-8121-cf06e629ea9b\" (UID: \"d3754eb2-0273-45eb-8121-cf06e629ea9b\") " Dec 05 22:00:03 crc kubenswrapper[4747]: I1205 22:00:03.555799 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3754eb2-0273-45eb-8121-cf06e629ea9b-config-volume\") pod \"d3754eb2-0273-45eb-8121-cf06e629ea9b\" (UID: \"d3754eb2-0273-45eb-8121-cf06e629ea9b\") " Dec 05 22:00:03 crc kubenswrapper[4747]: I1205 22:00:03.556658 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3754eb2-0273-45eb-8121-cf06e629ea9b-config-volume" (OuterVolumeSpecName: "config-volume") pod "d3754eb2-0273-45eb-8121-cf06e629ea9b" (UID: "d3754eb2-0273-45eb-8121-cf06e629ea9b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:00:03 crc kubenswrapper[4747]: I1205 22:00:03.561608 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3754eb2-0273-45eb-8121-cf06e629ea9b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d3754eb2-0273-45eb-8121-cf06e629ea9b" (UID: "d3754eb2-0273-45eb-8121-cf06e629ea9b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:00:03 crc kubenswrapper[4747]: I1205 22:00:03.561619 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3754eb2-0273-45eb-8121-cf06e629ea9b-kube-api-access-9whw7" (OuterVolumeSpecName: "kube-api-access-9whw7") pod "d3754eb2-0273-45eb-8121-cf06e629ea9b" (UID: "d3754eb2-0273-45eb-8121-cf06e629ea9b"). InnerVolumeSpecName "kube-api-access-9whw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:00:03 crc kubenswrapper[4747]: I1205 22:00:03.602420 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp" event={"ID":"d3754eb2-0273-45eb-8121-cf06e629ea9b","Type":"ContainerDied","Data":"4c77c5538debe13411b13d54428a7d12b0ead3ea4a56e6a44a851340ee2a62ed"} Dec 05 22:00:03 crc kubenswrapper[4747]: I1205 22:00:03.602484 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c77c5538debe13411b13d54428a7d12b0ead3ea4a56e6a44a851340ee2a62ed" Dec 05 22:00:03 crc kubenswrapper[4747]: I1205 22:00:03.602510 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp" Dec 05 22:00:03 crc kubenswrapper[4747]: I1205 22:00:03.658126 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3754eb2-0273-45eb-8121-cf06e629ea9b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 22:00:03 crc kubenswrapper[4747]: I1205 22:00:03.658170 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9whw7\" (UniqueName: \"kubernetes.io/projected/d3754eb2-0273-45eb-8121-cf06e629ea9b-kube-api-access-9whw7\") on node \"crc\" DevicePath \"\"" Dec 05 22:00:03 crc kubenswrapper[4747]: I1205 22:00:03.658182 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3754eb2-0273-45eb-8121-cf06e629ea9b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 22:00:04 crc kubenswrapper[4747]: I1205 22:00:04.554650 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8"] Dec 05 22:00:04 crc kubenswrapper[4747]: I1205 22:00:04.565227 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416155-j69p8"] Dec 05 22:00:05 crc kubenswrapper[4747]: I1205 22:00:05.857345 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dbe0335-bf8b-42a9-9bbc-271a8b384552" path="/var/lib/kubelet/pods/1dbe0335-bf8b-42a9-9bbc-271a8b384552/volumes" Dec 05 22:00:28 crc kubenswrapper[4747]: I1205 22:00:28.400100 4747 scope.go:117] "RemoveContainer" containerID="a2ce5a2160284096c01d20b843db02a5d0abc4c2245f0a93699eb8e2926e7ca6" Dec 05 22:01:25 crc kubenswrapper[4747]: I1205 22:01:25.941926 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2rft9"] Dec 05 22:01:25 crc kubenswrapper[4747]: E1205 22:01:25.942820 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3754eb2-0273-45eb-8121-cf06e629ea9b" containerName="collect-profiles" Dec 05 22:01:25 crc kubenswrapper[4747]: I1205 22:01:25.942838 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3754eb2-0273-45eb-8121-cf06e629ea9b" containerName="collect-profiles" Dec 05 22:01:25 crc kubenswrapper[4747]: I1205 22:01:25.943004 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3754eb2-0273-45eb-8121-cf06e629ea9b" containerName="collect-profiles" Dec 05 22:01:25 crc kubenswrapper[4747]: I1205 22:01:25.944250 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:25 crc kubenswrapper[4747]: I1205 22:01:25.977967 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2rft9"] Dec 05 22:01:26 crc kubenswrapper[4747]: I1205 22:01:26.080292 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6916b8f7-3fe5-4412-b5b6-ecd63414667e-catalog-content\") pod \"community-operators-2rft9\" (UID: \"6916b8f7-3fe5-4412-b5b6-ecd63414667e\") " pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:26 crc kubenswrapper[4747]: I1205 22:01:26.080340 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6916b8f7-3fe5-4412-b5b6-ecd63414667e-utilities\") pod \"community-operators-2rft9\" (UID: \"6916b8f7-3fe5-4412-b5b6-ecd63414667e\") " pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:26 crc kubenswrapper[4747]: I1205 22:01:26.080886 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9hwh\" (UniqueName: \"kubernetes.io/projected/6916b8f7-3fe5-4412-b5b6-ecd63414667e-kube-api-access-q9hwh\") pod \"community-operators-2rft9\" (UID: \"6916b8f7-3fe5-4412-b5b6-ecd63414667e\") " pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:26 crc kubenswrapper[4747]: I1205 22:01:26.182481 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6916b8f7-3fe5-4412-b5b6-ecd63414667e-catalog-content\") pod \"community-operators-2rft9\" (UID: \"6916b8f7-3fe5-4412-b5b6-ecd63414667e\") " pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:26 crc kubenswrapper[4747]: I1205 22:01:26.182535 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6916b8f7-3fe5-4412-b5b6-ecd63414667e-utilities\") pod \"community-operators-2rft9\" (UID: \"6916b8f7-3fe5-4412-b5b6-ecd63414667e\") " pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:26 crc kubenswrapper[4747]: I1205 22:01:26.182600 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9hwh\" (UniqueName: \"kubernetes.io/projected/6916b8f7-3fe5-4412-b5b6-ecd63414667e-kube-api-access-q9hwh\") pod \"community-operators-2rft9\" (UID: \"6916b8f7-3fe5-4412-b5b6-ecd63414667e\") " pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:26 crc kubenswrapper[4747]: I1205 22:01:26.183032 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6916b8f7-3fe5-4412-b5b6-ecd63414667e-catalog-content\") pod \"community-operators-2rft9\" (UID: \"6916b8f7-3fe5-4412-b5b6-ecd63414667e\") " pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:26 crc kubenswrapper[4747]: I1205 22:01:26.183086 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6916b8f7-3fe5-4412-b5b6-ecd63414667e-utilities\") pod \"community-operators-2rft9\" (UID: \"6916b8f7-3fe5-4412-b5b6-ecd63414667e\") " pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:26 crc kubenswrapper[4747]: I1205 22:01:26.207049 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-q9hwh\" (UniqueName: \"kubernetes.io/projected/6916b8f7-3fe5-4412-b5b6-ecd63414667e-kube-api-access-q9hwh\") pod \"community-operators-2rft9\" (UID: \"6916b8f7-3fe5-4412-b5b6-ecd63414667e\") " pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:26 crc kubenswrapper[4747]: I1205 22:01:26.282705 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:26 crc kubenswrapper[4747]: I1205 22:01:26.803496 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2rft9"] Dec 05 22:01:26 crc kubenswrapper[4747]: W1205 22:01:26.809318 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6916b8f7_3fe5_4412_b5b6_ecd63414667e.slice/crio-2ef543194ab62255dd6698069c49c619e521ab3766e6cf10f7104704aff29732 WatchSource:0}: Error finding container 2ef543194ab62255dd6698069c49c619e521ab3766e6cf10f7104704aff29732: Status 404 returned error can't find the container with id 2ef543194ab62255dd6698069c49c619e521ab3766e6cf10f7104704aff29732 Dec 05 22:01:27 crc kubenswrapper[4747]: I1205 22:01:27.341380 4747 generic.go:334] "Generic (PLEG): container finished" podID="6916b8f7-3fe5-4412-b5b6-ecd63414667e" containerID="b932db5fd95d33992302081cd2550392a2820eb14dbd4c29feb864bc4bb41a1b" exitCode=0 Dec 05 22:01:27 crc kubenswrapper[4747]: I1205 22:01:27.343126 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rft9" event={"ID":"6916b8f7-3fe5-4412-b5b6-ecd63414667e","Type":"ContainerDied","Data":"b932db5fd95d33992302081cd2550392a2820eb14dbd4c29feb864bc4bb41a1b"} Dec 05 22:01:27 crc kubenswrapper[4747]: I1205 22:01:27.344827 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rft9" event={"ID":"6916b8f7-3fe5-4412-b5b6-ecd63414667e","Type":"ContainerStarted","Data":"2ef543194ab62255dd6698069c49c619e521ab3766e6cf10f7104704aff29732"} Dec 05 22:01:27 crc kubenswrapper[4747]: I1205 22:01:27.344527 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 22:01:28 crc kubenswrapper[4747]: I1205 22:01:28.350826 4747 generic.go:334] "Generic (PLEG): container finished" podID="6916b8f7-3fe5-4412-b5b6-ecd63414667e" containerID="9a5c7a4fabdad82fd526fb8107cf577158c3b645ed2012f7a6daab993bf15a05" exitCode=0 Dec 05 22:01:28 crc kubenswrapper[4747]: I1205 22:01:28.350870 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rft9" event={"ID":"6916b8f7-3fe5-4412-b5b6-ecd63414667e","Type":"ContainerDied","Data":"9a5c7a4fabdad82fd526fb8107cf577158c3b645ed2012f7a6daab993bf15a05"} Dec 05 22:01:29 crc kubenswrapper[4747]: I1205 22:01:29.362809 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rft9" event={"ID":"6916b8f7-3fe5-4412-b5b6-ecd63414667e","Type":"ContainerStarted","Data":"ca985d2e97da48c9311386c286be882ac55946e61e7d8ee7ddeea371369af300"} Dec 05 22:01:29 crc kubenswrapper[4747]: I1205 22:01:29.382631 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2rft9" podStartSLOduration=2.721727194 podStartE2EDuration="4.382613338s" podCreationTimestamp="2025-12-05 22:01:25 +0000 UTC" firstStartedPulling="2025-12-05 22:01:27.344242656 +0000 UTC 
m=+4757.811550154" lastFinishedPulling="2025-12-05 22:01:29.00512881 +0000 UTC m=+4759.472436298" observedRunningTime="2025-12-05 22:01:29.379662664 +0000 UTC m=+4759.846970202" watchObservedRunningTime="2025-12-05 22:01:29.382613338 +0000 UTC m=+4759.849920826" Dec 05 22:01:36 crc kubenswrapper[4747]: I1205 22:01:36.221951 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:01:36 crc kubenswrapper[4747]: I1205 22:01:36.222528 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:01:36 crc kubenswrapper[4747]: I1205 22:01:36.283178 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:36 crc kubenswrapper[4747]: I1205 22:01:36.283337 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:36 crc kubenswrapper[4747]: I1205 22:01:36.365449 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:36 crc kubenswrapper[4747]: I1205 22:01:36.490139 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:36 crc kubenswrapper[4747]: I1205 22:01:36.620213 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2rft9"] Dec 05 22:01:38 crc kubenswrapper[4747]: I1205 22:01:38.440782 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2rft9" podUID="6916b8f7-3fe5-4412-b5b6-ecd63414667e" containerName="registry-server" containerID="cri-o://ca985d2e97da48c9311386c286be882ac55946e61e7d8ee7ddeea371369af300" gracePeriod=2 Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.347301 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2rft9" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.451378 4747 generic.go:334] "Generic (PLEG): container finished" podID="6916b8f7-3fe5-4412-b5b6-ecd63414667e" containerID="ca985d2e97da48c9311386c286be882ac55946e61e7d8ee7ddeea371369af300" exitCode=0 Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.451421 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rft9" event={"ID":"6916b8f7-3fe5-4412-b5b6-ecd63414667e","Type":"ContainerDied","Data":"ca985d2e97da48c9311386c286be882ac55946e61e7d8ee7ddeea371369af300"} Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.451457 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2rft9"
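
Note: the patch_prober/prober entries above show an HTTP liveness probe failing with "connection refused" against 127.0.0.1:8798 while machine-config-daemon is between restarts; the startup/readiness transitions for community-operators-2rft9 are the normal probe lifecycle. A minimal sketch of an HTTP GET probe of that shape (the URL is taken from the log; the helper itself is illustrative, not kubelet's prober):

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeHTTP mimics the shape of an HTTP liveness check: GET the endpoint
// and treat a 2xx/3xx answer as success, anything else as failure.
func probeHTTP(url string) (string, error) {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return "failure", err // e.g. "connect: connection refused"
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return "success", nil
	}
	return "failure", fmt.Errorf("unexpected status %d", resp.StatusCode)
}

func main() {
	result, err := probeHTTP("http://127.0.0.1:8798/health")
	fmt.Println(result, err) // "failure ... connection refused" while the daemon is down
}
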
Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.451485 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rft9" event={"ID":"6916b8f7-3fe5-4412-b5b6-ecd63414667e","Type":"ContainerDied","Data":"2ef543194ab62255dd6698069c49c619e521ab3766e6cf10f7104704aff29732"} Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.451504 4747 scope.go:117] "RemoveContainer" containerID="ca985d2e97da48c9311386c286be882ac55946e61e7d8ee7ddeea371369af300" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.474406 4747 scope.go:117] "RemoveContainer" containerID="9a5c7a4fabdad82fd526fb8107cf577158c3b645ed2012f7a6daab993bf15a05" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.487661 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9hwh\" (UniqueName: \"kubernetes.io/projected/6916b8f7-3fe5-4412-b5b6-ecd63414667e-kube-api-access-q9hwh\") pod \"6916b8f7-3fe5-4412-b5b6-ecd63414667e\" (UID: \"6916b8f7-3fe5-4412-b5b6-ecd63414667e\") " Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.488889 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6916b8f7-3fe5-4412-b5b6-ecd63414667e-catalog-content\") pod \"6916b8f7-3fe5-4412-b5b6-ecd63414667e\" (UID: \"6916b8f7-3fe5-4412-b5b6-ecd63414667e\") " Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.489004 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6916b8f7-3fe5-4412-b5b6-ecd63414667e-utilities\") pod \"6916b8f7-3fe5-4412-b5b6-ecd63414667e\" (UID: \"6916b8f7-3fe5-4412-b5b6-ecd63414667e\") " Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.491659 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6916b8f7-3fe5-4412-b5b6-ecd63414667e-utilities" (OuterVolumeSpecName: "utilities") pod "6916b8f7-3fe5-4412-b5b6-ecd63414667e" (UID: "6916b8f7-3fe5-4412-b5b6-ecd63414667e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.493835 4747 scope.go:117] "RemoveContainer" containerID="b932db5fd95d33992302081cd2550392a2820eb14dbd4c29feb864bc4bb41a1b" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.495683 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6916b8f7-3fe5-4412-b5b6-ecd63414667e-kube-api-access-q9hwh" (OuterVolumeSpecName: "kube-api-access-q9hwh") pod "6916b8f7-3fe5-4412-b5b6-ecd63414667e" (UID: "6916b8f7-3fe5-4412-b5b6-ecd63414667e"). InnerVolumeSpecName "kube-api-access-q9hwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.547280 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6916b8f7-3fe5-4412-b5b6-ecd63414667e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6916b8f7-3fe5-4412-b5b6-ecd63414667e" (UID: "6916b8f7-3fe5-4412-b5b6-ecd63414667e"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.564316 4747 scope.go:117] "RemoveContainer" containerID="ca985d2e97da48c9311386c286be882ac55946e61e7d8ee7ddeea371369af300" Dec 05 22:01:39 crc kubenswrapper[4747]: E1205 22:01:39.564826 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca985d2e97da48c9311386c286be882ac55946e61e7d8ee7ddeea371369af300\": container with ID starting with ca985d2e97da48c9311386c286be882ac55946e61e7d8ee7ddeea371369af300 not found: ID does not exist" containerID="ca985d2e97da48c9311386c286be882ac55946e61e7d8ee7ddeea371369af300" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.564887 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca985d2e97da48c9311386c286be882ac55946e61e7d8ee7ddeea371369af300"} err="failed to get container status \"ca985d2e97da48c9311386c286be882ac55946e61e7d8ee7ddeea371369af300\": rpc error: code = NotFound desc = could not find container \"ca985d2e97da48c9311386c286be882ac55946e61e7d8ee7ddeea371369af300\": container with ID starting with ca985d2e97da48c9311386c286be882ac55946e61e7d8ee7ddeea371369af300 not found: ID does not exist" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.564920 4747 scope.go:117] "RemoveContainer" containerID="9a5c7a4fabdad82fd526fb8107cf577158c3b645ed2012f7a6daab993bf15a05" Dec 05 22:01:39 crc kubenswrapper[4747]: E1205 22:01:39.565318 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a5c7a4fabdad82fd526fb8107cf577158c3b645ed2012f7a6daab993bf15a05\": container with ID starting with 9a5c7a4fabdad82fd526fb8107cf577158c3b645ed2012f7a6daab993bf15a05 not found: ID does not exist" containerID="9a5c7a4fabdad82fd526fb8107cf577158c3b645ed2012f7a6daab993bf15a05" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.565346 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a5c7a4fabdad82fd526fb8107cf577158c3b645ed2012f7a6daab993bf15a05"} err="failed to get container status \"9a5c7a4fabdad82fd526fb8107cf577158c3b645ed2012f7a6daab993bf15a05\": rpc error: code = NotFound desc = could not find container \"9a5c7a4fabdad82fd526fb8107cf577158c3b645ed2012f7a6daab993bf15a05\": container with ID starting with 9a5c7a4fabdad82fd526fb8107cf577158c3b645ed2012f7a6daab993bf15a05 not found: ID does not exist" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.565366 4747 scope.go:117] "RemoveContainer" containerID="b932db5fd95d33992302081cd2550392a2820eb14dbd4c29feb864bc4bb41a1b" Dec 05 22:01:39 crc kubenswrapper[4747]: E1205 22:01:39.565607 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b932db5fd95d33992302081cd2550392a2820eb14dbd4c29feb864bc4bb41a1b\": container with ID starting with b932db5fd95d33992302081cd2550392a2820eb14dbd4c29feb864bc4bb41a1b not found: ID does not exist" containerID="b932db5fd95d33992302081cd2550392a2820eb14dbd4c29feb864bc4bb41a1b" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.565627 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b932db5fd95d33992302081cd2550392a2820eb14dbd4c29feb864bc4bb41a1b"} err="failed to get container status \"b932db5fd95d33992302081cd2550392a2820eb14dbd4c29feb864bc4bb41a1b\": rpc error: code = NotFound desc = could not 
find container \"b932db5fd95d33992302081cd2550392a2820eb14dbd4c29feb864bc4bb41a1b\": container with ID starting with b932db5fd95d33992302081cd2550392a2820eb14dbd4c29feb864bc4bb41a1b not found: ID does not exist" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.590033 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9hwh\" (UniqueName: \"kubernetes.io/projected/6916b8f7-3fe5-4412-b5b6-ecd63414667e-kube-api-access-q9hwh\") on node \"crc\" DevicePath \"\"" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.590067 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6916b8f7-3fe5-4412-b5b6-ecd63414667e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.590080 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6916b8f7-3fe5-4412-b5b6-ecd63414667e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.795975 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2rft9"] Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.802959 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2rft9"] Dec 05 22:01:39 crc kubenswrapper[4747]: I1205 22:01:39.855832 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6916b8f7-3fe5-4412-b5b6-ecd63414667e" path="/var/lib/kubelet/pods/6916b8f7-3fe5-4412-b5b6-ecd63414667e/volumes" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.299181 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-jl7s4"] Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.309687 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-jl7s4"] Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.423194 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-9cc92"] Dec 05 22:02:01 crc kubenswrapper[4747]: E1205 22:02:01.423745 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6916b8f7-3fe5-4412-b5b6-ecd63414667e" containerName="registry-server" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.423778 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6916b8f7-3fe5-4412-b5b6-ecd63414667e" containerName="registry-server" Dec 05 22:02:01 crc kubenswrapper[4747]: E1205 22:02:01.423803 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6916b8f7-3fe5-4412-b5b6-ecd63414667e" containerName="extract-utilities" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.423820 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6916b8f7-3fe5-4412-b5b6-ecd63414667e" containerName="extract-utilities" Dec 05 22:02:01 crc kubenswrapper[4747]: E1205 22:02:01.423872 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6916b8f7-3fe5-4412-b5b6-ecd63414667e" containerName="extract-content" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.423885 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6916b8f7-3fe5-4412-b5b6-ecd63414667e" containerName="extract-content" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.424159 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6916b8f7-3fe5-4412-b5b6-ecd63414667e" containerName="registry-server" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.425002 
4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9cc92" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.428430 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.428437 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.429901 4747 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-9vgs7" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.432145 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.442131 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9cc92"] Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.443925 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9a778915-ad9a-44af-b2b9-bc4b496f5aac-crc-storage\") pod \"crc-storage-crc-9cc92\" (UID: \"9a778915-ad9a-44af-b2b9-bc4b496f5aac\") " pod="crc-storage/crc-storage-crc-9cc92" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.443987 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcjzq\" (UniqueName: \"kubernetes.io/projected/9a778915-ad9a-44af-b2b9-bc4b496f5aac-kube-api-access-wcjzq\") pod \"crc-storage-crc-9cc92\" (UID: \"9a778915-ad9a-44af-b2b9-bc4b496f5aac\") " pod="crc-storage/crc-storage-crc-9cc92" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.444136 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9a778915-ad9a-44af-b2b9-bc4b496f5aac-node-mnt\") pod \"crc-storage-crc-9cc92\" (UID: \"9a778915-ad9a-44af-b2b9-bc4b496f5aac\") " pod="crc-storage/crc-storage-crc-9cc92" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.544926 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9a778915-ad9a-44af-b2b9-bc4b496f5aac-node-mnt\") pod \"crc-storage-crc-9cc92\" (UID: \"9a778915-ad9a-44af-b2b9-bc4b496f5aac\") " pod="crc-storage/crc-storage-crc-9cc92" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.545026 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9a778915-ad9a-44af-b2b9-bc4b496f5aac-crc-storage\") pod \"crc-storage-crc-9cc92\" (UID: \"9a778915-ad9a-44af-b2b9-bc4b496f5aac\") " pod="crc-storage/crc-storage-crc-9cc92" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.545052 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcjzq\" (UniqueName: \"kubernetes.io/projected/9a778915-ad9a-44af-b2b9-bc4b496f5aac-kube-api-access-wcjzq\") pod \"crc-storage-crc-9cc92\" (UID: \"9a778915-ad9a-44af-b2b9-bc4b496f5aac\") " pod="crc-storage/crc-storage-crc-9cc92" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.545385 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9a778915-ad9a-44af-b2b9-bc4b496f5aac-node-mnt\") pod \"crc-storage-crc-9cc92\" (UID: 
\"9a778915-ad9a-44af-b2b9-bc4b496f5aac\") " pod="crc-storage/crc-storage-crc-9cc92" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.546162 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9a778915-ad9a-44af-b2b9-bc4b496f5aac-crc-storage\") pod \"crc-storage-crc-9cc92\" (UID: \"9a778915-ad9a-44af-b2b9-bc4b496f5aac\") " pod="crc-storage/crc-storage-crc-9cc92" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.569219 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcjzq\" (UniqueName: \"kubernetes.io/projected/9a778915-ad9a-44af-b2b9-bc4b496f5aac-kube-api-access-wcjzq\") pod \"crc-storage-crc-9cc92\" (UID: \"9a778915-ad9a-44af-b2b9-bc4b496f5aac\") " pod="crc-storage/crc-storage-crc-9cc92" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.780883 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9cc92" Dec 05 22:02:01 crc kubenswrapper[4747]: I1205 22:02:01.866036 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd295402-3065-434f-893c-cea3109f6bad" path="/var/lib/kubelet/pods/bd295402-3065-434f-893c-cea3109f6bad/volumes" Dec 05 22:02:02 crc kubenswrapper[4747]: I1205 22:02:02.277109 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9cc92"] Dec 05 22:02:02 crc kubenswrapper[4747]: I1205 22:02:02.659005 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9cc92" event={"ID":"9a778915-ad9a-44af-b2b9-bc4b496f5aac","Type":"ContainerStarted","Data":"fc406b23fee141b1b96f3671fb1ada09b9e45f2f9961dba5a054c6392bb0643d"} Dec 05 22:02:03 crc kubenswrapper[4747]: I1205 22:02:03.668540 4747 generic.go:334] "Generic (PLEG): container finished" podID="9a778915-ad9a-44af-b2b9-bc4b496f5aac" containerID="7f3bc66e03b310691bf30ab718ddb49a687d78adcffda825bb1653b8ac7c51fe" exitCode=0 Dec 05 22:02:03 crc kubenswrapper[4747]: I1205 22:02:03.668677 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9cc92" event={"ID":"9a778915-ad9a-44af-b2b9-bc4b496f5aac","Type":"ContainerDied","Data":"7f3bc66e03b310691bf30ab718ddb49a687d78adcffda825bb1653b8ac7c51fe"} Dec 05 22:02:05 crc kubenswrapper[4747]: I1205 22:02:05.059129 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9cc92" Dec 05 22:02:05 crc kubenswrapper[4747]: I1205 22:02:05.201248 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9a778915-ad9a-44af-b2b9-bc4b496f5aac-node-mnt\") pod \"9a778915-ad9a-44af-b2b9-bc4b496f5aac\" (UID: \"9a778915-ad9a-44af-b2b9-bc4b496f5aac\") " Dec 05 22:02:05 crc kubenswrapper[4747]: I1205 22:02:05.201403 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9a778915-ad9a-44af-b2b9-bc4b496f5aac-crc-storage\") pod \"9a778915-ad9a-44af-b2b9-bc4b496f5aac\" (UID: \"9a778915-ad9a-44af-b2b9-bc4b496f5aac\") " Dec 05 22:02:05 crc kubenswrapper[4747]: I1205 22:02:05.201404 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a778915-ad9a-44af-b2b9-bc4b496f5aac-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "9a778915-ad9a-44af-b2b9-bc4b496f5aac" (UID: "9a778915-ad9a-44af-b2b9-bc4b496f5aac"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 22:02:05 crc kubenswrapper[4747]: I1205 22:02:05.201442 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcjzq\" (UniqueName: \"kubernetes.io/projected/9a778915-ad9a-44af-b2b9-bc4b496f5aac-kube-api-access-wcjzq\") pod \"9a778915-ad9a-44af-b2b9-bc4b496f5aac\" (UID: \"9a778915-ad9a-44af-b2b9-bc4b496f5aac\") " Dec 05 22:02:05 crc kubenswrapper[4747]: I1205 22:02:05.202211 4747 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9a778915-ad9a-44af-b2b9-bc4b496f5aac-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 05 22:02:05 crc kubenswrapper[4747]: I1205 22:02:05.207209 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a778915-ad9a-44af-b2b9-bc4b496f5aac-kube-api-access-wcjzq" (OuterVolumeSpecName: "kube-api-access-wcjzq") pod "9a778915-ad9a-44af-b2b9-bc4b496f5aac" (UID: "9a778915-ad9a-44af-b2b9-bc4b496f5aac"). InnerVolumeSpecName "kube-api-access-wcjzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:02:05 crc kubenswrapper[4747]: I1205 22:02:05.232090 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a778915-ad9a-44af-b2b9-bc4b496f5aac-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "9a778915-ad9a-44af-b2b9-bc4b496f5aac" (UID: "9a778915-ad9a-44af-b2b9-bc4b496f5aac"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:02:05 crc kubenswrapper[4747]: I1205 22:02:05.303253 4747 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9a778915-ad9a-44af-b2b9-bc4b496f5aac-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 05 22:02:05 crc kubenswrapper[4747]: I1205 22:02:05.303291 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcjzq\" (UniqueName: \"kubernetes.io/projected/9a778915-ad9a-44af-b2b9-bc4b496f5aac-kube-api-access-wcjzq\") on node \"crc\" DevicePath \"\"" Dec 05 22:02:05 crc kubenswrapper[4747]: I1205 22:02:05.698807 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9cc92" event={"ID":"9a778915-ad9a-44af-b2b9-bc4b496f5aac","Type":"ContainerDied","Data":"fc406b23fee141b1b96f3671fb1ada09b9e45f2f9961dba5a054c6392bb0643d"} Dec 05 22:02:05 crc kubenswrapper[4747]: I1205 22:02:05.698899 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc406b23fee141b1b96f3671fb1ada09b9e45f2f9961dba5a054c6392bb0643d" Dec 05 22:02:05 crc kubenswrapper[4747]: I1205 22:02:05.699040 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9cc92" Dec 05 22:02:06 crc kubenswrapper[4747]: I1205 22:02:06.221871 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:02:06 crc kubenswrapper[4747]: I1205 22:02:06.221970 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.243982 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-9cc92"] Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.253820 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-9cc92"] Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.395218 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-f26xp"] Dec 05 22:02:07 crc kubenswrapper[4747]: E1205 22:02:07.396211 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a778915-ad9a-44af-b2b9-bc4b496f5aac" containerName="storage" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.396251 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a778915-ad9a-44af-b2b9-bc4b496f5aac" containerName="storage" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.396520 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a778915-ad9a-44af-b2b9-bc4b496f5aac" containerName="storage" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.397509 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-f26xp" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.403955 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.404286 4747 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-9vgs7" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.404542 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.405393 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.413435 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-f26xp"] Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.542193 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-crc-storage\") pod \"crc-storage-crc-f26xp\" (UID: \"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960\") " pod="crc-storage/crc-storage-crc-f26xp" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.542551 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-node-mnt\") pod \"crc-storage-crc-f26xp\" (UID: \"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960\") " pod="crc-storage/crc-storage-crc-f26xp" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.542782 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckkg4\" (UniqueName: \"kubernetes.io/projected/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-kube-api-access-ckkg4\") pod \"crc-storage-crc-f26xp\" (UID: \"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960\") " pod="crc-storage/crc-storage-crc-f26xp" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.643976 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckkg4\" (UniqueName: \"kubernetes.io/projected/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-kube-api-access-ckkg4\") pod \"crc-storage-crc-f26xp\" (UID: \"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960\") " pod="crc-storage/crc-storage-crc-f26xp" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.644107 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-crc-storage\") pod \"crc-storage-crc-f26xp\" (UID: \"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960\") " pod="crc-storage/crc-storage-crc-f26xp" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.644330 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-node-mnt\") pod \"crc-storage-crc-f26xp\" (UID: \"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960\") " pod="crc-storage/crc-storage-crc-f26xp" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.644822 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-node-mnt\") pod \"crc-storage-crc-f26xp\" (UID: \"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960\") " 
pod="crc-storage/crc-storage-crc-f26xp" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.645503 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-crc-storage\") pod \"crc-storage-crc-f26xp\" (UID: \"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960\") " pod="crc-storage/crc-storage-crc-f26xp" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.678725 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckkg4\" (UniqueName: \"kubernetes.io/projected/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-kube-api-access-ckkg4\") pod \"crc-storage-crc-f26xp\" (UID: \"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960\") " pod="crc-storage/crc-storage-crc-f26xp" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.732452 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-f26xp" Dec 05 22:02:07 crc kubenswrapper[4747]: I1205 22:02:07.857539 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a778915-ad9a-44af-b2b9-bc4b496f5aac" path="/var/lib/kubelet/pods/9a778915-ad9a-44af-b2b9-bc4b496f5aac/volumes" Dec 05 22:02:08 crc kubenswrapper[4747]: I1205 22:02:08.225011 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-f26xp"] Dec 05 22:02:08 crc kubenswrapper[4747]: I1205 22:02:08.726459 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-f26xp" event={"ID":"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960","Type":"ContainerStarted","Data":"7cc54abce5baa8a280f34cefc576095790abdb3ef8ac402c50ba4963ddeee937"} Dec 05 22:02:09 crc kubenswrapper[4747]: I1205 22:02:09.736288 4747 generic.go:334] "Generic (PLEG): container finished" podID="a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960" containerID="5eeff2893797d755592ed69e78711b735e7711ce85b6c96c8e7936eeb8fbb6cd" exitCode=0 Dec 05 22:02:09 crc kubenswrapper[4747]: I1205 22:02:09.736532 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-f26xp" event={"ID":"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960","Type":"ContainerDied","Data":"5eeff2893797d755592ed69e78711b735e7711ce85b6c96c8e7936eeb8fbb6cd"} Dec 05 22:02:11 crc kubenswrapper[4747]: I1205 22:02:11.082643 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-f26xp" Dec 05 22:02:11 crc kubenswrapper[4747]: I1205 22:02:11.200260 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-node-mnt\") pod \"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960\" (UID: \"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960\") " Dec 05 22:02:11 crc kubenswrapper[4747]: I1205 22:02:11.200374 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-crc-storage\") pod \"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960\" (UID: \"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960\") " Dec 05 22:02:11 crc kubenswrapper[4747]: I1205 22:02:11.200466 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckkg4\" (UniqueName: \"kubernetes.io/projected/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-kube-api-access-ckkg4\") pod \"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960\" (UID: \"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960\") " Dec 05 22:02:11 crc kubenswrapper[4747]: I1205 22:02:11.201063 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960" (UID: "a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 22:02:11 crc kubenswrapper[4747]: I1205 22:02:11.301774 4747 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 05 22:02:11 crc kubenswrapper[4747]: I1205 22:02:11.585061 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-kube-api-access-ckkg4" (OuterVolumeSpecName: "kube-api-access-ckkg4") pod "a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960" (UID: "a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960"). InnerVolumeSpecName "kube-api-access-ckkg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:02:11 crc kubenswrapper[4747]: I1205 22:02:11.606425 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckkg4\" (UniqueName: \"kubernetes.io/projected/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-kube-api-access-ckkg4\") on node \"crc\" DevicePath \"\"" Dec 05 22:02:11 crc kubenswrapper[4747]: I1205 22:02:11.614995 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960" (UID: "a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:02:11 crc kubenswrapper[4747]: I1205 22:02:11.708128 4747 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 05 22:02:11 crc kubenswrapper[4747]: I1205 22:02:11.755358 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-f26xp" event={"ID":"a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960","Type":"ContainerDied","Data":"7cc54abce5baa8a280f34cefc576095790abdb3ef8ac402c50ba4963ddeee937"} Dec 05 22:02:11 crc kubenswrapper[4747]: I1205 22:02:11.755696 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cc54abce5baa8a280f34cefc576095790abdb3ef8ac402c50ba4963ddeee937" Dec 05 22:02:11 crc kubenswrapper[4747]: I1205 22:02:11.755434 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-f26xp" Dec 05 22:02:28 crc kubenswrapper[4747]: I1205 22:02:28.476769 4747 scope.go:117] "RemoveContainer" containerID="74c4da903deb0ea2982269fd4168103a3fe73acf94a8c74cffffde739a964b9b" Dec 05 22:02:36 crc kubenswrapper[4747]: I1205 22:02:36.221880 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:02:36 crc kubenswrapper[4747]: I1205 22:02:36.222775 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:02:36 crc kubenswrapper[4747]: I1205 22:02:36.223040 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 22:02:36 crc kubenswrapper[4747]: I1205 22:02:36.224017 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1919542fc50eb5e519160ffa50e5548bea4940791a48a8736c96f1ec5c6138da"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 22:02:36 crc kubenswrapper[4747]: I1205 22:02:36.224149 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://1919542fc50eb5e519160ffa50e5548bea4940791a48a8736c96f1ec5c6138da" gracePeriod=600 Dec 05 22:02:36 crc kubenswrapper[4747]: I1205 22:02:36.990506 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="1919542fc50eb5e519160ffa50e5548bea4940791a48a8736c96f1ec5c6138da" exitCode=0 Dec 05 22:02:36 crc kubenswrapper[4747]: I1205 22:02:36.990638 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" 
event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"1919542fc50eb5e519160ffa50e5548bea4940791a48a8736c96f1ec5c6138da"} Dec 05 22:02:36 crc kubenswrapper[4747]: I1205 22:02:36.990940 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804"} Dec 05 22:02:36 crc kubenswrapper[4747]: I1205 22:02:36.990972 4747 scope.go:117] "RemoveContainer" containerID="d72ac19eb56482f7dc3dcad2c5ac6b96b1fab11dd07b3e39f184ba97eb51a150" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.702120 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f5d88f885-8jvt5"] Dec 05 22:04:10 crc kubenswrapper[4747]: E1205 22:04:10.702929 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960" containerName="storage" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.702943 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960" containerName="storage" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.703085 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a17dd9a9-78d3-4ed2-bf84-36bd3fcd4960" containerName="storage" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.703813 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.705951 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.706089 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.706133 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.709235 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-6ljv5" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.715375 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f5d88f885-8jvt5"] Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.719919 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-599f5d6f75-n7279"] Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.721051 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-599f5d6f75-n7279" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.726256 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.744820 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-599f5d6f75-n7279"] Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.862646 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt88f\" (UniqueName: \"kubernetes.io/projected/7ce5c33f-5c2e-43ce-b062-5914a9cd134b-kube-api-access-gt88f\") pod \"dnsmasq-dns-7f5d88f885-8jvt5\" (UID: \"7ce5c33f-5c2e-43ce-b062-5914a9cd134b\") " pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.862700 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-dns-svc\") pod \"dnsmasq-dns-599f5d6f75-n7279\" (UID: \"b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5\") " pod="openstack/dnsmasq-dns-599f5d6f75-n7279" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.862733 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-config\") pod \"dnsmasq-dns-599f5d6f75-n7279\" (UID: \"b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5\") " pod="openstack/dnsmasq-dns-599f5d6f75-n7279" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.862829 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhbzx\" (UniqueName: \"kubernetes.io/projected/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-kube-api-access-fhbzx\") pod \"dnsmasq-dns-599f5d6f75-n7279\" (UID: \"b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5\") " pod="openstack/dnsmasq-dns-599f5d6f75-n7279" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.862876 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce5c33f-5c2e-43ce-b062-5914a9cd134b-config\") pod \"dnsmasq-dns-7f5d88f885-8jvt5\" (UID: \"7ce5c33f-5c2e-43ce-b062-5914a9cd134b\") " pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.896751 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-599f5d6f75-n7279"] Dec 05 22:04:10 crc kubenswrapper[4747]: E1205 22:04:10.897311 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-fhbzx], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-599f5d6f75-n7279" podUID="b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.920816 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77f9bd84cc-k24gw"] Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.921993 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.938191 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77f9bd84cc-k24gw"] Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.963545 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhbzx\" (UniqueName: \"kubernetes.io/projected/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-kube-api-access-fhbzx\") pod \"dnsmasq-dns-599f5d6f75-n7279\" (UID: \"b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5\") " pod="openstack/dnsmasq-dns-599f5d6f75-n7279" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.963594 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce5c33f-5c2e-43ce-b062-5914a9cd134b-config\") pod \"dnsmasq-dns-7f5d88f885-8jvt5\" (UID: \"7ce5c33f-5c2e-43ce-b062-5914a9cd134b\") " pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.963774 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt88f\" (UniqueName: \"kubernetes.io/projected/7ce5c33f-5c2e-43ce-b062-5914a9cd134b-kube-api-access-gt88f\") pod \"dnsmasq-dns-7f5d88f885-8jvt5\" (UID: \"7ce5c33f-5c2e-43ce-b062-5914a9cd134b\") " pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.963807 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-dns-svc\") pod \"dnsmasq-dns-599f5d6f75-n7279\" (UID: \"b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5\") " pod="openstack/dnsmasq-dns-599f5d6f75-n7279" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.963841 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-config\") pod \"dnsmasq-dns-599f5d6f75-n7279\" (UID: \"b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5\") " pod="openstack/dnsmasq-dns-599f5d6f75-n7279" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.964770 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-config\") pod \"dnsmasq-dns-599f5d6f75-n7279\" (UID: \"b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5\") " pod="openstack/dnsmasq-dns-599f5d6f75-n7279" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.964798 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-dns-svc\") pod \"dnsmasq-dns-599f5d6f75-n7279\" (UID: \"b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5\") " pod="openstack/dnsmasq-dns-599f5d6f75-n7279" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.964877 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce5c33f-5c2e-43ce-b062-5914a9cd134b-config\") pod \"dnsmasq-dns-7f5d88f885-8jvt5\" (UID: \"7ce5c33f-5c2e-43ce-b062-5914a9cd134b\") " pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.988683 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt88f\" (UniqueName: \"kubernetes.io/projected/7ce5c33f-5c2e-43ce-b062-5914a9cd134b-kube-api-access-gt88f\") pod 
\"dnsmasq-dns-7f5d88f885-8jvt5\" (UID: \"7ce5c33f-5c2e-43ce-b062-5914a9cd134b\") " pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" Dec 05 22:04:10 crc kubenswrapper[4747]: I1205 22:04:10.991248 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhbzx\" (UniqueName: \"kubernetes.io/projected/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-kube-api-access-fhbzx\") pod \"dnsmasq-dns-599f5d6f75-n7279\" (UID: \"b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5\") " pod="openstack/dnsmasq-dns-599f5d6f75-n7279" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.019766 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.065567 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c6e3199-d622-45a9-b9be-e096304f29ca-config\") pod \"dnsmasq-dns-77f9bd84cc-k24gw\" (UID: \"6c6e3199-d622-45a9-b9be-e096304f29ca\") " pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.065670 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c6e3199-d622-45a9-b9be-e096304f29ca-dns-svc\") pod \"dnsmasq-dns-77f9bd84cc-k24gw\" (UID: \"6c6e3199-d622-45a9-b9be-e096304f29ca\") " pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.065726 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5gpj\" (UniqueName: \"kubernetes.io/projected/6c6e3199-d622-45a9-b9be-e096304f29ca-kube-api-access-j5gpj\") pod \"dnsmasq-dns-77f9bd84cc-k24gw\" (UID: \"6c6e3199-d622-45a9-b9be-e096304f29ca\") " pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.167195 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c6e3199-d622-45a9-b9be-e096304f29ca-config\") pod \"dnsmasq-dns-77f9bd84cc-k24gw\" (UID: \"6c6e3199-d622-45a9-b9be-e096304f29ca\") " pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.167256 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c6e3199-d622-45a9-b9be-e096304f29ca-dns-svc\") pod \"dnsmasq-dns-77f9bd84cc-k24gw\" (UID: \"6c6e3199-d622-45a9-b9be-e096304f29ca\") " pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.167295 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5gpj\" (UniqueName: \"kubernetes.io/projected/6c6e3199-d622-45a9-b9be-e096304f29ca-kube-api-access-j5gpj\") pod \"dnsmasq-dns-77f9bd84cc-k24gw\" (UID: \"6c6e3199-d622-45a9-b9be-e096304f29ca\") " pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.168104 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c6e3199-d622-45a9-b9be-e096304f29ca-dns-svc\") pod \"dnsmasq-dns-77f9bd84cc-k24gw\" (UID: \"6c6e3199-d622-45a9-b9be-e096304f29ca\") " pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.168104 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c6e3199-d622-45a9-b9be-e096304f29ca-config\") pod \"dnsmasq-dns-77f9bd84cc-k24gw\" (UID: \"6c6e3199-d622-45a9-b9be-e096304f29ca\") " pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.187732 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5gpj\" (UniqueName: \"kubernetes.io/projected/6c6e3199-d622-45a9-b9be-e096304f29ca-kube-api-access-j5gpj\") pod \"dnsmasq-dns-77f9bd84cc-k24gw\" (UID: \"6c6e3199-d622-45a9-b9be-e096304f29ca\") " pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.246888 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.311483 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f9bd84cc-k24gw"] Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.363828 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-nrcbk"] Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.364951 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.368530 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-nrcbk"] Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.473053 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a80203e-563e-46d6-a322-df8bdf64d509-config\") pod \"dnsmasq-dns-7cbb4f659c-nrcbk\" (UID: \"4a80203e-563e-46d6-a322-df8bdf64d509\") " pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.473271 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a80203e-563e-46d6-a322-df8bdf64d509-dns-svc\") pod \"dnsmasq-dns-7cbb4f659c-nrcbk\" (UID: \"4a80203e-563e-46d6-a322-df8bdf64d509\") " pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.473362 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvpjt\" (UniqueName: \"kubernetes.io/projected/4a80203e-563e-46d6-a322-df8bdf64d509-kube-api-access-dvpjt\") pod \"dnsmasq-dns-7cbb4f659c-nrcbk\" (UID: \"4a80203e-563e-46d6-a322-df8bdf64d509\") " pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.574467 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a80203e-563e-46d6-a322-df8bdf64d509-dns-svc\") pod \"dnsmasq-dns-7cbb4f659c-nrcbk\" (UID: \"4a80203e-563e-46d6-a322-df8bdf64d509\") " pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.574534 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvpjt\" (UniqueName: \"kubernetes.io/projected/4a80203e-563e-46d6-a322-df8bdf64d509-kube-api-access-dvpjt\") pod \"dnsmasq-dns-7cbb4f659c-nrcbk\" (UID: \"4a80203e-563e-46d6-a322-df8bdf64d509\") " pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.574722 
4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a80203e-563e-46d6-a322-df8bdf64d509-config\") pod \"dnsmasq-dns-7cbb4f659c-nrcbk\" (UID: \"4a80203e-563e-46d6-a322-df8bdf64d509\") " pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.575627 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a80203e-563e-46d6-a322-df8bdf64d509-config\") pod \"dnsmasq-dns-7cbb4f659c-nrcbk\" (UID: \"4a80203e-563e-46d6-a322-df8bdf64d509\") " pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.575677 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a80203e-563e-46d6-a322-df8bdf64d509-dns-svc\") pod \"dnsmasq-dns-7cbb4f659c-nrcbk\" (UID: \"4a80203e-563e-46d6-a322-df8bdf64d509\") " pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.602604 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvpjt\" (UniqueName: \"kubernetes.io/projected/4a80203e-563e-46d6-a322-df8bdf64d509-kube-api-access-dvpjt\") pod \"dnsmasq-dns-7cbb4f659c-nrcbk\" (UID: \"4a80203e-563e-46d6-a322-df8bdf64d509\") " pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.694684 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.740844 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f5d88f885-8jvt5"] Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.850848 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599f5d6f75-n7279" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.904725 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" event={"ID":"7ce5c33f-5c2e-43ce-b062-5914a9cd134b","Type":"ContainerStarted","Data":"96cb61c0e41a49beffe2784b68f550162193fe6f496c48178091a6906195b062"} Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.914242 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599f5d6f75-n7279" Dec 05 22:04:11 crc kubenswrapper[4747]: I1205 22:04:11.952376 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f9bd84cc-k24gw"] Dec 05 22:04:12 crc kubenswrapper[4747]: W1205 22:04:12.027283 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c6e3199_d622_45a9_b9be_e096304f29ca.slice/crio-01e57743e772bd3c8e46f28e22e6102b1c1383c15fb897eafc385b61f355ca3a WatchSource:0}: Error finding container 01e57743e772bd3c8e46f28e22e6102b1c1383c15fb897eafc385b61f355ca3a: Status 404 returned error can't find the container with id 01e57743e772bd3c8e46f28e22e6102b1c1383c15fb897eafc385b61f355ca3a Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.075534 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.077668 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.082160 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.082242 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.082374 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.082374 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.082445 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f444s" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.082524 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.086667 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.102677 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.106865 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-config\") pod \"b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5\" (UID: \"b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5\") " Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.107030 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-dns-svc\") pod \"b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5\" (UID: \"b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5\") " Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.107058 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhbzx\" (UniqueName: \"kubernetes.io/projected/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-kube-api-access-fhbzx\") pod \"b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5\" (UID: \"b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5\") " Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.108167 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-config" (OuterVolumeSpecName: "config") pod "b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5" (UID: "b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.108470 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5" (UID: "b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.111238 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-kube-api-access-fhbzx" (OuterVolumeSpecName: "kube-api-access-fhbzx") pod "b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5" (UID: "b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5"). InnerVolumeSpecName "kube-api-access-fhbzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.154360 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-nrcbk"] Dec 05 22:04:12 crc kubenswrapper[4747]: W1205 22:04:12.154994 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a80203e_563e_46d6_a322_df8bdf64d509.slice/crio-5d4fccd961997f5fbaae3925ade8c16ea8a80fe6ac62ff17135d8433f0aaf4c3 WatchSource:0}: Error finding container 5d4fccd961997f5fbaae3925ade8c16ea8a80fe6ac62ff17135d8433f0aaf4c3: Status 404 returned error can't find the container with id 5d4fccd961997f5fbaae3925ade8c16ea8a80fe6ac62ff17135d8433f0aaf4c3 Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.208334 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.208412 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.208445 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.208501 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5zmf\" (UniqueName: \"kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-kube-api-access-l5zmf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.208547 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.208618 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.208687 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.208722 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.208802 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.208848 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.208916 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.209132 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.209147 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhbzx\" (UniqueName: \"kubernetes.io/projected/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-kube-api-access-fhbzx\") on node \"crc\" DevicePath \"\"" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.209160 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.310268 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.310318 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.310346 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.310371 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.310395 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.310415 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.310439 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5zmf\" (UniqueName: \"kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-kube-api-access-l5zmf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.310457 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.310488 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.310520 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.310543 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.311313 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.311556 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.311834 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.312225 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.312569 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.315184 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.315203 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.315184 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.315894 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.316660 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.316696 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e16f1934d34a5a3a7d98064185faecfe11b1c68091895ad6c06ceaca331876de/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.332605 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5zmf\" (UniqueName: \"kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-kube-api-access-l5zmf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.346351 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.432484 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.433660 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.435656 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.435675 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.437094 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.438001 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.439798 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.439918 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9tfbg" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.441217 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.450902 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.500118 4747 util.go:30] "No sandbox for pod can be found. 
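The reflector.go "Caches populated" entries show kubelet priming client-go informer caches for each Secret and ConfigMap the newly admitted pod references. A hedged sketch of the same client-go pattern, assuming an in-cluster config; the namespace and object names are the ones visible in the log, everything else is an assumption:

```go
package main

import (
	"fmt"

	v1 "k8s.io/api/core/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
	"k8s.io/client-go/tools/cache"
)

func main() {
	cfg, err := rest.InClusterConfig() // assumption: running inside the cluster
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// Watch ConfigMaps in the "openstack" namespace, roughly as the
	// kubelet's reflector does for objects a pod mounts.
	factory := informers.NewSharedInformerFactoryWithOptions(
		client, 0, informers.WithNamespace("openstack"))
	inf := factory.Core().V1().ConfigMaps().Informer()

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)
	if !cache.WaitForCacheSync(stop, inf.HasSynced) {
		panic("cache never synced")
	}
	for _, obj := range inf.GetStore().List() {
		cm := obj.(*v1.ConfigMap)
		fmt.Println("cache populated for ConfigMap:", cm.Name) // e.g. rabbitmq-config-data
	}
}
```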
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.614762 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.614796 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cbe84605-dd30-4d36-80d8-c9a11f25c186-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.614837 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.614860 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtw55\" (UniqueName: \"kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-kube-api-access-wtw55\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.614875 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cbe84605-dd30-4d36-80d8-c9a11f25c186-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.614937 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.614960 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.614989 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.615007 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " 
pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.615047 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.615070 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-config-data\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.716024 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.716074 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-config-data\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.716123 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.716142 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cbe84605-dd30-4d36-80d8-c9a11f25c186-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.716162 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.716927 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.717266 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-config-data\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.718014 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wtw55\" (UniqueName: \"kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-kube-api-access-wtw55\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.718073 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cbe84605-dd30-4d36-80d8-c9a11f25c186-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.718148 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.718191 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.718261 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.718283 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.718692 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.718801 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.718968 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.721986 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cbe84605-dd30-4d36-80d8-c9a11f25c186-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 
22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.722056 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cbe84605-dd30-4d36-80d8-c9a11f25c186-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.722282 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.722321 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a3474c9400e053623cf942402e2ec88ace496279d93ff7b8788f812481327808/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.723727 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.727190 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.733170 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtw55\" (UniqueName: \"kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-kube-api-access-wtw55\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.793551 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\") pod \"rabbitmq-server-0\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " pod="openstack/rabbitmq-server-0" Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.873965 4747 generic.go:334] "Generic (PLEG): container finished" podID="6c6e3199-d622-45a9-b9be-e096304f29ca" containerID="1e6e1a0f0f515ac43d90ef2ea3d52ef1377a45eb1feca3241e5fc8ac802d93b5" exitCode=0 Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.874237 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw" event={"ID":"6c6e3199-d622-45a9-b9be-e096304f29ca","Type":"ContainerDied","Data":"1e6e1a0f0f515ac43d90ef2ea3d52ef1377a45eb1feca3241e5fc8ac802d93b5"} Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.874275 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw" event={"ID":"6c6e3199-d622-45a9-b9be-e096304f29ca","Type":"ContainerStarted","Data":"01e57743e772bd3c8e46f28e22e6102b1c1383c15fb897eafc385b61f355ca3a"} Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.876949 
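The "SyncLoop (PLEG)" entries above serialize a pod lifecycle event as event={"ID":...,"Type":...,"Data":...}, and that payload happens to be valid JSON. A small sketch decoding one of the payloads from the log; the struct here is a stand-in with field names matching the printed keys, not kubelet's actual type:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Stand-in for a PLEG event; field names match the JSON keys in the log.
type podLifecycleEvent struct {
	ID   string `json:"ID"`   // pod UID
	Type string `json:"Type"` // e.g. ContainerDied, ContainerStarted
	Data string `json:"Data"` // container or sandbox ID
}

func main() {
	raw := `{"ID":"6c6e3199-d622-45a9-b9be-e096304f29ca","Type":"ContainerDied","Data":"1e6e1a0f0f515ac43d90ef2ea3d52ef1377a45eb1feca3241e5fc8ac802d93b5"}`
	var ev podLifecycleEvent
	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
		panic(err)
	}
	fmt.Printf("pod %s: %s (%s...)\n", ev.ID, ev.Type, ev.Data[:12])
}
```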
Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.877003 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" event={"ID":"7ce5c33f-5c2e-43ce-b062-5914a9cd134b","Type":"ContainerDied","Data":"3e00c6ff4dbb812540d77828f9e0237c95fe23c5f99e6d1bedd3a91f0924af51"}
Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.880418 4747 generic.go:334] "Generic (PLEG): container finished" podID="4a80203e-563e-46d6-a322-df8bdf64d509" containerID="4411755703fc1bef77883af0256cd5f00d37a707b66dafb2f63e2c0f70d1abf3" exitCode=0
Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.880516 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599f5d6f75-n7279"
Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.881394 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" event={"ID":"4a80203e-563e-46d6-a322-df8bdf64d509","Type":"ContainerDied","Data":"4411755703fc1bef77883af0256cd5f00d37a707b66dafb2f63e2c0f70d1abf3"}
Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.881481 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" event={"ID":"4a80203e-563e-46d6-a322-df8bdf64d509","Type":"ContainerStarted","Data":"5d4fccd961997f5fbaae3925ade8c16ea8a80fe6ac62ff17135d8433f0aaf4c3"}
Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.977432 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.983272 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-599f5d6f75-n7279"]
Dec 05 22:04:12 crc kubenswrapper[4747]: I1205 22:04:12.987759 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-599f5d6f75-n7279"]
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.049869 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.220988 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.329360 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c6e3199-d622-45a9-b9be-e096304f29ca-config\") pod \"6c6e3199-d622-45a9-b9be-e096304f29ca\" (UID: \"6c6e3199-d622-45a9-b9be-e096304f29ca\") "
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.329440 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c6e3199-d622-45a9-b9be-e096304f29ca-dns-svc\") pod \"6c6e3199-d622-45a9-b9be-e096304f29ca\" (UID: \"6c6e3199-d622-45a9-b9be-e096304f29ca\") "
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.329517 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5gpj\" (UniqueName: \"kubernetes.io/projected/6c6e3199-d622-45a9-b9be-e096304f29ca-kube-api-access-j5gpj\") pod \"6c6e3199-d622-45a9-b9be-e096304f29ca\" (UID: \"6c6e3199-d622-45a9-b9be-e096304f29ca\") "
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.334332 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c6e3199-d622-45a9-b9be-e096304f29ca-kube-api-access-j5gpj" (OuterVolumeSpecName: "kube-api-access-j5gpj") pod "6c6e3199-d622-45a9-b9be-e096304f29ca" (UID: "6c6e3199-d622-45a9-b9be-e096304f29ca"). InnerVolumeSpecName "kube-api-access-j5gpj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.356190 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c6e3199-d622-45a9-b9be-e096304f29ca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c6e3199-d622-45a9-b9be-e096304f29ca" (UID: "6c6e3199-d622-45a9-b9be-e096304f29ca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.359057 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c6e3199-d622-45a9-b9be-e096304f29ca-config" (OuterVolumeSpecName: "config") pod "6c6e3199-d622-45a9-b9be-e096304f29ca" (UID: "6c6e3199-d622-45a9-b9be-e096304f29ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.431527 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c6e3199-d622-45a9-b9be-e096304f29ca-config\") on node \"crc\" DevicePath \"\""
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.431835 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c6e3199-d622-45a9-b9be-e096304f29ca-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.431925 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5gpj\" (UniqueName: \"kubernetes.io/projected/6c6e3199-d622-45a9-b9be-e096304f29ca-kube-api-access-j5gpj\") on node \"crc\" DevicePath \"\""
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.571257 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 05 22:04:13 crc kubenswrapper[4747]: W1205 22:04:13.571679 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbe84605_dd30_4d36_80d8_c9a11f25c186.slice/crio-7a15d5c97f226e51dca044dfa144d248e5d7f56ee91607ce01a122b1fb908e2f WatchSource:0}: Error finding container 7a15d5c97f226e51dca044dfa144d248e5d7f56ee91607ce01a122b1fb908e2f: Status 404 returned error can't find the container with id 7a15d5c97f226e51dca044dfa144d248e5d7f56ee91607ce01a122b1fb908e2f
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.650191 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Dec 05 22:04:13 crc kubenswrapper[4747]: E1205 22:04:13.650654 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c6e3199-d622-45a9-b9be-e096304f29ca" containerName="init"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.650689 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c6e3199-d622-45a9-b9be-e096304f29ca" containerName="init"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.650870 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c6e3199-d622-45a9-b9be-e096304f29ca" containerName="init"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.652173 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.654723 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.654772 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.655235 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.655792 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-hg62n"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.658851 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.662319 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.839537 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b0a2255-16ff-4be0-a16e-161076f32fc4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.840804 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b0a2255-16ff-4be0-a16e-161076f32fc4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.840986 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b0a2255-16ff-4be0-a16e-161076f32fc4-kolla-config\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.841118 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b0a2255-16ff-4be0-a16e-161076f32fc4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.841326 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0e58c24a-b8bf-4daf-95f2-ebba4ca34c0c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e58c24a-b8bf-4daf-95f2-ebba4ca34c0c\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.841444 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b0a2255-16ff-4be0-a16e-161076f32fc4-config-data-default\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.841568 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0a2255-16ff-4be0-a16e-161076f32fc4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.841696 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfh9t\" (UniqueName: \"kubernetes.io/projected/8b0a2255-16ff-4be0-a16e-161076f32fc4-kube-api-access-wfh9t\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.847885 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5" path="/var/lib/kubelet/pods/b7b4c5b3-7ce1-497b-b3c0-9ce4da0d5ac5/volumes"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.888381 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.888372 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f9bd84cc-k24gw" event={"ID":"6c6e3199-d622-45a9-b9be-e096304f29ca","Type":"ContainerDied","Data":"01e57743e772bd3c8e46f28e22e6102b1c1383c15fb897eafc385b61f355ca3a"}
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.888548 4747 scope.go:117] "RemoveContainer" containerID="1e6e1a0f0f515ac43d90ef2ea3d52ef1377a45eb1feca3241e5fc8ac802d93b5"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.889741 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cbe84605-dd30-4d36-80d8-c9a11f25c186","Type":"ContainerStarted","Data":"7a15d5c97f226e51dca044dfa144d248e5d7f56ee91607ce01a122b1fb908e2f"}
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.893153 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" event={"ID":"7ce5c33f-5c2e-43ce-b062-5914a9cd134b","Type":"ContainerStarted","Data":"01eb0133d19b1d1fe35801e675f14e56baa2993e0848539b935c82f7727e68c3"}
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.893846 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.896276 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" event={"ID":"4a80203e-563e-46d6-a322-df8bdf64d509","Type":"ContainerStarted","Data":"257066dd46cebb30590997015920d6146fd61bb0dd5ef138531b28abd41c8b0f"}
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.896632 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.897345 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f2f8445b-d0bc-4130-8e3d-234a418c3fa5","Type":"ContainerStarted","Data":"8ac4800e0be9c9f00aecff668f206609c8a45ccb35c0c54f86f31ed5956b9734"}
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.933480 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f9bd84cc-k24gw"]
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.939809 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77f9bd84cc-k24gw"]
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.942427 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" podStartSLOduration=3.9424099679999998 podStartE2EDuration="3.942409968s" podCreationTimestamp="2025-12-05 22:04:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:04:13.938172442 +0000 UTC m=+4924.405479940" watchObservedRunningTime="2025-12-05 22:04:13.942409968 +0000 UTC m=+4924.409717446"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.944011 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b0a2255-16ff-4be0-a16e-161076f32fc4-kolla-config\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.944060 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b0a2255-16ff-4be0-a16e-161076f32fc4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.944144 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0e58c24a-b8bf-4daf-95f2-ebba4ca34c0c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e58c24a-b8bf-4daf-95f2-ebba4ca34c0c\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.944175 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b0a2255-16ff-4be0-a16e-161076f32fc4-config-data-default\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.944207 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0a2255-16ff-4be0-a16e-161076f32fc4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.944231 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfh9t\" (UniqueName: \"kubernetes.io/projected/8b0a2255-16ff-4be0-a16e-161076f32fc4-kube-api-access-wfh9t\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.944282 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b0a2255-16ff-4be0-a16e-161076f32fc4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.944303 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b0a2255-16ff-4be0-a16e-161076f32fc4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
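The pod_startup_latency_tracker entry reports podStartE2EDuration as the gap between podCreationTimestamp and watchObservedRunningTime, and the numbers above check out: 22:04:13.942409968 − 22:04:10 = 3.942409968 s (the zero firstStartedPulling/lastFinishedPulling timestamps mean no image pull contributed). A tiny verification of that arithmetic:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2025-12-05 22:04:10 +0000 UTC")
	observed, _ := time.Parse(layout, "2025-12-05 22:04:13.942409968 +0000 UTC")
	// Matches podStartE2EDuration="3.942409968s" in the log entry above.
	fmt.Println(observed.Sub(created)) // 3.942409968s
}
```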
\"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0" Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.945407 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b0a2255-16ff-4be0-a16e-161076f32fc4-config-data-default\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0" Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.945901 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b0a2255-16ff-4be0-a16e-161076f32fc4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0" Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.945941 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b0a2255-16ff-4be0-a16e-161076f32fc4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0" Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.946472 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b0a2255-16ff-4be0-a16e-161076f32fc4-kolla-config\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0" Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.949160 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0a2255-16ff-4be0-a16e-161076f32fc4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0" Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.951097 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b0a2255-16ff-4be0-a16e-161076f32fc4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0" Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.958752 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.958801 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0e58c24a-b8bf-4daf-95f2-ebba4ca34c0c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e58c24a-b8bf-4daf-95f2-ebba4ca34c0c\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/231444a6c86fd504147e02c98ddcca248771e28bcdaca34a7be2b8dbe81a31a1/globalmount\"" pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.964715 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" podStartSLOduration=2.964679564 podStartE2EDuration="2.964679564s" podCreationTimestamp="2025-12-05 22:04:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:04:13.964259554 +0000 UTC m=+4924.431567042" watchObservedRunningTime="2025-12-05 22:04:13.964679564 +0000 UTC m=+4924.431987052"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.970345 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfh9t\" (UniqueName: \"kubernetes.io/projected/8b0a2255-16ff-4be0-a16e-161076f32fc4-kube-api-access-wfh9t\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:13 crc kubenswrapper[4747]: I1205 22:04:13.998987 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0e58c24a-b8bf-4daf-95f2-ebba4ca34c0c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e58c24a-b8bf-4daf-95f2-ebba4ca34c0c\") pod \"openstack-galera-0\" (UID: \"8b0a2255-16ff-4be0-a16e-161076f32fc4\") " pod="openstack/openstack-galera-0"
Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.294535 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.823898 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.825457 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.828200 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-skv9c" Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.828273 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.828946 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.829092 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.838827 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.900730 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.905062 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cbe84605-dd30-4d36-80d8-c9a11f25c186","Type":"ContainerStarted","Data":"1c7baecd6e66073598df9799197e696fc05bec859704b5f7b6a8a261174be8a7"} Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.906471 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f2f8445b-d0bc-4130-8e3d-234a418c3fa5","Type":"ContainerStarted","Data":"ae2f16594ec97e7aa8430e3d3bfebe0e1a085d81c5ce2079c6ff75a1f35d042b"} Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.964190 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7e2599a-2280-4bf7-a62c-60fcf447e74d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.964326 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7e2599a-2280-4bf7-a62c-60fcf447e74d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.964350 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7e2599a-2280-4bf7-a62c-60fcf447e74d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.964376 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e2599a-2280-4bf7-a62c-60fcf447e74d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.964432 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-819f280c-cd51-475b-8231-76add3f0484b\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-819f280c-cd51-475b-8231-76add3f0484b\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.964469 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7e2599a-2280-4bf7-a62c-60fcf447e74d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.964506 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ns94\" (UniqueName: \"kubernetes.io/projected/f7e2599a-2280-4bf7-a62c-60fcf447e74d-kube-api-access-7ns94\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:14 crc kubenswrapper[4747]: I1205 22:04:14.964535 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7e2599a-2280-4bf7-a62c-60fcf447e74d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.065738 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e2599a-2280-4bf7-a62c-60fcf447e74d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.065978 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-819f280c-cd51-475b-8231-76add3f0484b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-819f280c-cd51-475b-8231-76add3f0484b\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.066047 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7e2599a-2280-4bf7-a62c-60fcf447e74d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.066076 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ns94\" (UniqueName: \"kubernetes.io/projected/f7e2599a-2280-4bf7-a62c-60fcf447e74d-kube-api-access-7ns94\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.066109 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7e2599a-2280-4bf7-a62c-60fcf447e74d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.066214 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7e2599a-2280-4bf7-a62c-60fcf447e74d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.066366 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7e2599a-2280-4bf7-a62c-60fcf447e74d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.066409 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7e2599a-2280-4bf7-a62c-60fcf447e74d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.069030 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7e2599a-2280-4bf7-a62c-60fcf447e74d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.069389 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7e2599a-2280-4bf7-a62c-60fcf447e74d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.069896 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7e2599a-2280-4bf7-a62c-60fcf447e74d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.071441 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7e2599a-2280-4bf7-a62c-60fcf447e74d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.077328 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7e2599a-2280-4bf7-a62c-60fcf447e74d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.077334 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e2599a-2280-4bf7-a62c-60fcf447e74d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.078445 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.078481 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-819f280c-cd51-475b-8231-76add3f0484b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-819f280c-cd51-475b-8231-76add3f0484b\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b48136576bfbcc114a8d70905c2ad9948e1bc44945558831deee05ba85e30c43/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.087881 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ns94\" (UniqueName: \"kubernetes.io/projected/f7e2599a-2280-4bf7-a62c-60fcf447e74d-kube-api-access-7ns94\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.138199 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-819f280c-cd51-475b-8231-76add3f0484b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-819f280c-cd51-475b-8231-76add3f0484b\") pod \"openstack-cell1-galera-0\" (UID: \"f7e2599a-2280-4bf7-a62c-60fcf447e74d\") " pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.146485 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.272832 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.274321 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.278748 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-5w6r8" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.278874 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.278930 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.284927 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.370281 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/72f13c5c-74d7-4d4c-8518-8bc6c539170b-kolla-config\") pod \"memcached-0\" (UID: \"72f13c5c-74d7-4d4c-8518-8bc6c539170b\") " pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.370336 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f13c5c-74d7-4d4c-8518-8bc6c539170b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"72f13c5c-74d7-4d4c-8518-8bc6c539170b\") " pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.370377 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f13c5c-74d7-4d4c-8518-8bc6c539170b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"72f13c5c-74d7-4d4c-8518-8bc6c539170b\") " pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.370398 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l6rn\" (UniqueName: \"kubernetes.io/projected/72f13c5c-74d7-4d4c-8518-8bc6c539170b-kube-api-access-7l6rn\") pod \"memcached-0\" (UID: \"72f13c5c-74d7-4d4c-8518-8bc6c539170b\") " pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.370421 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72f13c5c-74d7-4d4c-8518-8bc6c539170b-config-data\") pod \"memcached-0\" (UID: \"72f13c5c-74d7-4d4c-8518-8bc6c539170b\") " pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.472430 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f13c5c-74d7-4d4c-8518-8bc6c539170b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"72f13c5c-74d7-4d4c-8518-8bc6c539170b\") " pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.472510 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f13c5c-74d7-4d4c-8518-8bc6c539170b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"72f13c5c-74d7-4d4c-8518-8bc6c539170b\") " pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.472535 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l6rn\" (UniqueName: 
\"kubernetes.io/projected/72f13c5c-74d7-4d4c-8518-8bc6c539170b-kube-api-access-7l6rn\") pod \"memcached-0\" (UID: \"72f13c5c-74d7-4d4c-8518-8bc6c539170b\") " pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.472561 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72f13c5c-74d7-4d4c-8518-8bc6c539170b-config-data\") pod \"memcached-0\" (UID: \"72f13c5c-74d7-4d4c-8518-8bc6c539170b\") " pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.472637 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/72f13c5c-74d7-4d4c-8518-8bc6c539170b-kolla-config\") pod \"memcached-0\" (UID: \"72f13c5c-74d7-4d4c-8518-8bc6c539170b\") " pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.473248 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/72f13c5c-74d7-4d4c-8518-8bc6c539170b-kolla-config\") pod \"memcached-0\" (UID: \"72f13c5c-74d7-4d4c-8518-8bc6c539170b\") " pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.473504 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72f13c5c-74d7-4d4c-8518-8bc6c539170b-config-data\") pod \"memcached-0\" (UID: \"72f13c5c-74d7-4d4c-8518-8bc6c539170b\") " pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.477273 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f13c5c-74d7-4d4c-8518-8bc6c539170b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"72f13c5c-74d7-4d4c-8518-8bc6c539170b\") " pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.477514 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f13c5c-74d7-4d4c-8518-8bc6c539170b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"72f13c5c-74d7-4d4c-8518-8bc6c539170b\") " pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.491449 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l6rn\" (UniqueName: \"kubernetes.io/projected/72f13c5c-74d7-4d4c-8518-8bc6c539170b-kube-api-access-7l6rn\") pod \"memcached-0\" (UID: \"72f13c5c-74d7-4d4c-8518-8bc6c539170b\") " pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.597678 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.643275 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.846987 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c6e3199-d622-45a9-b9be-e096304f29ca" path="/var/lib/kubelet/pods/6c6e3199-d622-45a9-b9be-e096304f29ca/volumes" Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.915863 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8b0a2255-16ff-4be0-a16e-161076f32fc4","Type":"ContainerStarted","Data":"a1699caedb474ae8a5ce95fd2e9f2c7926b0f25b66ab2cf45fd51665742c765a"} Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.916024 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8b0a2255-16ff-4be0-a16e-161076f32fc4","Type":"ContainerStarted","Data":"b5b7ac9280f4ab87f6d7063993085cda9f38ab41a742595a980c5d80507b121c"} Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.917402 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f7e2599a-2280-4bf7-a62c-60fcf447e74d","Type":"ContainerStarted","Data":"c46799d72727614ef981b13fdda8c10a90a70cb6d9fb583a13e374499307c546"} Dec 05 22:04:15 crc kubenswrapper[4747]: I1205 22:04:15.917421 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f7e2599a-2280-4bf7-a62c-60fcf447e74d","Type":"ContainerStarted","Data":"714fe77bf42584948fd2e389396f057ddda341d4a774ce7dad207caf6932e58d"} Dec 05 22:04:16 crc kubenswrapper[4747]: I1205 22:04:16.026546 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 22:04:16 crc kubenswrapper[4747]: W1205 22:04:16.282778 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72f13c5c_74d7_4d4c_8518_8bc6c539170b.slice/crio-ddedd5ad4149d87e384b9b1d7cb2aadcdc1036af3ba6aeb58ac5579763391a1f WatchSource:0}: Error finding container ddedd5ad4149d87e384b9b1d7cb2aadcdc1036af3ba6aeb58ac5579763391a1f: Status 404 returned error can't find the container with id ddedd5ad4149d87e384b9b1d7cb2aadcdc1036af3ba6aeb58ac5579763391a1f Dec 05 22:04:16 crc kubenswrapper[4747]: I1205 22:04:16.925154 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"72f13c5c-74d7-4d4c-8518-8bc6c539170b","Type":"ContainerStarted","Data":"f0c648b68facb998fc0ed84d859300d0b392d1ee01606a4e4af0203f65e3be9e"} Dec 05 22:04:16 crc kubenswrapper[4747]: I1205 22:04:16.925546 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"72f13c5c-74d7-4d4c-8518-8bc6c539170b","Type":"ContainerStarted","Data":"ddedd5ad4149d87e384b9b1d7cb2aadcdc1036af3ba6aeb58ac5579763391a1f"} Dec 05 22:04:16 crc kubenswrapper[4747]: I1205 22:04:16.946155 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.946134011 podStartE2EDuration="1.946134011s" podCreationTimestamp="2025-12-05 22:04:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:04:16.938664874 +0000 UTC m=+4927.405972372" watchObservedRunningTime="2025-12-05 22:04:16.946134011 +0000 UTC m=+4927.413441499" Dec 05 22:04:17 crc 
Dec 05 22:04:19 crc kubenswrapper[4747]: I1205 22:04:19.951494 4747 generic.go:334] "Generic (PLEG): container finished" podID="8b0a2255-16ff-4be0-a16e-161076f32fc4" containerID="a1699caedb474ae8a5ce95fd2e9f2c7926b0f25b66ab2cf45fd51665742c765a" exitCode=0
Dec 05 22:04:19 crc kubenswrapper[4747]: I1205 22:04:19.951614 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8b0a2255-16ff-4be0-a16e-161076f32fc4","Type":"ContainerDied","Data":"a1699caedb474ae8a5ce95fd2e9f2c7926b0f25b66ab2cf45fd51665742c765a"}
Dec 05 22:04:20 crc kubenswrapper[4747]: I1205 22:04:20.961338 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8b0a2255-16ff-4be0-a16e-161076f32fc4","Type":"ContainerStarted","Data":"e988d7358c3ff8e808b37837e270cbd4352b6cfe13255b05ddb7cb18e550812c"}
Dec 05 22:04:20 crc kubenswrapper[4747]: I1205 22:04:20.963992 4747 generic.go:334] "Generic (PLEG): container finished" podID="f7e2599a-2280-4bf7-a62c-60fcf447e74d" containerID="c46799d72727614ef981b13fdda8c10a90a70cb6d9fb583a13e374499307c546" exitCode=0
Dec 05 22:04:20 crc kubenswrapper[4747]: I1205 22:04:20.964038 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f7e2599a-2280-4bf7-a62c-60fcf447e74d","Type":"ContainerDied","Data":"c46799d72727614ef981b13fdda8c10a90a70cb6d9fb583a13e374499307c546"}
Dec 05 22:04:20 crc kubenswrapper[4747]: I1205 22:04:20.994890 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.994860334 podStartE2EDuration="8.994860334s" podCreationTimestamp="2025-12-05 22:04:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:04:20.988895935 +0000 UTC m=+4931.456203423" watchObservedRunningTime="2025-12-05 22:04:20.994860334 +0000 UTC m=+4931.462167862"
Dec 05 22:04:21 crc kubenswrapper[4747]: I1205 22:04:21.021854 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5"
Dec 05 22:04:21 crc kubenswrapper[4747]: I1205 22:04:21.696669 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk"
Dec 05 22:04:21 crc kubenswrapper[4747]: I1205 22:04:21.770089 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f5d88f885-8jvt5"]
Dec 05 22:04:21 crc kubenswrapper[4747]: I1205 22:04:21.971243 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f7e2599a-2280-4bf7-a62c-60fcf447e74d","Type":"ContainerStarted","Data":"3a0f2295fad52f85144f153944b230a85e19fd740971beeb5a6ce5466b552879"}
Dec 05 22:04:21 crc kubenswrapper[4747]: I1205 22:04:21.971380 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" podUID="7ce5c33f-5c2e-43ce-b062-5914a9cd134b" containerName="dnsmasq-dns" containerID="cri-o://01eb0133d19b1d1fe35801e675f14e56baa2993e0848539b935c82f7727e68c3" gracePeriod=10
Dec 05 22:04:22 crc kubenswrapper[4747]: I1205 22:04:22.005508 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=9.005489557
podStartE2EDuration="9.005489557s" podCreationTimestamp="2025-12-05 22:04:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:04:22.003825555 +0000 UTC m=+4932.471133053" watchObservedRunningTime="2025-12-05 22:04:22.005489557 +0000 UTC m=+4932.472797045" Dec 05 22:04:22 crc kubenswrapper[4747]: I1205 22:04:22.455353 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" Dec 05 22:04:22 crc kubenswrapper[4747]: I1205 22:04:22.584635 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt88f\" (UniqueName: \"kubernetes.io/projected/7ce5c33f-5c2e-43ce-b062-5914a9cd134b-kube-api-access-gt88f\") pod \"7ce5c33f-5c2e-43ce-b062-5914a9cd134b\" (UID: \"7ce5c33f-5c2e-43ce-b062-5914a9cd134b\") " Dec 05 22:04:22 crc kubenswrapper[4747]: I1205 22:04:22.584698 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce5c33f-5c2e-43ce-b062-5914a9cd134b-config\") pod \"7ce5c33f-5c2e-43ce-b062-5914a9cd134b\" (UID: \"7ce5c33f-5c2e-43ce-b062-5914a9cd134b\") " Dec 05 22:04:22 crc kubenswrapper[4747]: I1205 22:04:22.591010 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce5c33f-5c2e-43ce-b062-5914a9cd134b-kube-api-access-gt88f" (OuterVolumeSpecName: "kube-api-access-gt88f") pod "7ce5c33f-5c2e-43ce-b062-5914a9cd134b" (UID: "7ce5c33f-5c2e-43ce-b062-5914a9cd134b"). InnerVolumeSpecName "kube-api-access-gt88f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:04:22 crc kubenswrapper[4747]: I1205 22:04:22.616339 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ce5c33f-5c2e-43ce-b062-5914a9cd134b-config" (OuterVolumeSpecName: "config") pod "7ce5c33f-5c2e-43ce-b062-5914a9cd134b" (UID: "7ce5c33f-5c2e-43ce-b062-5914a9cd134b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:04:22 crc kubenswrapper[4747]: I1205 22:04:22.686459 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt88f\" (UniqueName: \"kubernetes.io/projected/7ce5c33f-5c2e-43ce-b062-5914a9cd134b-kube-api-access-gt88f\") on node \"crc\" DevicePath \"\"" Dec 05 22:04:22 crc kubenswrapper[4747]: I1205 22:04:22.686493 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce5c33f-5c2e-43ce-b062-5914a9cd134b-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:04:22 crc kubenswrapper[4747]: I1205 22:04:22.983167 4747 generic.go:334] "Generic (PLEG): container finished" podID="7ce5c33f-5c2e-43ce-b062-5914a9cd134b" containerID="01eb0133d19b1d1fe35801e675f14e56baa2993e0848539b935c82f7727e68c3" exitCode=0 Dec 05 22:04:22 crc kubenswrapper[4747]: I1205 22:04:22.983241 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" event={"ID":"7ce5c33f-5c2e-43ce-b062-5914a9cd134b","Type":"ContainerDied","Data":"01eb0133d19b1d1fe35801e675f14e56baa2993e0848539b935c82f7727e68c3"} Dec 05 22:04:22 crc kubenswrapper[4747]: I1205 22:04:22.983261 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" Dec 05 22:04:22 crc kubenswrapper[4747]: I1205 22:04:22.983290 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5d88f885-8jvt5" event={"ID":"7ce5c33f-5c2e-43ce-b062-5914a9cd134b","Type":"ContainerDied","Data":"96cb61c0e41a49beffe2784b68f550162193fe6f496c48178091a6906195b062"} Dec 05 22:04:22 crc kubenswrapper[4747]: I1205 22:04:22.983353 4747 scope.go:117] "RemoveContainer" containerID="01eb0133d19b1d1fe35801e675f14e56baa2993e0848539b935c82f7727e68c3" Dec 05 22:04:23 crc kubenswrapper[4747]: I1205 22:04:23.005232 4747 scope.go:117] "RemoveContainer" containerID="3e00c6ff4dbb812540d77828f9e0237c95fe23c5f99e6d1bedd3a91f0924af51" Dec 05 22:04:23 crc kubenswrapper[4747]: I1205 22:04:23.048868 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f5d88f885-8jvt5"] Dec 05 22:04:23 crc kubenswrapper[4747]: I1205 22:04:23.052724 4747 scope.go:117] "RemoveContainer" containerID="01eb0133d19b1d1fe35801e675f14e56baa2993e0848539b935c82f7727e68c3" Dec 05 22:04:23 crc kubenswrapper[4747]: E1205 22:04:23.053276 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01eb0133d19b1d1fe35801e675f14e56baa2993e0848539b935c82f7727e68c3\": container with ID starting with 01eb0133d19b1d1fe35801e675f14e56baa2993e0848539b935c82f7727e68c3 not found: ID does not exist" containerID="01eb0133d19b1d1fe35801e675f14e56baa2993e0848539b935c82f7727e68c3" Dec 05 22:04:23 crc kubenswrapper[4747]: I1205 22:04:23.053350 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01eb0133d19b1d1fe35801e675f14e56baa2993e0848539b935c82f7727e68c3"} err="failed to get container status \"01eb0133d19b1d1fe35801e675f14e56baa2993e0848539b935c82f7727e68c3\": rpc error: code = NotFound desc = could not find container \"01eb0133d19b1d1fe35801e675f14e56baa2993e0848539b935c82f7727e68c3\": container with ID starting with 01eb0133d19b1d1fe35801e675f14e56baa2993e0848539b935c82f7727e68c3 not found: ID does not exist" Dec 05 22:04:23 crc kubenswrapper[4747]: I1205 22:04:23.053385 4747 scope.go:117] "RemoveContainer" containerID="3e00c6ff4dbb812540d77828f9e0237c95fe23c5f99e6d1bedd3a91f0924af51" Dec 05 22:04:23 crc kubenswrapper[4747]: E1205 22:04:23.053808 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e00c6ff4dbb812540d77828f9e0237c95fe23c5f99e6d1bedd3a91f0924af51\": container with ID starting with 3e00c6ff4dbb812540d77828f9e0237c95fe23c5f99e6d1bedd3a91f0924af51 not found: ID does not exist" containerID="3e00c6ff4dbb812540d77828f9e0237c95fe23c5f99e6d1bedd3a91f0924af51" Dec 05 22:04:23 crc kubenswrapper[4747]: I1205 22:04:23.053843 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e00c6ff4dbb812540d77828f9e0237c95fe23c5f99e6d1bedd3a91f0924af51"} err="failed to get container status \"3e00c6ff4dbb812540d77828f9e0237c95fe23c5f99e6d1bedd3a91f0924af51\": rpc error: code = NotFound desc = could not find container \"3e00c6ff4dbb812540d77828f9e0237c95fe23c5f99e6d1bedd3a91f0924af51\": container with ID starting with 3e00c6ff4dbb812540d77828f9e0237c95fe23c5f99e6d1bedd3a91f0924af51 not found: ID does not exist" Dec 05 22:04:23 crc kubenswrapper[4747]: I1205 22:04:23.056107 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f5d88f885-8jvt5"] Dec 05 
Dec 05 22:04:24 crc kubenswrapper[4747]: I1205 22:04:24.294865 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Dec 05 22:04:24 crc kubenswrapper[4747]: I1205 22:04:24.294973 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Dec 05 22:04:25 crc kubenswrapper[4747]: I1205 22:04:25.039535 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Dec 05 22:04:25 crc kubenswrapper[4747]: I1205 22:04:25.107893 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Dec 05 22:04:25 crc kubenswrapper[4747]: I1205 22:04:25.147230 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Dec 05 22:04:25 crc kubenswrapper[4747]: I1205 22:04:25.147295 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Dec 05 22:04:25 crc kubenswrapper[4747]: I1205 22:04:25.598948 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Dec 05 22:04:27 crc kubenswrapper[4747]: I1205 22:04:27.336333 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Dec 05 22:04:27 crc kubenswrapper[4747]: I1205 22:04:27.415698 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Dec 05 22:04:36 crc kubenswrapper[4747]: I1205 22:04:36.222262 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 22:04:36 crc kubenswrapper[4747]: I1205 22:04:36.223141 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 22:04:47 crc kubenswrapper[4747]: I1205 22:04:47.216372 4747 generic.go:334] "Generic (PLEG): container finished" podID="cbe84605-dd30-4d36-80d8-c9a11f25c186" containerID="1c7baecd6e66073598df9799197e696fc05bec859704b5f7b6a8a261174be8a7" exitCode=0
Dec 05 22:04:47 crc kubenswrapper[4747]: I1205 22:04:47.216627 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cbe84605-dd30-4d36-80d8-c9a11f25c186","Type":"ContainerDied","Data":"1c7baecd6e66073598df9799197e696fc05bec859704b5f7b6a8a261174be8a7"}
Dec 05 22:04:47 crc kubenswrapper[4747]: I1205 22:04:47.220182 4747 generic.go:334] "Generic (PLEG): container finished" podID="f2f8445b-d0bc-4130-8e3d-234a418c3fa5" containerID="ae2f16594ec97e7aa8430e3d3bfebe0e1a085d81c5ce2079c6ff75a1f35d042b" exitCode=0
Dec 05 22:04:47 crc kubenswrapper[4747]: I1205 22:04:47.220254 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0"
event={"ID":"f2f8445b-d0bc-4130-8e3d-234a418c3fa5","Type":"ContainerDied","Data":"ae2f16594ec97e7aa8430e3d3bfebe0e1a085d81c5ce2079c6ff75a1f35d042b"} Dec 05 22:04:48 crc kubenswrapper[4747]: I1205 22:04:48.243652 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f2f8445b-d0bc-4130-8e3d-234a418c3fa5","Type":"ContainerStarted","Data":"9f99811aa7e0fe0259a6c9713dc37c42e627afaed253e41aaf9c533e8cc0cb89"} Dec 05 22:04:48 crc kubenswrapper[4747]: I1205 22:04:48.245413 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:04:48 crc kubenswrapper[4747]: I1205 22:04:48.246883 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cbe84605-dd30-4d36-80d8-c9a11f25c186","Type":"ContainerStarted","Data":"c6be803311a69c2d65e995b757337697144af4a7db42f720de3fad2e222b409b"} Dec 05 22:04:48 crc kubenswrapper[4747]: I1205 22:04:48.247362 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 22:04:48 crc kubenswrapper[4747]: I1205 22:04:48.285068 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.285048362 podStartE2EDuration="37.285048362s" podCreationTimestamp="2025-12-05 22:04:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:04:48.283459363 +0000 UTC m=+4958.750766921" watchObservedRunningTime="2025-12-05 22:04:48.285048362 +0000 UTC m=+4958.752355880" Dec 05 22:04:48 crc kubenswrapper[4747]: I1205 22:04:48.310545 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.310513578 podStartE2EDuration="37.310513578s" podCreationTimestamp="2025-12-05 22:04:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:04:48.304516759 +0000 UTC m=+4958.771824277" watchObservedRunningTime="2025-12-05 22:04:48.310513578 +0000 UTC m=+4958.777821106" Dec 05 22:05:02 crc kubenswrapper[4747]: I1205 22:05:02.504886 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:03 crc kubenswrapper[4747]: I1205 22:05:03.055820 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 22:05:06 crc kubenswrapper[4747]: I1205 22:05:06.222152 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:05:06 crc kubenswrapper[4747]: I1205 22:05:06.222529 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.179382 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-sd6tf"] Dec 05 22:05:07 crc kubenswrapper[4747]: E1205 22:05:07.180054 4747 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce5c33f-5c2e-43ce-b062-5914a9cd134b" containerName="init" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.180077 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce5c33f-5c2e-43ce-b062-5914a9cd134b" containerName="init" Dec 05 22:05:07 crc kubenswrapper[4747]: E1205 22:05:07.180113 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce5c33f-5c2e-43ce-b062-5914a9cd134b" containerName="dnsmasq-dns" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.180122 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce5c33f-5c2e-43ce-b062-5914a9cd134b" containerName="dnsmasq-dns" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.180295 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ce5c33f-5c2e-43ce-b062-5914a9cd134b" containerName="dnsmasq-dns" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.181545 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.191144 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-sd6tf"] Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.239504 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkw9x\" (UniqueName: \"kubernetes.io/projected/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-kube-api-access-hkw9x\") pod \"dnsmasq-dns-f79bf7859-sd6tf\" (UID: \"7be5844e-65a8-4a1a-b453-31fe2bae1a6c\") " pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.239664 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-dns-svc\") pod \"dnsmasq-dns-f79bf7859-sd6tf\" (UID: \"7be5844e-65a8-4a1a-b453-31fe2bae1a6c\") " pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.239729 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-config\") pod \"dnsmasq-dns-f79bf7859-sd6tf\" (UID: \"7be5844e-65a8-4a1a-b453-31fe2bae1a6c\") " pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.341140 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkw9x\" (UniqueName: \"kubernetes.io/projected/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-kube-api-access-hkw9x\") pod \"dnsmasq-dns-f79bf7859-sd6tf\" (UID: \"7be5844e-65a8-4a1a-b453-31fe2bae1a6c\") " pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.341246 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-dns-svc\") pod \"dnsmasq-dns-f79bf7859-sd6tf\" (UID: \"7be5844e-65a8-4a1a-b453-31fe2bae1a6c\") " pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.341307 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-config\") pod \"dnsmasq-dns-f79bf7859-sd6tf\" (UID: \"7be5844e-65a8-4a1a-b453-31fe2bae1a6c\") " 
pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.342309 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-dns-svc\") pod \"dnsmasq-dns-f79bf7859-sd6tf\" (UID: \"7be5844e-65a8-4a1a-b453-31fe2bae1a6c\") " pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.342329 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-config\") pod \"dnsmasq-dns-f79bf7859-sd6tf\" (UID: \"7be5844e-65a8-4a1a-b453-31fe2bae1a6c\") " pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.366367 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkw9x\" (UniqueName: \"kubernetes.io/projected/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-kube-api-access-hkw9x\") pod \"dnsmasq-dns-f79bf7859-sd6tf\" (UID: \"7be5844e-65a8-4a1a-b453-31fe2bae1a6c\") " pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.499836 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.849999 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 22:05:07 crc kubenswrapper[4747]: I1205 22:05:07.979728 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-sd6tf"] Dec 05 22:05:08 crc kubenswrapper[4747]: I1205 22:05:08.421269 4747 generic.go:334] "Generic (PLEG): container finished" podID="7be5844e-65a8-4a1a-b453-31fe2bae1a6c" containerID="cf7c138ad99b436ad40dc321c3992dba39134f080ac8549124369f8b3d39d4df" exitCode=0 Dec 05 22:05:08 crc kubenswrapper[4747]: I1205 22:05:08.421318 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" event={"ID":"7be5844e-65a8-4a1a-b453-31fe2bae1a6c","Type":"ContainerDied","Data":"cf7c138ad99b436ad40dc321c3992dba39134f080ac8549124369f8b3d39d4df"} Dec 05 22:05:08 crc kubenswrapper[4747]: I1205 22:05:08.421602 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" event={"ID":"7be5844e-65a8-4a1a-b453-31fe2bae1a6c","Type":"ContainerStarted","Data":"f4e34ee706954c917f328ef97755c3f31610cbcd3b92a7f243ebfed8cd49e9ff"} Dec 05 22:05:08 crc kubenswrapper[4747]: I1205 22:05:08.541192 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 22:05:09 crc kubenswrapper[4747]: I1205 22:05:09.428844 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" event={"ID":"7be5844e-65a8-4a1a-b453-31fe2bae1a6c","Type":"ContainerStarted","Data":"aad08731ff35ef3e9d19f6fe4363906dd55bdd048bee9ce9710d472b8cd7bf6d"} Dec 05 22:05:09 crc kubenswrapper[4747]: I1205 22:05:09.429000 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" Dec 05 22:05:11 crc kubenswrapper[4747]: I1205 22:05:11.890205 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="cbe84605-dd30-4d36-80d8-c9a11f25c186" containerName="rabbitmq" containerID="cri-o://c6be803311a69c2d65e995b757337697144af4a7db42f720de3fad2e222b409b" gracePeriod=604796 Dec 05 
Dec 05 22:05:12 crc kubenswrapper[4747]: I1205 22:05:12.501243 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="f2f8445b-d0bc-4130-8e3d-234a418c3fa5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.239:5671: connect: connection refused"
Dec 05 22:05:13 crc kubenswrapper[4747]: I1205 22:05:13.053292 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="cbe84605-dd30-4d36-80d8-c9a11f25c186" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.240:5671: connect: connection refused"
Dec 05 22:05:17 crc kubenswrapper[4747]: I1205 22:05:17.502153 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f79bf7859-sd6tf"
Dec 05 22:05:17 crc kubenswrapper[4747]: I1205 22:05:17.532271 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" podStartSLOduration=10.532248069 podStartE2EDuration="10.532248069s" podCreationTimestamp="2025-12-05 22:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:05:09.452875283 +0000 UTC m=+4979.920182781" watchObservedRunningTime="2025-12-05 22:05:17.532248069 +0000 UTC m=+4987.999555587"
Dec 05 22:05:17 crc kubenswrapper[4747]: I1205 22:05:17.586657 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-nrcbk"]
Dec 05 22:05:17 crc kubenswrapper[4747]: I1205 22:05:17.586934 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" podUID="4a80203e-563e-46d6-a322-df8bdf64d509" containerName="dnsmasq-dns" containerID="cri-o://257066dd46cebb30590997015920d6146fd61bb0dd5ef138531b28abd41c8b0f" gracePeriod=10
Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.352420 4747 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.477816 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a80203e-563e-46d6-a322-df8bdf64d509-config\") pod \"4a80203e-563e-46d6-a322-df8bdf64d509\" (UID: \"4a80203e-563e-46d6-a322-df8bdf64d509\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.478062 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a80203e-563e-46d6-a322-df8bdf64d509-dns-svc\") pod \"4a80203e-563e-46d6-a322-df8bdf64d509\" (UID: \"4a80203e-563e-46d6-a322-df8bdf64d509\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.478141 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvpjt\" (UniqueName: \"kubernetes.io/projected/4a80203e-563e-46d6-a322-df8bdf64d509-kube-api-access-dvpjt\") pod \"4a80203e-563e-46d6-a322-df8bdf64d509\" (UID: \"4a80203e-563e-46d6-a322-df8bdf64d509\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.484317 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a80203e-563e-46d6-a322-df8bdf64d509-kube-api-access-dvpjt" (OuterVolumeSpecName: "kube-api-access-dvpjt") pod "4a80203e-563e-46d6-a322-df8bdf64d509" (UID: "4a80203e-563e-46d6-a322-df8bdf64d509"). InnerVolumeSpecName "kube-api-access-dvpjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.511377 4747 generic.go:334] "Generic (PLEG): container finished" podID="4a80203e-563e-46d6-a322-df8bdf64d509" containerID="257066dd46cebb30590997015920d6146fd61bb0dd5ef138531b28abd41c8b0f" exitCode=0 Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.511466 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" event={"ID":"4a80203e-563e-46d6-a322-df8bdf64d509","Type":"ContainerDied","Data":"257066dd46cebb30590997015920d6146fd61bb0dd5ef138531b28abd41c8b0f"} Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.511506 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" event={"ID":"4a80203e-563e-46d6-a322-df8bdf64d509","Type":"ContainerDied","Data":"5d4fccd961997f5fbaae3925ade8c16ea8a80fe6ac62ff17135d8433f0aaf4c3"} Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.511537 4747 scope.go:117] "RemoveContainer" containerID="257066dd46cebb30590997015920d6146fd61bb0dd5ef138531b28abd41c8b0f" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.511789 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cbb4f659c-nrcbk" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.518334 4747 generic.go:334] "Generic (PLEG): container finished" podID="cbe84605-dd30-4d36-80d8-c9a11f25c186" containerID="c6be803311a69c2d65e995b757337697144af4a7db42f720de3fad2e222b409b" exitCode=0 Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.518381 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cbe84605-dd30-4d36-80d8-c9a11f25c186","Type":"ContainerDied","Data":"c6be803311a69c2d65e995b757337697144af4a7db42f720de3fad2e222b409b"} Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.525111 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a80203e-563e-46d6-a322-df8bdf64d509-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a80203e-563e-46d6-a322-df8bdf64d509" (UID: "4a80203e-563e-46d6-a322-df8bdf64d509"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.533550 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a80203e-563e-46d6-a322-df8bdf64d509-config" (OuterVolumeSpecName: "config") pod "4a80203e-563e-46d6-a322-df8bdf64d509" (UID: "4a80203e-563e-46d6-a322-df8bdf64d509"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.579781 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a80203e-563e-46d6-a322-df8bdf64d509-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.579821 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvpjt\" (UniqueName: \"kubernetes.io/projected/4a80203e-563e-46d6-a322-df8bdf64d509-kube-api-access-dvpjt\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.579840 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a80203e-563e-46d6-a322-df8bdf64d509-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.596146 4747 scope.go:117] "RemoveContainer" containerID="4411755703fc1bef77883af0256cd5f00d37a707b66dafb2f63e2c0f70d1abf3" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.603478 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.681104 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtw55\" (UniqueName: \"kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-kube-api-access-wtw55\") pod \"cbe84605-dd30-4d36-80d8-c9a11f25c186\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.681195 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-config-data\") pod \"cbe84605-dd30-4d36-80d8-c9a11f25c186\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.681369 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\") pod \"cbe84605-dd30-4d36-80d8-c9a11f25c186\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.681395 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-confd\") pod \"cbe84605-dd30-4d36-80d8-c9a11f25c186\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.681438 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-tls\") pod \"cbe84605-dd30-4d36-80d8-c9a11f25c186\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.681468 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cbe84605-dd30-4d36-80d8-c9a11f25c186-erlang-cookie-secret\") pod \"cbe84605-dd30-4d36-80d8-c9a11f25c186\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.681543 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cbe84605-dd30-4d36-80d8-c9a11f25c186-pod-info\") pod \"cbe84605-dd30-4d36-80d8-c9a11f25c186\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.681601 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-plugins-conf\") pod \"cbe84605-dd30-4d36-80d8-c9a11f25c186\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.681624 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-server-conf\") pod \"cbe84605-dd30-4d36-80d8-c9a11f25c186\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.681644 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-plugins\") pod 
\"cbe84605-dd30-4d36-80d8-c9a11f25c186\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.681690 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-erlang-cookie\") pod \"cbe84605-dd30-4d36-80d8-c9a11f25c186\" (UID: \"cbe84605-dd30-4d36-80d8-c9a11f25c186\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.682958 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "cbe84605-dd30-4d36-80d8-c9a11f25c186" (UID: "cbe84605-dd30-4d36-80d8-c9a11f25c186"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.685055 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "cbe84605-dd30-4d36-80d8-c9a11f25c186" (UID: "cbe84605-dd30-4d36-80d8-c9a11f25c186"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.687315 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "cbe84605-dd30-4d36-80d8-c9a11f25c186" (UID: "cbe84605-dd30-4d36-80d8-c9a11f25c186"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.689138 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe84605-dd30-4d36-80d8-c9a11f25c186-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "cbe84605-dd30-4d36-80d8-c9a11f25c186" (UID: "cbe84605-dd30-4d36-80d8-c9a11f25c186"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.689932 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/cbe84605-dd30-4d36-80d8-c9a11f25c186-pod-info" (OuterVolumeSpecName: "pod-info") pod "cbe84605-dd30-4d36-80d8-c9a11f25c186" (UID: "cbe84605-dd30-4d36-80d8-c9a11f25c186"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.692812 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-kube-api-access-wtw55" (OuterVolumeSpecName: "kube-api-access-wtw55") pod "cbe84605-dd30-4d36-80d8-c9a11f25c186" (UID: "cbe84605-dd30-4d36-80d8-c9a11f25c186"). InnerVolumeSpecName "kube-api-access-wtw55". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.698437 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "cbe84605-dd30-4d36-80d8-c9a11f25c186" (UID: "cbe84605-dd30-4d36-80d8-c9a11f25c186"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.704676 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31" (OuterVolumeSpecName: "persistence") pod "cbe84605-dd30-4d36-80d8-c9a11f25c186" (UID: "cbe84605-dd30-4d36-80d8-c9a11f25c186"). InnerVolumeSpecName "pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.705209 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-config-data" (OuterVolumeSpecName: "config-data") pod "cbe84605-dd30-4d36-80d8-c9a11f25c186" (UID: "cbe84605-dd30-4d36-80d8-c9a11f25c186"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.723133 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-server-conf" (OuterVolumeSpecName: "server-conf") pod "cbe84605-dd30-4d36-80d8-c9a11f25c186" (UID: "cbe84605-dd30-4d36-80d8-c9a11f25c186"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.724996 4747 scope.go:117] "RemoveContainer" containerID="257066dd46cebb30590997015920d6146fd61bb0dd5ef138531b28abd41c8b0f" Dec 05 22:05:18 crc kubenswrapper[4747]: E1205 22:05:18.725393 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"257066dd46cebb30590997015920d6146fd61bb0dd5ef138531b28abd41c8b0f\": container with ID starting with 257066dd46cebb30590997015920d6146fd61bb0dd5ef138531b28abd41c8b0f not found: ID does not exist" containerID="257066dd46cebb30590997015920d6146fd61bb0dd5ef138531b28abd41c8b0f" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.725462 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"257066dd46cebb30590997015920d6146fd61bb0dd5ef138531b28abd41c8b0f"} err="failed to get container status \"257066dd46cebb30590997015920d6146fd61bb0dd5ef138531b28abd41c8b0f\": rpc error: code = NotFound desc = could not find container \"257066dd46cebb30590997015920d6146fd61bb0dd5ef138531b28abd41c8b0f\": container with ID starting with 257066dd46cebb30590997015920d6146fd61bb0dd5ef138531b28abd41c8b0f not found: ID does not exist" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.725487 4747 scope.go:117] "RemoveContainer" containerID="4411755703fc1bef77883af0256cd5f00d37a707b66dafb2f63e2c0f70d1abf3" Dec 05 22:05:18 crc kubenswrapper[4747]: E1205 22:05:18.726274 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4411755703fc1bef77883af0256cd5f00d37a707b66dafb2f63e2c0f70d1abf3\": container with ID starting with 4411755703fc1bef77883af0256cd5f00d37a707b66dafb2f63e2c0f70d1abf3 not found: ID does not exist" containerID="4411755703fc1bef77883af0256cd5f00d37a707b66dafb2f63e2c0f70d1abf3" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.726337 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4411755703fc1bef77883af0256cd5f00d37a707b66dafb2f63e2c0f70d1abf3"} err="failed to get container status 
\"4411755703fc1bef77883af0256cd5f00d37a707b66dafb2f63e2c0f70d1abf3\": rpc error: code = NotFound desc = could not find container \"4411755703fc1bef77883af0256cd5f00d37a707b66dafb2f63e2c0f70d1abf3\": container with ID starting with 4411755703fc1bef77883af0256cd5f00d37a707b66dafb2f63e2c0f70d1abf3 not found: ID does not exist" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.777360 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "cbe84605-dd30-4d36-80d8-c9a11f25c186" (UID: "cbe84605-dd30-4d36-80d8-c9a11f25c186"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.785748 4747 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.785798 4747 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.785810 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.785824 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.785838 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtw55\" (UniqueName: \"kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-kube-api-access-wtw55\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.785847 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbe84605-dd30-4d36-80d8-c9a11f25c186-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.785881 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\") on node \"crc\" " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.785891 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.785900 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cbe84605-dd30-4d36-80d8-c9a11f25c186-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.785908 4747 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cbe84605-dd30-4d36-80d8-c9a11f25c186-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:18 crc 
kubenswrapper[4747]: I1205 22:05:18.785915 4747 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cbe84605-dd30-4d36-80d8-c9a11f25c186-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.802698 4747 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.802844 4747 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31") on node "crc" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.865734 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-nrcbk"] Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.871332 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-nrcbk"] Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.887301 4747 reconciler_common.go:293] "Volume detached for volume \"pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.911508 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.988786 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-server-conf\") pod \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.988872 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-confd\") pod \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.988955 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-plugins\") pod \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.989004 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-erlang-cookie-secret\") pod \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.989077 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-config-data\") pod \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.989113 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5zmf\" (UniqueName: 
\"kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-kube-api-access-l5zmf\") pod \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.989163 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-tls\") pod \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.989325 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\") pod \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.989358 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-pod-info\") pod \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.989405 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-erlang-cookie\") pod \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.989439 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-plugins-conf\") pod \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\" (UID: \"f2f8445b-d0bc-4130-8e3d-234a418c3fa5\") " Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.994293 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f2f8445b-d0bc-4130-8e3d-234a418c3fa5" (UID: "f2f8445b-d0bc-4130-8e3d-234a418c3fa5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.995068 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f2f8445b-d0bc-4130-8e3d-234a418c3fa5" (UID: "f2f8445b-d0bc-4130-8e3d-234a418c3fa5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.995177 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f2f8445b-d0bc-4130-8e3d-234a418c3fa5" (UID: "f2f8445b-d0bc-4130-8e3d-234a418c3fa5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:05:18 crc kubenswrapper[4747]: I1205 22:05:18.995227 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f2f8445b-d0bc-4130-8e3d-234a418c3fa5" (UID: "f2f8445b-d0bc-4130-8e3d-234a418c3fa5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.004178 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6" (OuterVolumeSpecName: "persistence") pod "f2f8445b-d0bc-4130-8e3d-234a418c3fa5" (UID: "f2f8445b-d0bc-4130-8e3d-234a418c3fa5"). InnerVolumeSpecName "pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.006738 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-kube-api-access-l5zmf" (OuterVolumeSpecName: "kube-api-access-l5zmf") pod "f2f8445b-d0bc-4130-8e3d-234a418c3fa5" (UID: "f2f8445b-d0bc-4130-8e3d-234a418c3fa5"). InnerVolumeSpecName "kube-api-access-l5zmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.008812 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f2f8445b-d0bc-4130-8e3d-234a418c3fa5" (UID: "f2f8445b-d0bc-4130-8e3d-234a418c3fa5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.009689 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-pod-info" (OuterVolumeSpecName: "pod-info") pod "f2f8445b-d0bc-4130-8e3d-234a418c3fa5" (UID: "f2f8445b-d0bc-4130-8e3d-234a418c3fa5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.012264 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-config-data" (OuterVolumeSpecName: "config-data") pod "f2f8445b-d0bc-4130-8e3d-234a418c3fa5" (UID: "f2f8445b-d0bc-4130-8e3d-234a418c3fa5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.035390 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-server-conf" (OuterVolumeSpecName: "server-conf") pod "f2f8445b-d0bc-4130-8e3d-234a418c3fa5" (UID: "f2f8445b-d0bc-4130-8e3d-234a418c3fa5"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.091740 4747 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.091773 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.091783 4747 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.091792 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.091802 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5zmf\" (UniqueName: \"kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-kube-api-access-l5zmf\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.091809 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.091838 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\") on node \"crc\" " Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.091849 4747 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.091858 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.091866 4747 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.493138 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f2f8445b-d0bc-4130-8e3d-234a418c3fa5" (UID: "f2f8445b-d0bc-4130-8e3d-234a418c3fa5"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.498051 4747 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f2f8445b-d0bc-4130-8e3d-234a418c3fa5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.532142 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cbe84605-dd30-4d36-80d8-c9a11f25c186","Type":"ContainerDied","Data":"7a15d5c97f226e51dca044dfa144d248e5d7f56ee91607ce01a122b1fb908e2f"} Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.532247 4747 scope.go:117] "RemoveContainer" containerID="c6be803311a69c2d65e995b757337697144af4a7db42f720de3fad2e222b409b" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.533012 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.537670 4747 generic.go:334] "Generic (PLEG): container finished" podID="f2f8445b-d0bc-4130-8e3d-234a418c3fa5" containerID="9f99811aa7e0fe0259a6c9713dc37c42e627afaed253e41aaf9c533e8cc0cb89" exitCode=0 Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.537723 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f2f8445b-d0bc-4130-8e3d-234a418c3fa5","Type":"ContainerDied","Data":"9f99811aa7e0fe0259a6c9713dc37c42e627afaed253e41aaf9c533e8cc0cb89"} Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.537755 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f2f8445b-d0bc-4130-8e3d-234a418c3fa5","Type":"ContainerDied","Data":"8ac4800e0be9c9f00aecff668f206609c8a45ccb35c0c54f86f31ed5956b9734"} Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.537762 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.612013 4747 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.612185 4747 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6") on node "crc" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.656743 4747 scope.go:117] "RemoveContainer" containerID="1c7baecd6e66073598df9799197e696fc05bec859704b5f7b6a8a261174be8a7" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.692543 4747 scope.go:117] "RemoveContainer" containerID="9f99811aa7e0fe0259a6c9713dc37c42e627afaed253e41aaf9c533e8cc0cb89" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.708291 4747 reconciler_common.go:293] "Volume detached for volume \"pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\") on node \"crc\" DevicePath \"\"" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.717227 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.717779 4747 scope.go:117] "RemoveContainer" containerID="ae2f16594ec97e7aa8430e3d3bfebe0e1a085d81c5ce2079c6ff75a1f35d042b" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.738320 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.746100 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.750133 4747 scope.go:117] "RemoveContainer" containerID="9f99811aa7e0fe0259a6c9713dc37c42e627afaed253e41aaf9c533e8cc0cb89" Dec 05 22:05:19 crc kubenswrapper[4747]: E1205 22:05:19.752219 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f99811aa7e0fe0259a6c9713dc37c42e627afaed253e41aaf9c533e8cc0cb89\": container with ID starting with 9f99811aa7e0fe0259a6c9713dc37c42e627afaed253e41aaf9c533e8cc0cb89 not found: ID does not exist" containerID="9f99811aa7e0fe0259a6c9713dc37c42e627afaed253e41aaf9c533e8cc0cb89" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.752278 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f99811aa7e0fe0259a6c9713dc37c42e627afaed253e41aaf9c533e8cc0cb89"} err="failed to get container status \"9f99811aa7e0fe0259a6c9713dc37c42e627afaed253e41aaf9c533e8cc0cb89\": rpc error: code = NotFound desc = could not find container \"9f99811aa7e0fe0259a6c9713dc37c42e627afaed253e41aaf9c533e8cc0cb89\": container with ID starting with 9f99811aa7e0fe0259a6c9713dc37c42e627afaed253e41aaf9c533e8cc0cb89 not found: ID does not exist" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.752320 4747 scope.go:117] "RemoveContainer" containerID="ae2f16594ec97e7aa8430e3d3bfebe0e1a085d81c5ce2079c6ff75a1f35d042b" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.754047 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 22:05:19 crc kubenswrapper[4747]: E1205 22:05:19.759437 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2f16594ec97e7aa8430e3d3bfebe0e1a085d81c5ce2079c6ff75a1f35d042b\": container with ID starting with ae2f16594ec97e7aa8430e3d3bfebe0e1a085d81c5ce2079c6ff75a1f35d042b not found: ID 
does not exist" containerID="ae2f16594ec97e7aa8430e3d3bfebe0e1a085d81c5ce2079c6ff75a1f35d042b" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.759498 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2f16594ec97e7aa8430e3d3bfebe0e1a085d81c5ce2079c6ff75a1f35d042b"} err="failed to get container status \"ae2f16594ec97e7aa8430e3d3bfebe0e1a085d81c5ce2079c6ff75a1f35d042b\": rpc error: code = NotFound desc = could not find container \"ae2f16594ec97e7aa8430e3d3bfebe0e1a085d81c5ce2079c6ff75a1f35d042b\": container with ID starting with ae2f16594ec97e7aa8430e3d3bfebe0e1a085d81c5ce2079c6ff75a1f35d042b not found: ID does not exist" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.759604 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 22:05:19 crc kubenswrapper[4747]: E1205 22:05:19.759866 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f8445b-d0bc-4130-8e3d-234a418c3fa5" containerName="setup-container" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.759878 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f8445b-d0bc-4130-8e3d-234a418c3fa5" containerName="setup-container" Dec 05 22:05:19 crc kubenswrapper[4747]: E1205 22:05:19.759886 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f8445b-d0bc-4130-8e3d-234a418c3fa5" containerName="rabbitmq" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.759892 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f8445b-d0bc-4130-8e3d-234a418c3fa5" containerName="rabbitmq" Dec 05 22:05:19 crc kubenswrapper[4747]: E1205 22:05:19.759912 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a80203e-563e-46d6-a322-df8bdf64d509" containerName="dnsmasq-dns" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.759918 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a80203e-563e-46d6-a322-df8bdf64d509" containerName="dnsmasq-dns" Dec 05 22:05:19 crc kubenswrapper[4747]: E1205 22:05:19.759927 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe84605-dd30-4d36-80d8-c9a11f25c186" containerName="rabbitmq" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.759933 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe84605-dd30-4d36-80d8-c9a11f25c186" containerName="rabbitmq" Dec 05 22:05:19 crc kubenswrapper[4747]: E1205 22:05:19.759944 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a80203e-563e-46d6-a322-df8bdf64d509" containerName="init" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.759950 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a80203e-563e-46d6-a322-df8bdf64d509" containerName="init" Dec 05 22:05:19 crc kubenswrapper[4747]: E1205 22:05:19.759959 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe84605-dd30-4d36-80d8-c9a11f25c186" containerName="setup-container" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.759965 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe84605-dd30-4d36-80d8-c9a11f25c186" containerName="setup-container" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.760084 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe84605-dd30-4d36-80d8-c9a11f25c186" containerName="rabbitmq" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.760103 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a80203e-563e-46d6-a322-df8bdf64d509" 
containerName="dnsmasq-dns" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.760117 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f8445b-d0bc-4130-8e3d-234a418c3fa5" containerName="rabbitmq" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.760812 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.768631 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.768719 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.768760 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.768832 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.768839 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.768632 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.769153 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f444s" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.790683 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.803012 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.804505 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.808072 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.808297 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.808456 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.808489 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9tfbg" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.808602 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.808719 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.808881 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.818027 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.847561 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a80203e-563e-46d6-a322-df8bdf64d509" path="/var/lib/kubelet/pods/4a80203e-563e-46d6-a322-df8bdf64d509/volumes" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.848291 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbe84605-dd30-4d36-80d8-c9a11f25c186" path="/var/lib/kubelet/pods/cbe84605-dd30-4d36-80d8-c9a11f25c186/volumes" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.849528 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2f8445b-d0bc-4130-8e3d-234a418c3fa5" path="/var/lib/kubelet/pods/f2f8445b-d0bc-4130-8e3d-234a418c3fa5/volumes" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.910556 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b49058e8-36ba-424d-a6bf-732f2b0545ff-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.910630 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97a2f30d-7404-4264-830e-ef43a223735e-config-data\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.910652 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b49058e8-36ba-424d-a6bf-732f2b0545ff-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.910677 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/97a2f30d-7404-4264-830e-ef43a223735e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.910708 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b49058e8-36ba-424d-a6bf-732f2b0545ff-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.910784 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b49058e8-36ba-424d-a6bf-732f2b0545ff-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.910827 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97a2f30d-7404-4264-830e-ef43a223735e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.910896 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57sbb\" (UniqueName: \"kubernetes.io/projected/97a2f30d-7404-4264-830e-ef43a223735e-kube-api-access-57sbb\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.910944 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.910987 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97a2f30d-7404-4264-830e-ef43a223735e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.911047 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b49058e8-36ba-424d-a6bf-732f2b0545ff-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.911083 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97a2f30d-7404-4264-830e-ef43a223735e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.911150 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.911182 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97a2f30d-7404-4264-830e-ef43a223735e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.911203 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b49058e8-36ba-424d-a6bf-732f2b0545ff-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.911250 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlrtw\" (UniqueName: \"kubernetes.io/projected/b49058e8-36ba-424d-a6bf-732f2b0545ff-kube-api-access-tlrtw\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.911275 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b49058e8-36ba-424d-a6bf-732f2b0545ff-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.911291 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97a2f30d-7404-4264-830e-ef43a223735e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.911306 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97a2f30d-7404-4264-830e-ef43a223735e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.911323 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b49058e8-36ba-424d-a6bf-732f2b0545ff-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.911339 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b49058e8-36ba-424d-a6bf-732f2b0545ff-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:19 crc kubenswrapper[4747]: I1205 22:05:19.911395 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97a2f30d-7404-4264-830e-ef43a223735e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.012744 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b49058e8-36ba-424d-a6bf-732f2b0545ff-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.012883 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97a2f30d-7404-4264-830e-ef43a223735e-config-data\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.012930 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b49058e8-36ba-424d-a6bf-732f2b0545ff-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.012983 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97a2f30d-7404-4264-830e-ef43a223735e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.013047 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b49058e8-36ba-424d-a6bf-732f2b0545ff-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.013108 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b49058e8-36ba-424d-a6bf-732f2b0545ff-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.013159 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97a2f30d-7404-4264-830e-ef43a223735e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.013213 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57sbb\" (UniqueName: \"kubernetes.io/projected/97a2f30d-7404-4264-830e-ef43a223735e-kube-api-access-57sbb\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.013321 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.013710 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97a2f30d-7404-4264-830e-ef43a223735e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.014574 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97a2f30d-7404-4264-830e-ef43a223735e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.014708 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97a2f30d-7404-4264-830e-ef43a223735e-config-data\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.016288 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b49058e8-36ba-424d-a6bf-732f2b0545ff-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.016331 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97a2f30d-7404-4264-830e-ef43a223735e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.016896 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b49058e8-36ba-424d-a6bf-732f2b0545ff-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.016895 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b49058e8-36ba-424d-a6bf-732f2b0545ff-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.016985 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97a2f30d-7404-4264-830e-ef43a223735e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.017044 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 
22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.017084 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97a2f30d-7404-4264-830e-ef43a223735e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.017119 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b49058e8-36ba-424d-a6bf-732f2b0545ff-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.017149 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.017181 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlrtw\" (UniqueName: \"kubernetes.io/projected/b49058e8-36ba-424d-a6bf-732f2b0545ff-kube-api-access-tlrtw\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.017199 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a3474c9400e053623cf942402e2ec88ace496279d93ff7b8788f812481327808/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.017281 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b49058e8-36ba-424d-a6bf-732f2b0545ff-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.017316 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97a2f30d-7404-4264-830e-ef43a223735e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.017355 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97a2f30d-7404-4264-830e-ef43a223735e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.017386 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b49058e8-36ba-424d-a6bf-732f2b0545ff-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.017420 4747 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b49058e8-36ba-424d-a6bf-732f2b0545ff-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.017473 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97a2f30d-7404-4264-830e-ef43a223735e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.018259 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97a2f30d-7404-4264-830e-ef43a223735e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.019020 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b49058e8-36ba-424d-a6bf-732f2b0545ff-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.019173 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97a2f30d-7404-4264-830e-ef43a223735e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.020498 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97a2f30d-7404-4264-830e-ef43a223735e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.020871 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b49058e8-36ba-424d-a6bf-732f2b0545ff-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.021898 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b49058e8-36ba-424d-a6bf-732f2b0545ff-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.021911 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b49058e8-36ba-424d-a6bf-732f2b0545ff-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.023142 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b49058e8-36ba-424d-a6bf-732f2b0545ff-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.026430 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b49058e8-36ba-424d-a6bf-732f2b0545ff-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.027395 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97a2f30d-7404-4264-830e-ef43a223735e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.027804 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97a2f30d-7404-4264-830e-ef43a223735e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.032469 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97a2f30d-7404-4264-830e-ef43a223735e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.035110 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.035179 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e16f1934d34a5a3a7d98064185faecfe11b1c68091895ad6c06ceaca331876de/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.038457 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b49058e8-36ba-424d-a6bf-732f2b0545ff-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.051826 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57sbb\" (UniqueName: \"kubernetes.io/projected/97a2f30d-7404-4264-830e-ef43a223735e-kube-api-access-57sbb\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.066454 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlrtw\" (UniqueName: \"kubernetes.io/projected/b49058e8-36ba-424d-a6bf-732f2b0545ff-kube-api-access-tlrtw\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.074040 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3bbf1c4-7a6f-4676-97be-089d501fc0f6\") pod \"rabbitmq-cell1-server-0\" (UID: \"b49058e8-36ba-424d-a6bf-732f2b0545ff\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.082753 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ea67a3a4-9839-4d29-b6b3-d831c7a9bb31\") pod \"rabbitmq-server-0\" (UID: \"97a2f30d-7404-4264-830e-ef43a223735e\") " pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.091027 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.123181 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.351445 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.406267 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.547217 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"97a2f30d-7404-4264-830e-ef43a223735e","Type":"ContainerStarted","Data":"88bd3273962e513c6f075af56d8e3a04d11dde4aa05067419a60efe086bb26ad"} Dec 05 22:05:20 crc kubenswrapper[4747]: I1205 22:05:20.548270 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b49058e8-36ba-424d-a6bf-732f2b0545ff","Type":"ContainerStarted","Data":"0652375c810cc926e82768482e8be39b02c708188ad223a5305fc9d140ba3327"} Dec 05 22:05:22 crc kubenswrapper[4747]: I1205 22:05:22.565101 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"97a2f30d-7404-4264-830e-ef43a223735e","Type":"ContainerStarted","Data":"b77921fcb835b5618f9f1debe906fc3e690f43aaeeb19fb1d02064e481a087fe"} Dec 05 22:05:22 crc kubenswrapper[4747]: I1205 22:05:22.567028 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b49058e8-36ba-424d-a6bf-732f2b0545ff","Type":"ContainerStarted","Data":"faef9daabe61392bffed22e7e51b7ebc55ad424b5c9e0624539e1427d9b54260"} Dec 05 22:05:36 crc kubenswrapper[4747]: I1205 22:05:36.222269 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:05:36 crc kubenswrapper[4747]: I1205 22:05:36.222817 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:05:36 crc kubenswrapper[4747]: I1205 22:05:36.222866 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 22:05:36 crc 
kubenswrapper[4747]: I1205 22:05:36.223499 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 22:05:36 crc kubenswrapper[4747]: I1205 22:05:36.223555 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" gracePeriod=600 Dec 05 22:05:36 crc kubenswrapper[4747]: E1205 22:05:36.350334 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:05:36 crc kubenswrapper[4747]: I1205 22:05:36.695769 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" exitCode=0 Dec 05 22:05:36 crc kubenswrapper[4747]: I1205 22:05:36.695851 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804"} Dec 05 22:05:36 crc kubenswrapper[4747]: I1205 22:05:36.695909 4747 scope.go:117] "RemoveContainer" containerID="1919542fc50eb5e519160ffa50e5548bea4940791a48a8736c96f1ec5c6138da" Dec 05 22:05:36 crc kubenswrapper[4747]: I1205 22:05:36.696868 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:05:36 crc kubenswrapper[4747]: E1205 22:05:36.697335 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:05:44 crc kubenswrapper[4747]: I1205 22:05:44.077709 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pfrp2"] Dec 05 22:05:44 crc kubenswrapper[4747]: I1205 22:05:44.081320 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:05:44 crc kubenswrapper[4747]: I1205 22:05:44.093807 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pfrp2"] Dec 05 22:05:44 crc kubenswrapper[4747]: I1205 22:05:44.161380 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-catalog-content\") pod \"redhat-operators-pfrp2\" (UID: \"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0\") " pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:05:44 crc kubenswrapper[4747]: I1205 22:05:44.161473 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-utilities\") pod \"redhat-operators-pfrp2\" (UID: \"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0\") " pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:05:44 crc kubenswrapper[4747]: I1205 22:05:44.161820 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knffb\" (UniqueName: \"kubernetes.io/projected/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-kube-api-access-knffb\") pod \"redhat-operators-pfrp2\" (UID: \"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0\") " pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:05:44 crc kubenswrapper[4747]: I1205 22:05:44.263562 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knffb\" (UniqueName: \"kubernetes.io/projected/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-kube-api-access-knffb\") pod \"redhat-operators-pfrp2\" (UID: \"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0\") " pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:05:44 crc kubenswrapper[4747]: I1205 22:05:44.263702 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-catalog-content\") pod \"redhat-operators-pfrp2\" (UID: \"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0\") " pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:05:44 crc kubenswrapper[4747]: I1205 22:05:44.263755 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-utilities\") pod \"redhat-operators-pfrp2\" (UID: \"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0\") " pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:05:44 crc kubenswrapper[4747]: I1205 22:05:44.264223 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-catalog-content\") pod \"redhat-operators-pfrp2\" (UID: \"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0\") " pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:05:44 crc kubenswrapper[4747]: I1205 22:05:44.264316 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-utilities\") pod \"redhat-operators-pfrp2\" (UID: \"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0\") " pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:05:44 crc kubenswrapper[4747]: I1205 22:05:44.290473 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-knffb\" (UniqueName: \"kubernetes.io/projected/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-kube-api-access-knffb\") pod \"redhat-operators-pfrp2\" (UID: \"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0\") " pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:05:44 crc kubenswrapper[4747]: I1205 22:05:44.412876 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:05:44 crc kubenswrapper[4747]: I1205 22:05:44.864046 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pfrp2"] Dec 05 22:05:45 crc kubenswrapper[4747]: I1205 22:05:45.769630 4747 generic.go:334] "Generic (PLEG): container finished" podID="16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" containerID="f361673d0c399c5b32b6598c65f79a6dfcb64220d334e77a771c663eb29de02d" exitCode=0 Dec 05 22:05:45 crc kubenswrapper[4747]: I1205 22:05:45.769688 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfrp2" event={"ID":"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0","Type":"ContainerDied","Data":"f361673d0c399c5b32b6598c65f79a6dfcb64220d334e77a771c663eb29de02d"} Dec 05 22:05:45 crc kubenswrapper[4747]: I1205 22:05:45.769760 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfrp2" event={"ID":"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0","Type":"ContainerStarted","Data":"732ec68fd991997a0f76902142d5132f444f7ea07046ad9d11676eb4d60c4d43"} Dec 05 22:05:46 crc kubenswrapper[4747]: I1205 22:05:46.780648 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfrp2" event={"ID":"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0","Type":"ContainerStarted","Data":"fa33f784b94f1a5341061d155a6c3e70b1721857c437da7bd26e011ffd469bec"} Dec 05 22:05:47 crc kubenswrapper[4747]: I1205 22:05:47.792343 4747 generic.go:334] "Generic (PLEG): container finished" podID="16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" containerID="fa33f784b94f1a5341061d155a6c3e70b1721857c437da7bd26e011ffd469bec" exitCode=0 Dec 05 22:05:47 crc kubenswrapper[4747]: I1205 22:05:47.792433 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfrp2" event={"ID":"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0","Type":"ContainerDied","Data":"fa33f784b94f1a5341061d155a6c3e70b1721857c437da7bd26e011ffd469bec"} Dec 05 22:05:49 crc kubenswrapper[4747]: I1205 22:05:49.816766 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfrp2" event={"ID":"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0","Type":"ContainerStarted","Data":"4fa7ada6720d53ec282b653af116b50ca43deecce3c0f4b4a8754378a27b0cee"} Dec 05 22:05:49 crc kubenswrapper[4747]: I1205 22:05:49.846482 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pfrp2" podStartSLOduration=3.385351626 podStartE2EDuration="5.846457517s" podCreationTimestamp="2025-12-05 22:05:44 +0000 UTC" firstStartedPulling="2025-12-05 22:05:45.772700978 +0000 UTC m=+5016.240008476" lastFinishedPulling="2025-12-05 22:05:48.233806849 +0000 UTC m=+5018.701114367" observedRunningTime="2025-12-05 22:05:49.836292653 +0000 UTC m=+5020.303600181" watchObservedRunningTime="2025-12-05 22:05:49.846457517 +0000 UTC m=+5020.313765045" Dec 05 22:05:50 crc kubenswrapper[4747]: I1205 22:05:50.839931 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 
22:05:50 crc kubenswrapper[4747]: E1205 22:05:50.840202 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:05:54 crc kubenswrapper[4747]: I1205 22:05:54.413736 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:05:54 crc kubenswrapper[4747]: I1205 22:05:54.413990 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:05:55 crc kubenswrapper[4747]: I1205 22:05:55.464293 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pfrp2" podUID="16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" containerName="registry-server" probeResult="failure" output=< Dec 05 22:05:55 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Dec 05 22:05:55 crc kubenswrapper[4747]: > Dec 05 22:05:55 crc kubenswrapper[4747]: I1205 22:05:55.874091 4747 generic.go:334] "Generic (PLEG): container finished" podID="97a2f30d-7404-4264-830e-ef43a223735e" containerID="b77921fcb835b5618f9f1debe906fc3e690f43aaeeb19fb1d02064e481a087fe" exitCode=0 Dec 05 22:05:55 crc kubenswrapper[4747]: I1205 22:05:55.874313 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"97a2f30d-7404-4264-830e-ef43a223735e","Type":"ContainerDied","Data":"b77921fcb835b5618f9f1debe906fc3e690f43aaeeb19fb1d02064e481a087fe"} Dec 05 22:05:55 crc kubenswrapper[4747]: I1205 22:05:55.876719 4747 generic.go:334] "Generic (PLEG): container finished" podID="b49058e8-36ba-424d-a6bf-732f2b0545ff" containerID="faef9daabe61392bffed22e7e51b7ebc55ad424b5c9e0624539e1427d9b54260" exitCode=0 Dec 05 22:05:55 crc kubenswrapper[4747]: I1205 22:05:55.876764 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b49058e8-36ba-424d-a6bf-732f2b0545ff","Type":"ContainerDied","Data":"faef9daabe61392bffed22e7e51b7ebc55ad424b5c9e0624539e1427d9b54260"} Dec 05 22:05:56 crc kubenswrapper[4747]: I1205 22:05:56.887981 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b49058e8-36ba-424d-a6bf-732f2b0545ff","Type":"ContainerStarted","Data":"589c441517b3aaaeb897465877877e67c4780171fbd7ee3d264b09f39f57564a"} Dec 05 22:05:56 crc kubenswrapper[4747]: I1205 22:05:56.888782 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:05:56 crc kubenswrapper[4747]: I1205 22:05:56.892156 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"97a2f30d-7404-4264-830e-ef43a223735e","Type":"ContainerStarted","Data":"dbeb084be369df91a7c0d732e9d5cc769ff26cd887545ca46b2284b007eeff53"} Dec 05 22:05:56 crc kubenswrapper[4747]: I1205 22:05:56.892434 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 22:05:56 crc kubenswrapper[4747]: I1205 22:05:56.919782 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" 
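
Note: the registry-server probe output above, "timeout: failed to connect service \":50051\" within 1s", is a startup probe against the catalog pod's gRPC port. A failing startup probe defers the liveness and readiness probes rather than killing the container outright; the container would only be restarted after exhausting the startup probe's failure threshold, which does not happen here (the later probe="startup" status="started" line shows it passing). A sketch that reproduces the failure mode with a plain TCP dial; using TCP reachability instead of the real gRPC health-check protocol is a simplification to keep the example self-contained:

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    // checkPort mimics the startup probe seen above: try to reach the
    // registry-server port within a 1s budget and report a
    // timeout-style failure otherwise.
    func checkPort(addr string, timeout time.Duration) error {
        conn, err := net.DialTimeout("tcp", addr, timeout)
        if err != nil {
            return fmt.Errorf("timeout: failed to connect service %q within %s", addr, timeout)
        }
        conn.Close()
        return nil
    }

    func main() {
        // ":50051" dials the local system, matching the log output.
        if err := checkPort(":50051", time.Second); err != nil {
            fmt.Println(err)
        }
    }
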
podStartSLOduration=37.919757113 podStartE2EDuration="37.919757113s" podCreationTimestamp="2025-12-05 22:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:05:56.911252271 +0000 UTC m=+5027.378559799" watchObservedRunningTime="2025-12-05 22:05:56.919757113 +0000 UTC m=+5027.387064641" Dec 05 22:05:56 crc kubenswrapper[4747]: I1205 22:05:56.955894 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.955868265 podStartE2EDuration="37.955868265s" podCreationTimestamp="2025-12-05 22:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:05:56.93763164 +0000 UTC m=+5027.404939188" watchObservedRunningTime="2025-12-05 22:05:56.955868265 +0000 UTC m=+5027.423175793" Dec 05 22:06:01 crc kubenswrapper[4747]: I1205 22:06:01.992975 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vn4qv"] Dec 05 22:06:01 crc kubenswrapper[4747]: I1205 22:06:01.998660 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vn4qv"] Dec 05 22:06:01 crc kubenswrapper[4747]: I1205 22:06:01.998769 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:02 crc kubenswrapper[4747]: I1205 22:06:02.077222 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1d00912-4529-40b8-b381-9c5217022cb6-catalog-content\") pod \"certified-operators-vn4qv\" (UID: \"e1d00912-4529-40b8-b381-9c5217022cb6\") " pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:02 crc kubenswrapper[4747]: I1205 22:06:02.077317 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1d00912-4529-40b8-b381-9c5217022cb6-utilities\") pod \"certified-operators-vn4qv\" (UID: \"e1d00912-4529-40b8-b381-9c5217022cb6\") " pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:02 crc kubenswrapper[4747]: I1205 22:06:02.077384 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrvl8\" (UniqueName: \"kubernetes.io/projected/e1d00912-4529-40b8-b381-9c5217022cb6-kube-api-access-lrvl8\") pod \"certified-operators-vn4qv\" (UID: \"e1d00912-4529-40b8-b381-9c5217022cb6\") " pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:02 crc kubenswrapper[4747]: I1205 22:06:02.178757 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1d00912-4529-40b8-b381-9c5217022cb6-catalog-content\") pod \"certified-operators-vn4qv\" (UID: \"e1d00912-4529-40b8-b381-9c5217022cb6\") " pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:02 crc kubenswrapper[4747]: I1205 22:06:02.178831 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1d00912-4529-40b8-b381-9c5217022cb6-utilities\") pod \"certified-operators-vn4qv\" (UID: \"e1d00912-4529-40b8-b381-9c5217022cb6\") " pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:02 crc 
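
Note: in the "Observed pod startup duration" records, podStartE2EDuration is the wall-clock interval from pod creation to the first observed running state, and podStartSLOduration is the same interval minus image-pull time. The m=+NNNN suffixes are monotonic-clock offsets since kubelet start, which make that subtraction safe across wall-clock adjustments. The redhat-operators-pfrp2 entry earlier in the log checks out exactly:

    package main

    import "fmt"

    func main() {
        // Monotonic offsets (the "m=+..." suffixes) from the
        // redhat-operators-pfrp2 startup-latency record above.
        firstStartedPulling := 5016.240008476
        lastFinishedPulling := 5018.701114367
        e2e := 5.846457517 // podStartE2EDuration, in seconds

        pull := lastFinishedPulling - firstStartedPulling
        slo := e2e - pull
        fmt.Printf("image pull: %.9fs, SLO duration: %.9fs\n", pull, slo)
        // Prints 2.461105891s and 3.385351626s, matching the logged
        // podStartSLOduration. For the rabbitmq pods the pull
        // timestamps are the zero sentinel ("0001-01-01"), so the SLO
        // and E2E durations are equal.
    }
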
kubenswrapper[4747]: I1205 22:06:02.178865 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrvl8\" (UniqueName: \"kubernetes.io/projected/e1d00912-4529-40b8-b381-9c5217022cb6-kube-api-access-lrvl8\") pod \"certified-operators-vn4qv\" (UID: \"e1d00912-4529-40b8-b381-9c5217022cb6\") " pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:02 crc kubenswrapper[4747]: I1205 22:06:02.179311 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1d00912-4529-40b8-b381-9c5217022cb6-catalog-content\") pod \"certified-operators-vn4qv\" (UID: \"e1d00912-4529-40b8-b381-9c5217022cb6\") " pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:02 crc kubenswrapper[4747]: I1205 22:06:02.179416 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1d00912-4529-40b8-b381-9c5217022cb6-utilities\") pod \"certified-operators-vn4qv\" (UID: \"e1d00912-4529-40b8-b381-9c5217022cb6\") " pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:02 crc kubenswrapper[4747]: I1205 22:06:02.197882 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrvl8\" (UniqueName: \"kubernetes.io/projected/e1d00912-4529-40b8-b381-9c5217022cb6-kube-api-access-lrvl8\") pod \"certified-operators-vn4qv\" (UID: \"e1d00912-4529-40b8-b381-9c5217022cb6\") " pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:02 crc kubenswrapper[4747]: I1205 22:06:02.381358 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:02 crc kubenswrapper[4747]: I1205 22:06:02.818498 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vn4qv"] Dec 05 22:06:02 crc kubenswrapper[4747]: I1205 22:06:02.839707 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:06:02 crc kubenswrapper[4747]: E1205 22:06:02.839951 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:06:02 crc kubenswrapper[4747]: I1205 22:06:02.959359 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vn4qv" event={"ID":"e1d00912-4529-40b8-b381-9c5217022cb6","Type":"ContainerStarted","Data":"fa4c0d269203a142e471dda2bdcd309ad3a92a89d32453e7fd52bc9028447f83"} Dec 05 22:06:03 crc kubenswrapper[4747]: I1205 22:06:03.968479 4747 generic.go:334] "Generic (PLEG): container finished" podID="e1d00912-4529-40b8-b381-9c5217022cb6" containerID="851fc0dee7064ef7ac82b9ab59892c25c372103ab8ac9a80f8493994a244de8d" exitCode=0 Dec 05 22:06:03 crc kubenswrapper[4747]: I1205 22:06:03.968537 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vn4qv" event={"ID":"e1d00912-4529-40b8-b381-9c5217022cb6","Type":"ContainerDied","Data":"851fc0dee7064ef7ac82b9ab59892c25c372103ab8ac9a80f8493994a244de8d"} Dec 05 22:06:04 crc kubenswrapper[4747]: I1205 
22:06:04.465007 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:06:04 crc kubenswrapper[4747]: I1205 22:06:04.533838 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:06:04 crc kubenswrapper[4747]: I1205 22:06:04.981229 4747 generic.go:334] "Generic (PLEG): container finished" podID="e1d00912-4529-40b8-b381-9c5217022cb6" containerID="72259c19259a836d748dd56b017312a64adc7971e34eaa045922cde2e2686f98" exitCode=0 Dec 05 22:06:04 crc kubenswrapper[4747]: I1205 22:06:04.981331 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vn4qv" event={"ID":"e1d00912-4529-40b8-b381-9c5217022cb6","Type":"ContainerDied","Data":"72259c19259a836d748dd56b017312a64adc7971e34eaa045922cde2e2686f98"} Dec 05 22:06:06 crc kubenswrapper[4747]: I1205 22:06:06.016000 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vn4qv" event={"ID":"e1d00912-4529-40b8-b381-9c5217022cb6","Type":"ContainerStarted","Data":"9e0d6752d85a4461bc4a975c028f6f9935ed186f3fa40c6cc179ae0633fdc5c4"} Dec 05 22:06:06 crc kubenswrapper[4747]: I1205 22:06:06.054901 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vn4qv" podStartSLOduration=3.283022662 podStartE2EDuration="5.054870607s" podCreationTimestamp="2025-12-05 22:06:01 +0000 UTC" firstStartedPulling="2025-12-05 22:06:03.970778423 +0000 UTC m=+5034.438085911" lastFinishedPulling="2025-12-05 22:06:05.742626328 +0000 UTC m=+5036.209933856" observedRunningTime="2025-12-05 22:06:06.042714113 +0000 UTC m=+5036.510021631" watchObservedRunningTime="2025-12-05 22:06:06.054870607 +0000 UTC m=+5036.522178135" Dec 05 22:06:06 crc kubenswrapper[4747]: I1205 22:06:06.743098 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pfrp2"] Dec 05 22:06:06 crc kubenswrapper[4747]: I1205 22:06:06.743453 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pfrp2" podUID="16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" containerName="registry-server" containerID="cri-o://4fa7ada6720d53ec282b653af116b50ca43deecce3c0f4b4a8754378a27b0cee" gracePeriod=2 Dec 05 22:06:07 crc kubenswrapper[4747]: I1205 22:06:07.029191 4747 generic.go:334] "Generic (PLEG): container finished" podID="16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" containerID="4fa7ada6720d53ec282b653af116b50ca43deecce3c0f4b4a8754378a27b0cee" exitCode=0 Dec 05 22:06:07 crc kubenswrapper[4747]: I1205 22:06:07.029572 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfrp2" event={"ID":"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0","Type":"ContainerDied","Data":"4fa7ada6720d53ec282b653af116b50ca43deecce3c0f4b4a8754378a27b0cee"} Dec 05 22:06:07 crc kubenswrapper[4747]: I1205 22:06:07.164138 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:06:07 crc kubenswrapper[4747]: I1205 22:06:07.364054 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-utilities\") pod \"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0\" (UID: \"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0\") " Dec 05 22:06:07 crc kubenswrapper[4747]: I1205 22:06:07.364194 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knffb\" (UniqueName: \"kubernetes.io/projected/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-kube-api-access-knffb\") pod \"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0\" (UID: \"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0\") " Dec 05 22:06:07 crc kubenswrapper[4747]: I1205 22:06:07.364386 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-catalog-content\") pod \"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0\" (UID: \"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0\") " Dec 05 22:06:07 crc kubenswrapper[4747]: I1205 22:06:07.365018 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-utilities" (OuterVolumeSpecName: "utilities") pod "16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" (UID: "16ef60a1-65cc-4e1d-8a63-4467ffaef3c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:06:07 crc kubenswrapper[4747]: I1205 22:06:07.376833 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-kube-api-access-knffb" (OuterVolumeSpecName: "kube-api-access-knffb") pod "16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" (UID: "16ef60a1-65cc-4e1d-8a63-4467ffaef3c0"). InnerVolumeSpecName "kube-api-access-knffb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:06:07 crc kubenswrapper[4747]: I1205 22:06:07.467100 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:07 crc kubenswrapper[4747]: I1205 22:06:07.467226 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knffb\" (UniqueName: \"kubernetes.io/projected/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-kube-api-access-knffb\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:07 crc kubenswrapper[4747]: I1205 22:06:07.476339 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" (UID: "16ef60a1-65cc-4e1d-8a63-4467ffaef3c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:06:07 crc kubenswrapper[4747]: I1205 22:06:07.568192 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:08 crc kubenswrapper[4747]: I1205 22:06:08.043272 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfrp2" event={"ID":"16ef60a1-65cc-4e1d-8a63-4467ffaef3c0","Type":"ContainerDied","Data":"732ec68fd991997a0f76902142d5132f444f7ea07046ad9d11676eb4d60c4d43"} Dec 05 22:06:08 crc kubenswrapper[4747]: I1205 22:06:08.043340 4747 scope.go:117] "RemoveContainer" containerID="4fa7ada6720d53ec282b653af116b50ca43deecce3c0f4b4a8754378a27b0cee" Dec 05 22:06:08 crc kubenswrapper[4747]: I1205 22:06:08.043341 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pfrp2" Dec 05 22:06:08 crc kubenswrapper[4747]: I1205 22:06:08.076874 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pfrp2"] Dec 05 22:06:08 crc kubenswrapper[4747]: I1205 22:06:08.081351 4747 scope.go:117] "RemoveContainer" containerID="fa33f784b94f1a5341061d155a6c3e70b1721857c437da7bd26e011ffd469bec" Dec 05 22:06:08 crc kubenswrapper[4747]: I1205 22:06:08.084944 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pfrp2"] Dec 05 22:06:08 crc kubenswrapper[4747]: I1205 22:06:08.111268 4747 scope.go:117] "RemoveContainer" containerID="f361673d0c399c5b32b6598c65f79a6dfcb64220d334e77a771c663eb29de02d" Dec 05 22:06:09 crc kubenswrapper[4747]: I1205 22:06:09.855966 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" path="/var/lib/kubelet/pods/16ef60a1-65cc-4e1d-8a63-4467ffaef3c0/volumes" Dec 05 22:06:10 crc kubenswrapper[4747]: I1205 22:06:10.094814 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 22:06:10 crc kubenswrapper[4747]: I1205 22:06:10.128819 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 22:06:12 crc kubenswrapper[4747]: I1205 22:06:12.381759 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:12 crc kubenswrapper[4747]: I1205 22:06:12.382352 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:12 crc kubenswrapper[4747]: I1205 22:06:12.463249 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:13 crc kubenswrapper[4747]: I1205 22:06:13.177069 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:13 crc kubenswrapper[4747]: I1205 22:06:13.234014 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vn4qv"] Dec 05 22:06:13 crc kubenswrapper[4747]: I1205 22:06:13.574758 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Dec 05 22:06:13 crc kubenswrapper[4747]: E1205 22:06:13.575986 4747 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" containerName="extract-utilities" Dec 05 22:06:13 crc kubenswrapper[4747]: I1205 22:06:13.576100 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" containerName="extract-utilities" Dec 05 22:06:13 crc kubenswrapper[4747]: E1205 22:06:13.576160 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" containerName="extract-content" Dec 05 22:06:13 crc kubenswrapper[4747]: I1205 22:06:13.576178 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" containerName="extract-content" Dec 05 22:06:13 crc kubenswrapper[4747]: E1205 22:06:13.576210 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" containerName="registry-server" Dec 05 22:06:13 crc kubenswrapper[4747]: I1205 22:06:13.576228 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" containerName="registry-server" Dec 05 22:06:13 crc kubenswrapper[4747]: I1205 22:06:13.578913 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ef60a1-65cc-4e1d-8a63-4467ffaef3c0" containerName="registry-server" Dec 05 22:06:13 crc kubenswrapper[4747]: I1205 22:06:13.579843 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 05 22:06:13 crc kubenswrapper[4747]: I1205 22:06:13.589277 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-7jmpv" Dec 05 22:06:13 crc kubenswrapper[4747]: I1205 22:06:13.593749 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 05 22:06:13 crc kubenswrapper[4747]: I1205 22:06:13.771811 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhf87\" (UniqueName: \"kubernetes.io/projected/42143c8e-db0b-47b6-b569-fd21f299eb45-kube-api-access-dhf87\") pod \"mariadb-client-1-default\" (UID: \"42143c8e-db0b-47b6-b569-fd21f299eb45\") " pod="openstack/mariadb-client-1-default" Dec 05 22:06:13 crc kubenswrapper[4747]: I1205 22:06:13.873757 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhf87\" (UniqueName: \"kubernetes.io/projected/42143c8e-db0b-47b6-b569-fd21f299eb45-kube-api-access-dhf87\") pod \"mariadb-client-1-default\" (UID: \"42143c8e-db0b-47b6-b569-fd21f299eb45\") " pod="openstack/mariadb-client-1-default" Dec 05 22:06:13 crc kubenswrapper[4747]: I1205 22:06:13.908852 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhf87\" (UniqueName: \"kubernetes.io/projected/42143c8e-db0b-47b6-b569-fd21f299eb45-kube-api-access-dhf87\") pod \"mariadb-client-1-default\" (UID: \"42143c8e-db0b-47b6-b569-fd21f299eb45\") " pod="openstack/mariadb-client-1-default" Dec 05 22:06:13 crc kubenswrapper[4747]: I1205 22:06:13.912518 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 05 22:06:14 crc kubenswrapper[4747]: W1205 22:06:14.464270 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42143c8e_db0b_47b6_b569_fd21f299eb45.slice/crio-8fd2ff8641e864cd9b1140126c9f9f5941548830d30b2368da3f2866d2bbba20 WatchSource:0}: Error finding container 8fd2ff8641e864cd9b1140126c9f9f5941548830d30b2368da3f2866d2bbba20: Status 404 returned error can't find the container with id 8fd2ff8641e864cd9b1140126c9f9f5941548830d30b2368da3f2866d2bbba20 Dec 05 22:06:14 crc kubenswrapper[4747]: I1205 22:06:14.472949 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 05 22:06:15 crc kubenswrapper[4747]: I1205 22:06:15.116400 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"42143c8e-db0b-47b6-b569-fd21f299eb45","Type":"ContainerStarted","Data":"8fd2ff8641e864cd9b1140126c9f9f5941548830d30b2368da3f2866d2bbba20"} Dec 05 22:06:15 crc kubenswrapper[4747]: I1205 22:06:15.116732 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vn4qv" podUID="e1d00912-4529-40b8-b381-9c5217022cb6" containerName="registry-server" containerID="cri-o://9e0d6752d85a4461bc4a975c028f6f9935ed186f3fa40c6cc179ae0633fdc5c4" gracePeriod=2 Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.057552 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.117745 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2jdz2"] Dec 05 22:06:16 crc kubenswrapper[4747]: E1205 22:06:16.118141 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d00912-4529-40b8-b381-9c5217022cb6" containerName="extract-content" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.118159 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d00912-4529-40b8-b381-9c5217022cb6" containerName="extract-content" Dec 05 22:06:16 crc kubenswrapper[4747]: E1205 22:06:16.118189 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d00912-4529-40b8-b381-9c5217022cb6" containerName="extract-utilities" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.118198 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d00912-4529-40b8-b381-9c5217022cb6" containerName="extract-utilities" Dec 05 22:06:16 crc kubenswrapper[4747]: E1205 22:06:16.118215 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d00912-4529-40b8-b381-9c5217022cb6" containerName="registry-server" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.118223 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d00912-4529-40b8-b381-9c5217022cb6" containerName="registry-server" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.118416 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1d00912-4529-40b8-b381-9c5217022cb6" containerName="registry-server" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.119873 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.131783 4747 generic.go:334] "Generic (PLEG): container finished" podID="e1d00912-4529-40b8-b381-9c5217022cb6" containerID="9e0d6752d85a4461bc4a975c028f6f9935ed186f3fa40c6cc179ae0633fdc5c4" exitCode=0 Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.131848 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vn4qv" event={"ID":"e1d00912-4529-40b8-b381-9c5217022cb6","Type":"ContainerDied","Data":"9e0d6752d85a4461bc4a975c028f6f9935ed186f3fa40c6cc179ae0633fdc5c4"} Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.131875 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vn4qv" event={"ID":"e1d00912-4529-40b8-b381-9c5217022cb6","Type":"ContainerDied","Data":"fa4c0d269203a142e471dda2bdcd309ad3a92a89d32453e7fd52bc9028447f83"} Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.131891 4747 scope.go:117] "RemoveContainer" containerID="9e0d6752d85a4461bc4a975c028f6f9935ed186f3fa40c6cc179ae0633fdc5c4" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.131895 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vn4qv" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.134842 4747 generic.go:334] "Generic (PLEG): container finished" podID="42143c8e-db0b-47b6-b569-fd21f299eb45" containerID="111f248c4df7d8e02b4c376b05000359aeba10bb8c14fa99227e2270da85e0d8" exitCode=0 Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.134921 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"42143c8e-db0b-47b6-b569-fd21f299eb45","Type":"ContainerDied","Data":"111f248c4df7d8e02b4c376b05000359aeba10bb8c14fa99227e2270da85e0d8"} Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.155269 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jdz2"] Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.171394 4747 scope.go:117] "RemoveContainer" containerID="72259c19259a836d748dd56b017312a64adc7971e34eaa045922cde2e2686f98" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.196475 4747 scope.go:117] "RemoveContainer" containerID="851fc0dee7064ef7ac82b9ab59892c25c372103ab8ac9a80f8493994a244de8d" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.214659 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1d00912-4529-40b8-b381-9c5217022cb6-catalog-content\") pod \"e1d00912-4529-40b8-b381-9c5217022cb6\" (UID: \"e1d00912-4529-40b8-b381-9c5217022cb6\") " Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.214788 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1d00912-4529-40b8-b381-9c5217022cb6-utilities\") pod \"e1d00912-4529-40b8-b381-9c5217022cb6\" (UID: \"e1d00912-4529-40b8-b381-9c5217022cb6\") " Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.214826 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrvl8\" (UniqueName: \"kubernetes.io/projected/e1d00912-4529-40b8-b381-9c5217022cb6-kube-api-access-lrvl8\") pod \"e1d00912-4529-40b8-b381-9c5217022cb6\" (UID: \"e1d00912-4529-40b8-b381-9c5217022cb6\") " Dec 05 22:06:16 crc 
kubenswrapper[4747]: I1205 22:06:16.215208 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-utilities\") pod \"redhat-marketplace-2jdz2\" (UID: \"c1ed4a4a-555d-44ee-be75-b451f4c3cf42\") " pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.215282 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-catalog-content\") pod \"redhat-marketplace-2jdz2\" (UID: \"c1ed4a4a-555d-44ee-be75-b451f4c3cf42\") " pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.215391 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fctk8\" (UniqueName: \"kubernetes.io/projected/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-kube-api-access-fctk8\") pod \"redhat-marketplace-2jdz2\" (UID: \"c1ed4a4a-555d-44ee-be75-b451f4c3cf42\") " pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.215790 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1d00912-4529-40b8-b381-9c5217022cb6-utilities" (OuterVolumeSpecName: "utilities") pod "e1d00912-4529-40b8-b381-9c5217022cb6" (UID: "e1d00912-4529-40b8-b381-9c5217022cb6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.223828 4747 scope.go:117] "RemoveContainer" containerID="9e0d6752d85a4461bc4a975c028f6f9935ed186f3fa40c6cc179ae0633fdc5c4" Dec 05 22:06:16 crc kubenswrapper[4747]: E1205 22:06:16.224235 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e0d6752d85a4461bc4a975c028f6f9935ed186f3fa40c6cc179ae0633fdc5c4\": container with ID starting with 9e0d6752d85a4461bc4a975c028f6f9935ed186f3fa40c6cc179ae0633fdc5c4 not found: ID does not exist" containerID="9e0d6752d85a4461bc4a975c028f6f9935ed186f3fa40c6cc179ae0633fdc5c4" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.224281 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0d6752d85a4461bc4a975c028f6f9935ed186f3fa40c6cc179ae0633fdc5c4"} err="failed to get container status \"9e0d6752d85a4461bc4a975c028f6f9935ed186f3fa40c6cc179ae0633fdc5c4\": rpc error: code = NotFound desc = could not find container \"9e0d6752d85a4461bc4a975c028f6f9935ed186f3fa40c6cc179ae0633fdc5c4\": container with ID starting with 9e0d6752d85a4461bc4a975c028f6f9935ed186f3fa40c6cc179ae0633fdc5c4 not found: ID does not exist" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.224305 4747 scope.go:117] "RemoveContainer" containerID="72259c19259a836d748dd56b017312a64adc7971e34eaa045922cde2e2686f98" Dec 05 22:06:16 crc kubenswrapper[4747]: E1205 22:06:16.224595 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72259c19259a836d748dd56b017312a64adc7971e34eaa045922cde2e2686f98\": container with ID starting with 72259c19259a836d748dd56b017312a64adc7971e34eaa045922cde2e2686f98 not found: ID does not exist" containerID="72259c19259a836d748dd56b017312a64adc7971e34eaa045922cde2e2686f98" Dec 05 22:06:16 crc 
kubenswrapper[4747]: I1205 22:06:16.224612 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72259c19259a836d748dd56b017312a64adc7971e34eaa045922cde2e2686f98"} err="failed to get container status \"72259c19259a836d748dd56b017312a64adc7971e34eaa045922cde2e2686f98\": rpc error: code = NotFound desc = could not find container \"72259c19259a836d748dd56b017312a64adc7971e34eaa045922cde2e2686f98\": container with ID starting with 72259c19259a836d748dd56b017312a64adc7971e34eaa045922cde2e2686f98 not found: ID does not exist" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.224625 4747 scope.go:117] "RemoveContainer" containerID="851fc0dee7064ef7ac82b9ab59892c25c372103ab8ac9a80f8493994a244de8d" Dec 05 22:06:16 crc kubenswrapper[4747]: E1205 22:06:16.224876 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"851fc0dee7064ef7ac82b9ab59892c25c372103ab8ac9a80f8493994a244de8d\": container with ID starting with 851fc0dee7064ef7ac82b9ab59892c25c372103ab8ac9a80f8493994a244de8d not found: ID does not exist" containerID="851fc0dee7064ef7ac82b9ab59892c25c372103ab8ac9a80f8493994a244de8d" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.224909 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"851fc0dee7064ef7ac82b9ab59892c25c372103ab8ac9a80f8493994a244de8d"} err="failed to get container status \"851fc0dee7064ef7ac82b9ab59892c25c372103ab8ac9a80f8493994a244de8d\": rpc error: code = NotFound desc = could not find container \"851fc0dee7064ef7ac82b9ab59892c25c372103ab8ac9a80f8493994a244de8d\": container with ID starting with 851fc0dee7064ef7ac82b9ab59892c25c372103ab8ac9a80f8493994a244de8d not found: ID does not exist" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.225244 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1d00912-4529-40b8-b381-9c5217022cb6-kube-api-access-lrvl8" (OuterVolumeSpecName: "kube-api-access-lrvl8") pod "e1d00912-4529-40b8-b381-9c5217022cb6" (UID: "e1d00912-4529-40b8-b381-9c5217022cb6"). InnerVolumeSpecName "kube-api-access-lrvl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.278527 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1d00912-4529-40b8-b381-9c5217022cb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1d00912-4529-40b8-b381-9c5217022cb6" (UID: "e1d00912-4529-40b8-b381-9c5217022cb6"). InnerVolumeSpecName "catalog-content". 
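
Note: the "ContainerStatus from runtime service failed ... NotFound" errors during RemoveContainer are another benign race: the container had already been deleted, so the CRI status lookup that precedes deletion finds nothing, and the kubelet logs the error and moves on. The "rpc error: code = NotFound" text indicates a gRPC status code; assuming the CRI client surfaces those codes unchanged, the idiomatic way to treat this case as success is:

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // alreadyGone reports whether a CRI error is the benign
    // delete/inspect race logged above: the container vanished between
    // the listing and the ContainerStatus call.
    func alreadyGone(err error) bool {
        return status.Code(err) == codes.NotFound
    }

    func main() {
        err := status.Error(codes.NotFound, "could not find container")
        if alreadyGone(err) {
            fmt.Println("already deleted; treat removal as done")
        }
    }
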
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.317084 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-utilities\") pod \"redhat-marketplace-2jdz2\" (UID: \"c1ed4a4a-555d-44ee-be75-b451f4c3cf42\") " pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.317242 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-catalog-content\") pod \"redhat-marketplace-2jdz2\" (UID: \"c1ed4a4a-555d-44ee-be75-b451f4c3cf42\") " pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.317369 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fctk8\" (UniqueName: \"kubernetes.io/projected/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-kube-api-access-fctk8\") pod \"redhat-marketplace-2jdz2\" (UID: \"c1ed4a4a-555d-44ee-be75-b451f4c3cf42\") " pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.317472 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1d00912-4529-40b8-b381-9c5217022cb6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.317530 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1d00912-4529-40b8-b381-9c5217022cb6-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.317548 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrvl8\" (UniqueName: \"kubernetes.io/projected/e1d00912-4529-40b8-b381-9c5217022cb6-kube-api-access-lrvl8\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.317696 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-utilities\") pod \"redhat-marketplace-2jdz2\" (UID: \"c1ed4a4a-555d-44ee-be75-b451f4c3cf42\") " pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.317802 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-catalog-content\") pod \"redhat-marketplace-2jdz2\" (UID: \"c1ed4a4a-555d-44ee-be75-b451f4c3cf42\") " pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.333601 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fctk8\" (UniqueName: \"kubernetes.io/projected/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-kube-api-access-fctk8\") pod \"redhat-marketplace-2jdz2\" (UID: \"c1ed4a4a-555d-44ee-be75-b451f4c3cf42\") " pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.438117 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.468942 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vn4qv"] Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.475333 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vn4qv"] Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.840205 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:06:16 crc kubenswrapper[4747]: E1205 22:06:16.840744 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:06:16 crc kubenswrapper[4747]: I1205 22:06:16.890058 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jdz2"] Dec 05 22:06:16 crc kubenswrapper[4747]: W1205 22:06:16.900542 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1ed4a4a_555d_44ee_be75_b451f4c3cf42.slice/crio-5328a72af8084792065650c992df5d799b9ffd7c91de38a44d13fd48538c4045 WatchSource:0}: Error finding container 5328a72af8084792065650c992df5d799b9ffd7c91de38a44d13fd48538c4045: Status 404 returned error can't find the container with id 5328a72af8084792065650c992df5d799b9ffd7c91de38a44d13fd48538c4045 Dec 05 22:06:17 crc kubenswrapper[4747]: I1205 22:06:17.143654 4747 generic.go:334] "Generic (PLEG): container finished" podID="c1ed4a4a-555d-44ee-be75-b451f4c3cf42" containerID="99a8cb4a39139dcbabbcbaa28c880ed3f6d39b7dafab6f27bda285b6474b4b6c" exitCode=0 Dec 05 22:06:17 crc kubenswrapper[4747]: I1205 22:06:17.143733 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jdz2" event={"ID":"c1ed4a4a-555d-44ee-be75-b451f4c3cf42","Type":"ContainerDied","Data":"99a8cb4a39139dcbabbcbaa28c880ed3f6d39b7dafab6f27bda285b6474b4b6c"} Dec 05 22:06:17 crc kubenswrapper[4747]: I1205 22:06:17.143760 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jdz2" event={"ID":"c1ed4a4a-555d-44ee-be75-b451f4c3cf42","Type":"ContainerStarted","Data":"5328a72af8084792065650c992df5d799b9ffd7c91de38a44d13fd48538c4045"} Dec 05 22:06:17 crc kubenswrapper[4747]: I1205 22:06:17.619449 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 05 22:06:17 crc kubenswrapper[4747]: I1205 22:06:17.646028 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_42143c8e-db0b-47b6-b569-fd21f299eb45/mariadb-client-1-default/0.log" Dec 05 22:06:17 crc kubenswrapper[4747]: I1205 22:06:17.681683 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 05 22:06:17 crc kubenswrapper[4747]: I1205 22:06:17.691443 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 05 22:06:17 crc kubenswrapper[4747]: I1205 22:06:17.740129 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhf87\" (UniqueName: \"kubernetes.io/projected/42143c8e-db0b-47b6-b569-fd21f299eb45-kube-api-access-dhf87\") pod \"42143c8e-db0b-47b6-b569-fd21f299eb45\" (UID: \"42143c8e-db0b-47b6-b569-fd21f299eb45\") " Dec 05 22:06:17 crc kubenswrapper[4747]: I1205 22:06:17.749274 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42143c8e-db0b-47b6-b569-fd21f299eb45-kube-api-access-dhf87" (OuterVolumeSpecName: "kube-api-access-dhf87") pod "42143c8e-db0b-47b6-b569-fd21f299eb45" (UID: "42143c8e-db0b-47b6-b569-fd21f299eb45"). InnerVolumeSpecName "kube-api-access-dhf87". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:06:17 crc kubenswrapper[4747]: I1205 22:06:17.842773 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhf87\" (UniqueName: \"kubernetes.io/projected/42143c8e-db0b-47b6-b569-fd21f299eb45-kube-api-access-dhf87\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:17 crc kubenswrapper[4747]: I1205 22:06:17.863862 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42143c8e-db0b-47b6-b569-fd21f299eb45" path="/var/lib/kubelet/pods/42143c8e-db0b-47b6-b569-fd21f299eb45/volumes" Dec 05 22:06:17 crc kubenswrapper[4747]: I1205 22:06:17.864929 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1d00912-4529-40b8-b381-9c5217022cb6" path="/var/lib/kubelet/pods/e1d00912-4529-40b8-b381-9c5217022cb6/volumes" Dec 05 22:06:18 crc kubenswrapper[4747]: I1205 22:06:18.091386 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Dec 05 22:06:18 crc kubenswrapper[4747]: E1205 22:06:18.091841 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42143c8e-db0b-47b6-b569-fd21f299eb45" containerName="mariadb-client-1-default" Dec 05 22:06:18 crc kubenswrapper[4747]: I1205 22:06:18.091869 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="42143c8e-db0b-47b6-b569-fd21f299eb45" containerName="mariadb-client-1-default" Dec 05 22:06:18 crc kubenswrapper[4747]: I1205 22:06:18.092117 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="42143c8e-db0b-47b6-b569-fd21f299eb45" containerName="mariadb-client-1-default" Dec 05 22:06:18 crc kubenswrapper[4747]: I1205 22:06:18.092719 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 05 22:06:18 crc kubenswrapper[4747]: I1205 22:06:18.111207 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 05 22:06:18 crc kubenswrapper[4747]: I1205 22:06:18.171929 4747 generic.go:334] "Generic (PLEG): container finished" podID="c1ed4a4a-555d-44ee-be75-b451f4c3cf42" containerID="ec0a193095e00e30cc15c66ba9ff41a66f13b7e381a94432912efb0eee2e5ca9" exitCode=0 Dec 05 22:06:18 crc kubenswrapper[4747]: I1205 22:06:18.172065 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jdz2" event={"ID":"c1ed4a4a-555d-44ee-be75-b451f4c3cf42","Type":"ContainerDied","Data":"ec0a193095e00e30cc15c66ba9ff41a66f13b7e381a94432912efb0eee2e5ca9"} Dec 05 22:06:18 crc kubenswrapper[4747]: I1205 22:06:18.180328 4747 scope.go:117] "RemoveContainer" containerID="111f248c4df7d8e02b4c376b05000359aeba10bb8c14fa99227e2270da85e0d8" Dec 05 22:06:18 crc kubenswrapper[4747]: I1205 22:06:18.180458 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 05 22:06:18 crc kubenswrapper[4747]: I1205 22:06:18.248479 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrvvb\" (UniqueName: \"kubernetes.io/projected/a5a5b192-9b58-441e-95ef-92f19df82414-kube-api-access-jrvvb\") pod \"mariadb-client-2-default\" (UID: \"a5a5b192-9b58-441e-95ef-92f19df82414\") " pod="openstack/mariadb-client-2-default" Dec 05 22:06:18 crc kubenswrapper[4747]: I1205 22:06:18.351413 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrvvb\" (UniqueName: \"kubernetes.io/projected/a5a5b192-9b58-441e-95ef-92f19df82414-kube-api-access-jrvvb\") pod \"mariadb-client-2-default\" (UID: \"a5a5b192-9b58-441e-95ef-92f19df82414\") " pod="openstack/mariadb-client-2-default" Dec 05 22:06:18 crc kubenswrapper[4747]: I1205 22:06:18.387294 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrvvb\" (UniqueName: \"kubernetes.io/projected/a5a5b192-9b58-441e-95ef-92f19df82414-kube-api-access-jrvvb\") pod \"mariadb-client-2-default\" (UID: \"a5a5b192-9b58-441e-95ef-92f19df82414\") " pod="openstack/mariadb-client-2-default" Dec 05 22:06:18 crc kubenswrapper[4747]: I1205 22:06:18.418834 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 05 22:06:18 crc kubenswrapper[4747]: I1205 22:06:18.769159 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 05 22:06:19 crc kubenswrapper[4747]: I1205 22:06:19.192499 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jdz2" event={"ID":"c1ed4a4a-555d-44ee-be75-b451f4c3cf42","Type":"ContainerStarted","Data":"816ea25e80e58065cb2906ff90ec80c4395c24796c2c5395bf705c828812e2b9"} Dec 05 22:06:19 crc kubenswrapper[4747]: I1205 22:06:19.194302 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"a5a5b192-9b58-441e-95ef-92f19df82414","Type":"ContainerStarted","Data":"17de64235dcef77b7369a7197a33dde23bb25a915fd7170b875e137cff7c84c2"} Dec 05 22:06:19 crc kubenswrapper[4747]: I1205 22:06:19.194373 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"a5a5b192-9b58-441e-95ef-92f19df82414","Type":"ContainerStarted","Data":"dd51ac62a7f30885bd536f86258ab28ad376781c3e82d3e9b1270a85c6969a6d"} Dec 05 22:06:19 crc kubenswrapper[4747]: I1205 22:06:19.219455 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2jdz2" podStartSLOduration=1.684678871 podStartE2EDuration="3.219432253s" podCreationTimestamp="2025-12-05 22:06:16 +0000 UTC" firstStartedPulling="2025-12-05 22:06:17.145468463 +0000 UTC m=+5047.612775991" lastFinishedPulling="2025-12-05 22:06:18.680221885 +0000 UTC m=+5049.147529373" observedRunningTime="2025-12-05 22:06:19.214391177 +0000 UTC m=+5049.681698685" watchObservedRunningTime="2025-12-05 22:06:19.219432253 +0000 UTC m=+5049.686739781" Dec 05 22:06:19 crc kubenswrapper[4747]: I1205 22:06:19.246036 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=1.246019907 podStartE2EDuration="1.246019907s" podCreationTimestamp="2025-12-05 22:06:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:06:19.244227902 +0000 UTC m=+5049.711535410" watchObservedRunningTime="2025-12-05 22:06:19.246019907 +0000 UTC m=+5049.713327405" Dec 05 22:06:20 crc kubenswrapper[4747]: I1205 22:06:20.204518 4747 generic.go:334] "Generic (PLEG): container finished" podID="a5a5b192-9b58-441e-95ef-92f19df82414" containerID="17de64235dcef77b7369a7197a33dde23bb25a915fd7170b875e137cff7c84c2" exitCode=1 Dec 05 22:06:20 crc kubenswrapper[4747]: I1205 22:06:20.204625 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"a5a5b192-9b58-441e-95ef-92f19df82414","Type":"ContainerDied","Data":"17de64235dcef77b7369a7197a33dde23bb25a915fd7170b875e137cff7c84c2"} Dec 05 22:06:21 crc kubenswrapper[4747]: I1205 22:06:21.615409 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 05 22:06:21 crc kubenswrapper[4747]: I1205 22:06:21.657010 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 05 22:06:21 crc kubenswrapper[4747]: I1205 22:06:21.665377 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 05 22:06:21 crc kubenswrapper[4747]: I1205 22:06:21.810225 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrvvb\" (UniqueName: \"kubernetes.io/projected/a5a5b192-9b58-441e-95ef-92f19df82414-kube-api-access-jrvvb\") pod \"a5a5b192-9b58-441e-95ef-92f19df82414\" (UID: \"a5a5b192-9b58-441e-95ef-92f19df82414\") " Dec 05 22:06:21 crc kubenswrapper[4747]: I1205 22:06:21.819889 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5a5b192-9b58-441e-95ef-92f19df82414-kube-api-access-jrvvb" (OuterVolumeSpecName: "kube-api-access-jrvvb") pod "a5a5b192-9b58-441e-95ef-92f19df82414" (UID: "a5a5b192-9b58-441e-95ef-92f19df82414"). InnerVolumeSpecName "kube-api-access-jrvvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:06:21 crc kubenswrapper[4747]: I1205 22:06:21.855644 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5a5b192-9b58-441e-95ef-92f19df82414" path="/var/lib/kubelet/pods/a5a5b192-9b58-441e-95ef-92f19df82414/volumes" Dec 05 22:06:21 crc kubenswrapper[4747]: I1205 22:06:21.912839 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrvvb\" (UniqueName: \"kubernetes.io/projected/a5a5b192-9b58-441e-95ef-92f19df82414-kube-api-access-jrvvb\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:22 crc kubenswrapper[4747]: I1205 22:06:22.051051 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Dec 05 22:06:22 crc kubenswrapper[4747]: E1205 22:06:22.051528 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a5b192-9b58-441e-95ef-92f19df82414" containerName="mariadb-client-2-default" Dec 05 22:06:22 crc kubenswrapper[4747]: I1205 22:06:22.051550 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a5b192-9b58-441e-95ef-92f19df82414" containerName="mariadb-client-2-default" Dec 05 22:06:22 crc kubenswrapper[4747]: I1205 22:06:22.051884 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5a5b192-9b58-441e-95ef-92f19df82414" containerName="mariadb-client-2-default" Dec 05 22:06:22 crc kubenswrapper[4747]: I1205 22:06:22.052815 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Dec 05 22:06:22 crc kubenswrapper[4747]: I1205 22:06:22.057797 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 05 22:06:22 crc kubenswrapper[4747]: I1205 22:06:22.217065 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2c9p\" (UniqueName: \"kubernetes.io/projected/4771a93d-9bf4-49c1-b70a-40d7ffaecbda-kube-api-access-s2c9p\") pod \"mariadb-client-1\" (UID: \"4771a93d-9bf4-49c1-b70a-40d7ffaecbda\") " pod="openstack/mariadb-client-1" Dec 05 22:06:22 crc kubenswrapper[4747]: I1205 22:06:22.225915 4747 scope.go:117] "RemoveContainer" containerID="17de64235dcef77b7369a7197a33dde23bb25a915fd7170b875e137cff7c84c2" Dec 05 22:06:22 crc kubenswrapper[4747]: I1205 22:06:22.225984 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 05 22:06:22 crc kubenswrapper[4747]: I1205 22:06:22.319072 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2c9p\" (UniqueName: \"kubernetes.io/projected/4771a93d-9bf4-49c1-b70a-40d7ffaecbda-kube-api-access-s2c9p\") pod \"mariadb-client-1\" (UID: \"4771a93d-9bf4-49c1-b70a-40d7ffaecbda\") " pod="openstack/mariadb-client-1" Dec 05 22:06:22 crc kubenswrapper[4747]: I1205 22:06:22.336970 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2c9p\" (UniqueName: \"kubernetes.io/projected/4771a93d-9bf4-49c1-b70a-40d7ffaecbda-kube-api-access-s2c9p\") pod \"mariadb-client-1\" (UID: \"4771a93d-9bf4-49c1-b70a-40d7ffaecbda\") " pod="openstack/mariadb-client-1" Dec 05 22:06:22 crc kubenswrapper[4747]: I1205 22:06:22.423149 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Dec 05 22:06:22 crc kubenswrapper[4747]: I1205 22:06:22.697750 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 05 22:06:22 crc kubenswrapper[4747]: W1205 22:06:22.703314 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4771a93d_9bf4_49c1_b70a_40d7ffaecbda.slice/crio-16cef86f04c4eca890ea16598825f94f09cd438cecfe966fc2baea0ff9317747 WatchSource:0}: Error finding container 16cef86f04c4eca890ea16598825f94f09cd438cecfe966fc2baea0ff9317747: Status 404 returned error can't find the container with id 16cef86f04c4eca890ea16598825f94f09cd438cecfe966fc2baea0ff9317747 Dec 05 22:06:23 crc kubenswrapper[4747]: I1205 22:06:23.239755 4747 generic.go:334] "Generic (PLEG): container finished" podID="4771a93d-9bf4-49c1-b70a-40d7ffaecbda" containerID="e8fd5fcb110707081f3ec156a060c9832a3f67461b11fa636f22384afddc4870" exitCode=0 Dec 05 22:06:23 crc kubenswrapper[4747]: I1205 22:06:23.239820 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"4771a93d-9bf4-49c1-b70a-40d7ffaecbda","Type":"ContainerDied","Data":"e8fd5fcb110707081f3ec156a060c9832a3f67461b11fa636f22384afddc4870"} Dec 05 22:06:23 crc kubenswrapper[4747]: I1205 22:06:23.239875 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"4771a93d-9bf4-49c1-b70a-40d7ffaecbda","Type":"ContainerStarted","Data":"16cef86f04c4eca890ea16598825f94f09cd438cecfe966fc2baea0ff9317747"} Dec 05 22:06:24 crc kubenswrapper[4747]: I1205 22:06:24.643477 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 05 22:06:24 crc kubenswrapper[4747]: I1205 22:06:24.664157 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_4771a93d-9bf4-49c1-b70a-40d7ffaecbda/mariadb-client-1/0.log" Dec 05 22:06:24 crc kubenswrapper[4747]: I1205 22:06:24.664371 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2c9p\" (UniqueName: \"kubernetes.io/projected/4771a93d-9bf4-49c1-b70a-40d7ffaecbda-kube-api-access-s2c9p\") pod \"4771a93d-9bf4-49c1-b70a-40d7ffaecbda\" (UID: \"4771a93d-9bf4-49c1-b70a-40d7ffaecbda\") " Dec 05 22:06:24 crc kubenswrapper[4747]: I1205 22:06:24.671315 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4771a93d-9bf4-49c1-b70a-40d7ffaecbda-kube-api-access-s2c9p" (OuterVolumeSpecName: "kube-api-access-s2c9p") pod "4771a93d-9bf4-49c1-b70a-40d7ffaecbda" (UID: "4771a93d-9bf4-49c1-b70a-40d7ffaecbda"). InnerVolumeSpecName "kube-api-access-s2c9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:06:24 crc kubenswrapper[4747]: I1205 22:06:24.691049 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Dec 05 22:06:24 crc kubenswrapper[4747]: I1205 22:06:24.696350 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Dec 05 22:06:24 crc kubenswrapper[4747]: I1205 22:06:24.766360 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2c9p\" (UniqueName: \"kubernetes.io/projected/4771a93d-9bf4-49c1-b70a-40d7ffaecbda-kube-api-access-s2c9p\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:25 crc kubenswrapper[4747]: I1205 22:06:25.108712 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Dec 05 22:06:25 crc kubenswrapper[4747]: E1205 22:06:25.109033 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4771a93d-9bf4-49c1-b70a-40d7ffaecbda" containerName="mariadb-client-1" Dec 05 22:06:25 crc kubenswrapper[4747]: I1205 22:06:25.109050 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4771a93d-9bf4-49c1-b70a-40d7ffaecbda" containerName="mariadb-client-1" Dec 05 22:06:25 crc kubenswrapper[4747]: I1205 22:06:25.109195 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4771a93d-9bf4-49c1-b70a-40d7ffaecbda" containerName="mariadb-client-1" Dec 05 22:06:25 crc kubenswrapper[4747]: I1205 22:06:25.109723 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 05 22:06:25 crc kubenswrapper[4747]: I1205 22:06:25.126022 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 05 22:06:25 crc kubenswrapper[4747]: I1205 22:06:25.172784 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5gbq\" (UniqueName: \"kubernetes.io/projected/d03629cb-775b-45b6-8cac-4f62b0638e09-kube-api-access-w5gbq\") pod \"mariadb-client-4-default\" (UID: \"d03629cb-775b-45b6-8cac-4f62b0638e09\") " pod="openstack/mariadb-client-4-default" Dec 05 22:06:25 crc kubenswrapper[4747]: I1205 22:06:25.270824 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16cef86f04c4eca890ea16598825f94f09cd438cecfe966fc2baea0ff9317747" Dec 05 22:06:25 crc kubenswrapper[4747]: I1205 22:06:25.270881 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 05 22:06:25 crc kubenswrapper[4747]: I1205 22:06:25.274674 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5gbq\" (UniqueName: \"kubernetes.io/projected/d03629cb-775b-45b6-8cac-4f62b0638e09-kube-api-access-w5gbq\") pod \"mariadb-client-4-default\" (UID: \"d03629cb-775b-45b6-8cac-4f62b0638e09\") " pod="openstack/mariadb-client-4-default" Dec 05 22:06:25 crc kubenswrapper[4747]: I1205 22:06:25.297625 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5gbq\" (UniqueName: \"kubernetes.io/projected/d03629cb-775b-45b6-8cac-4f62b0638e09-kube-api-access-w5gbq\") pod \"mariadb-client-4-default\" (UID: \"d03629cb-775b-45b6-8cac-4f62b0638e09\") " pod="openstack/mariadb-client-4-default" Dec 05 22:06:25 crc kubenswrapper[4747]: I1205 22:06:25.426874 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 05 22:06:25 crc kubenswrapper[4747]: I1205 22:06:25.851664 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4771a93d-9bf4-49c1-b70a-40d7ffaecbda" path="/var/lib/kubelet/pods/4771a93d-9bf4-49c1-b70a-40d7ffaecbda/volumes" Dec 05 22:06:26 crc kubenswrapper[4747]: I1205 22:06:26.022542 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 05 22:06:26 crc kubenswrapper[4747]: I1205 22:06:26.282500 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"d03629cb-775b-45b6-8cac-4f62b0638e09","Type":"ContainerStarted","Data":"324b350a1b67b036fac03465e6d628dd5f1498a032a1017a7768d5231d3f4150"} Dec 05 22:06:26 crc kubenswrapper[4747]: I1205 22:06:26.439277 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:26 crc kubenswrapper[4747]: I1205 22:06:26.439383 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:26 crc kubenswrapper[4747]: I1205 22:06:26.517508 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:27 crc kubenswrapper[4747]: I1205 22:06:27.292912 4747 generic.go:334] "Generic (PLEG): container finished" podID="d03629cb-775b-45b6-8cac-4f62b0638e09" containerID="261c0542851e335016bf25fef1ec464158ad43e301561f58b5dd12573d8fca9f" exitCode=0 Dec 05 22:06:27 crc kubenswrapper[4747]: I1205 22:06:27.292992 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"d03629cb-775b-45b6-8cac-4f62b0638e09","Type":"ContainerDied","Data":"261c0542851e335016bf25fef1ec464158ad43e301561f58b5dd12573d8fca9f"} Dec 05 22:06:27 crc kubenswrapper[4747]: I1205 22:06:27.372037 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:27 crc kubenswrapper[4747]: I1205 22:06:27.433436 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jdz2"] Dec 05 22:06:28 crc kubenswrapper[4747]: I1205 22:06:28.871977 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 05 22:06:28 crc kubenswrapper[4747]: I1205 22:06:28.889392 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_d03629cb-775b-45b6-8cac-4f62b0638e09/mariadb-client-4-default/0.log" Dec 05 22:06:28 crc kubenswrapper[4747]: I1205 22:06:28.910755 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 05 22:06:28 crc kubenswrapper[4747]: I1205 22:06:28.917684 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 05 22:06:28 crc kubenswrapper[4747]: I1205 22:06:28.941910 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5gbq\" (UniqueName: \"kubernetes.io/projected/d03629cb-775b-45b6-8cac-4f62b0638e09-kube-api-access-w5gbq\") pod \"d03629cb-775b-45b6-8cac-4f62b0638e09\" (UID: \"d03629cb-775b-45b6-8cac-4f62b0638e09\") " Dec 05 22:06:28 crc kubenswrapper[4747]: I1205 22:06:28.949067 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d03629cb-775b-45b6-8cac-4f62b0638e09-kube-api-access-w5gbq" (OuterVolumeSpecName: "kube-api-access-w5gbq") pod "d03629cb-775b-45b6-8cac-4f62b0638e09" (UID: "d03629cb-775b-45b6-8cac-4f62b0638e09"). InnerVolumeSpecName "kube-api-access-w5gbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:06:29 crc kubenswrapper[4747]: I1205 22:06:29.043135 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5gbq\" (UniqueName: \"kubernetes.io/projected/d03629cb-775b-45b6-8cac-4f62b0638e09-kube-api-access-w5gbq\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:29 crc kubenswrapper[4747]: I1205 22:06:29.315128 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2jdz2" podUID="c1ed4a4a-555d-44ee-be75-b451f4c3cf42" containerName="registry-server" containerID="cri-o://816ea25e80e58065cb2906ff90ec80c4395c24796c2c5395bf705c828812e2b9" gracePeriod=2 Dec 05 22:06:29 crc kubenswrapper[4747]: I1205 22:06:29.315727 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 05 22:06:29 crc kubenswrapper[4747]: I1205 22:06:29.315923 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="324b350a1b67b036fac03465e6d628dd5f1498a032a1017a7768d5231d3f4150" Dec 05 22:06:29 crc kubenswrapper[4747]: I1205 22:06:29.808959 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:29 crc kubenswrapper[4747]: I1205 22:06:29.852981 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d03629cb-775b-45b6-8cac-4f62b0638e09" path="/var/lib/kubelet/pods/d03629cb-775b-45b6-8cac-4f62b0638e09/volumes" Dec 05 22:06:29 crc kubenswrapper[4747]: I1205 22:06:29.854828 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fctk8\" (UniqueName: \"kubernetes.io/projected/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-kube-api-access-fctk8\") pod \"c1ed4a4a-555d-44ee-be75-b451f4c3cf42\" (UID: \"c1ed4a4a-555d-44ee-be75-b451f4c3cf42\") " Dec 05 22:06:29 crc kubenswrapper[4747]: I1205 22:06:29.854973 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-utilities\") pod \"c1ed4a4a-555d-44ee-be75-b451f4c3cf42\" (UID: \"c1ed4a4a-555d-44ee-be75-b451f4c3cf42\") " Dec 05 22:06:29 crc kubenswrapper[4747]: I1205 22:06:29.855064 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-catalog-content\") pod \"c1ed4a4a-555d-44ee-be75-b451f4c3cf42\" (UID: \"c1ed4a4a-555d-44ee-be75-b451f4c3cf42\") " Dec 05 22:06:29 crc kubenswrapper[4747]: I1205 22:06:29.856742 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-utilities" (OuterVolumeSpecName: "utilities") pod "c1ed4a4a-555d-44ee-be75-b451f4c3cf42" (UID: "c1ed4a4a-555d-44ee-be75-b451f4c3cf42"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:06:29 crc kubenswrapper[4747]: I1205 22:06:29.859337 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-kube-api-access-fctk8" (OuterVolumeSpecName: "kube-api-access-fctk8") pod "c1ed4a4a-555d-44ee-be75-b451f4c3cf42" (UID: "c1ed4a4a-555d-44ee-be75-b451f4c3cf42"). InnerVolumeSpecName "kube-api-access-fctk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:06:29 crc kubenswrapper[4747]: I1205 22:06:29.883048 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1ed4a4a-555d-44ee-be75-b451f4c3cf42" (UID: "c1ed4a4a-555d-44ee-be75-b451f4c3cf42"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:06:29 crc kubenswrapper[4747]: I1205 22:06:29.956906 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:29 crc kubenswrapper[4747]: I1205 22:06:29.956946 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:29 crc kubenswrapper[4747]: I1205 22:06:29.956957 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fctk8\" (UniqueName: \"kubernetes.io/projected/c1ed4a4a-555d-44ee-be75-b451f4c3cf42-kube-api-access-fctk8\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.330392 4747 generic.go:334] "Generic (PLEG): container finished" podID="c1ed4a4a-555d-44ee-be75-b451f4c3cf42" containerID="816ea25e80e58065cb2906ff90ec80c4395c24796c2c5395bf705c828812e2b9" exitCode=0 Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.330462 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jdz2" Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.330472 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jdz2" event={"ID":"c1ed4a4a-555d-44ee-be75-b451f4c3cf42","Type":"ContainerDied","Data":"816ea25e80e58065cb2906ff90ec80c4395c24796c2c5395bf705c828812e2b9"} Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.331123 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jdz2" event={"ID":"c1ed4a4a-555d-44ee-be75-b451f4c3cf42","Type":"ContainerDied","Data":"5328a72af8084792065650c992df5d799b9ffd7c91de38a44d13fd48538c4045"} Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.331164 4747 scope.go:117] "RemoveContainer" containerID="816ea25e80e58065cb2906ff90ec80c4395c24796c2c5395bf705c828812e2b9" Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.409870 4747 scope.go:117] "RemoveContainer" containerID="ec0a193095e00e30cc15c66ba9ff41a66f13b7e381a94432912efb0eee2e5ca9" Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.417279 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jdz2"] Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.428381 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jdz2"] Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.439827 4747 scope.go:117] "RemoveContainer" containerID="99a8cb4a39139dcbabbcbaa28c880ed3f6d39b7dafab6f27bda285b6474b4b6c" Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.476702 4747 scope.go:117] "RemoveContainer" containerID="816ea25e80e58065cb2906ff90ec80c4395c24796c2c5395bf705c828812e2b9" Dec 05 22:06:30 crc kubenswrapper[4747]: E1205 22:06:30.477277 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"816ea25e80e58065cb2906ff90ec80c4395c24796c2c5395bf705c828812e2b9\": container with ID starting with 816ea25e80e58065cb2906ff90ec80c4395c24796c2c5395bf705c828812e2b9 not found: ID does not exist" containerID="816ea25e80e58065cb2906ff90ec80c4395c24796c2c5395bf705c828812e2b9" Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.477307 4747 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"816ea25e80e58065cb2906ff90ec80c4395c24796c2c5395bf705c828812e2b9"} err="failed to get container status \"816ea25e80e58065cb2906ff90ec80c4395c24796c2c5395bf705c828812e2b9\": rpc error: code = NotFound desc = could not find container \"816ea25e80e58065cb2906ff90ec80c4395c24796c2c5395bf705c828812e2b9\": container with ID starting with 816ea25e80e58065cb2906ff90ec80c4395c24796c2c5395bf705c828812e2b9 not found: ID does not exist" Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.477328 4747 scope.go:117] "RemoveContainer" containerID="ec0a193095e00e30cc15c66ba9ff41a66f13b7e381a94432912efb0eee2e5ca9" Dec 05 22:06:30 crc kubenswrapper[4747]: E1205 22:06:30.477838 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec0a193095e00e30cc15c66ba9ff41a66f13b7e381a94432912efb0eee2e5ca9\": container with ID starting with ec0a193095e00e30cc15c66ba9ff41a66f13b7e381a94432912efb0eee2e5ca9 not found: ID does not exist" containerID="ec0a193095e00e30cc15c66ba9ff41a66f13b7e381a94432912efb0eee2e5ca9" Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.477858 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0a193095e00e30cc15c66ba9ff41a66f13b7e381a94432912efb0eee2e5ca9"} err="failed to get container status \"ec0a193095e00e30cc15c66ba9ff41a66f13b7e381a94432912efb0eee2e5ca9\": rpc error: code = NotFound desc = could not find container \"ec0a193095e00e30cc15c66ba9ff41a66f13b7e381a94432912efb0eee2e5ca9\": container with ID starting with ec0a193095e00e30cc15c66ba9ff41a66f13b7e381a94432912efb0eee2e5ca9 not found: ID does not exist" Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.477871 4747 scope.go:117] "RemoveContainer" containerID="99a8cb4a39139dcbabbcbaa28c880ed3f6d39b7dafab6f27bda285b6474b4b6c" Dec 05 22:06:30 crc kubenswrapper[4747]: E1205 22:06:30.478457 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99a8cb4a39139dcbabbcbaa28c880ed3f6d39b7dafab6f27bda285b6474b4b6c\": container with ID starting with 99a8cb4a39139dcbabbcbaa28c880ed3f6d39b7dafab6f27bda285b6474b4b6c not found: ID does not exist" containerID="99a8cb4a39139dcbabbcbaa28c880ed3f6d39b7dafab6f27bda285b6474b4b6c" Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.478522 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99a8cb4a39139dcbabbcbaa28c880ed3f6d39b7dafab6f27bda285b6474b4b6c"} err="failed to get container status \"99a8cb4a39139dcbabbcbaa28c880ed3f6d39b7dafab6f27bda285b6474b4b6c\": rpc error: code = NotFound desc = could not find container \"99a8cb4a39139dcbabbcbaa28c880ed3f6d39b7dafab6f27bda285b6474b4b6c\": container with ID starting with 99a8cb4a39139dcbabbcbaa28c880ed3f6d39b7dafab6f27bda285b6474b4b6c not found: ID does not exist" Dec 05 22:06:30 crc kubenswrapper[4747]: I1205 22:06:30.839657 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:06:30 crc kubenswrapper[4747]: E1205 22:06:30.840163 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:06:31 crc kubenswrapper[4747]: I1205 22:06:31.853868 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1ed4a4a-555d-44ee-be75-b451f4c3cf42" path="/var/lib/kubelet/pods/c1ed4a4a-555d-44ee-be75-b451f4c3cf42/volumes" Dec 05 22:06:32 crc kubenswrapper[4747]: I1205 22:06:32.610230 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Dec 05 22:06:32 crc kubenswrapper[4747]: E1205 22:06:32.611000 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ed4a4a-555d-44ee-be75-b451f4c3cf42" containerName="extract-utilities" Dec 05 22:06:32 crc kubenswrapper[4747]: I1205 22:06:32.611018 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ed4a4a-555d-44ee-be75-b451f4c3cf42" containerName="extract-utilities" Dec 05 22:06:32 crc kubenswrapper[4747]: E1205 22:06:32.611045 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ed4a4a-555d-44ee-be75-b451f4c3cf42" containerName="extract-content" Dec 05 22:06:32 crc kubenswrapper[4747]: I1205 22:06:32.611052 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ed4a4a-555d-44ee-be75-b451f4c3cf42" containerName="extract-content" Dec 05 22:06:32 crc kubenswrapper[4747]: E1205 22:06:32.611061 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ed4a4a-555d-44ee-be75-b451f4c3cf42" containerName="registry-server" Dec 05 22:06:32 crc kubenswrapper[4747]: I1205 22:06:32.611068 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ed4a4a-555d-44ee-be75-b451f4c3cf42" containerName="registry-server" Dec 05 22:06:32 crc kubenswrapper[4747]: E1205 22:06:32.611085 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d03629cb-775b-45b6-8cac-4f62b0638e09" containerName="mariadb-client-4-default" Dec 05 22:06:32 crc kubenswrapper[4747]: I1205 22:06:32.611091 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d03629cb-775b-45b6-8cac-4f62b0638e09" containerName="mariadb-client-4-default" Dec 05 22:06:32 crc kubenswrapper[4747]: I1205 22:06:32.611237 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d03629cb-775b-45b6-8cac-4f62b0638e09" containerName="mariadb-client-4-default" Dec 05 22:06:32 crc kubenswrapper[4747]: I1205 22:06:32.611251 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1ed4a4a-555d-44ee-be75-b451f4c3cf42" containerName="registry-server" Dec 05 22:06:32 crc kubenswrapper[4747]: I1205 22:06:32.611822 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 05 22:06:32 crc kubenswrapper[4747]: I1205 22:06:32.614975 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-7jmpv" Dec 05 22:06:32 crc kubenswrapper[4747]: I1205 22:06:32.632650 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 05 22:06:32 crc kubenswrapper[4747]: I1205 22:06:32.804205 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj6hm\" (UniqueName: \"kubernetes.io/projected/12dac4d6-4bf8-465e-803a-bd2360fc3812-kube-api-access-qj6hm\") pod \"mariadb-client-5-default\" (UID: \"12dac4d6-4bf8-465e-803a-bd2360fc3812\") " pod="openstack/mariadb-client-5-default" Dec 05 22:06:32 crc kubenswrapper[4747]: I1205 22:06:32.906380 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj6hm\" (UniqueName: \"kubernetes.io/projected/12dac4d6-4bf8-465e-803a-bd2360fc3812-kube-api-access-qj6hm\") pod \"mariadb-client-5-default\" (UID: \"12dac4d6-4bf8-465e-803a-bd2360fc3812\") " pod="openstack/mariadb-client-5-default" Dec 05 22:06:32 crc kubenswrapper[4747]: I1205 22:06:32.941899 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj6hm\" (UniqueName: \"kubernetes.io/projected/12dac4d6-4bf8-465e-803a-bd2360fc3812-kube-api-access-qj6hm\") pod \"mariadb-client-5-default\" (UID: \"12dac4d6-4bf8-465e-803a-bd2360fc3812\") " pod="openstack/mariadb-client-5-default" Dec 05 22:06:32 crc kubenswrapper[4747]: I1205 22:06:32.945540 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 05 22:06:33 crc kubenswrapper[4747]: I1205 22:06:33.451828 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 05 22:06:34 crc kubenswrapper[4747]: I1205 22:06:34.379808 4747 generic.go:334] "Generic (PLEG): container finished" podID="12dac4d6-4bf8-465e-803a-bd2360fc3812" containerID="941b2eb52a94b9dde863dddd6ef688781d652e7564516f60e67db23a1e556337" exitCode=0 Dec 05 22:06:34 crc kubenswrapper[4747]: I1205 22:06:34.379900 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"12dac4d6-4bf8-465e-803a-bd2360fc3812","Type":"ContainerDied","Data":"941b2eb52a94b9dde863dddd6ef688781d652e7564516f60e67db23a1e556337"} Dec 05 22:06:34 crc kubenswrapper[4747]: I1205 22:06:34.380121 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"12dac4d6-4bf8-465e-803a-bd2360fc3812","Type":"ContainerStarted","Data":"062bf19dddacdb734da9002520f4e03b8093248c730aa100b2e04927a50518f8"} Dec 05 22:06:35 crc kubenswrapper[4747]: I1205 22:06:35.834284 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 05 22:06:35 crc kubenswrapper[4747]: I1205 22:06:35.856549 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_12dac4d6-4bf8-465e-803a-bd2360fc3812/mariadb-client-5-default/0.log" Dec 05 22:06:35 crc kubenswrapper[4747]: I1205 22:06:35.883890 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 05 22:06:35 crc kubenswrapper[4747]: I1205 22:06:35.889382 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 05 22:06:35 crc kubenswrapper[4747]: I1205 22:06:35.958418 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj6hm\" (UniqueName: \"kubernetes.io/projected/12dac4d6-4bf8-465e-803a-bd2360fc3812-kube-api-access-qj6hm\") pod \"12dac4d6-4bf8-465e-803a-bd2360fc3812\" (UID: \"12dac4d6-4bf8-465e-803a-bd2360fc3812\") " Dec 05 22:06:35 crc kubenswrapper[4747]: I1205 22:06:35.964403 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12dac4d6-4bf8-465e-803a-bd2360fc3812-kube-api-access-qj6hm" (OuterVolumeSpecName: "kube-api-access-qj6hm") pod "12dac4d6-4bf8-465e-803a-bd2360fc3812" (UID: "12dac4d6-4bf8-465e-803a-bd2360fc3812"). InnerVolumeSpecName "kube-api-access-qj6hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:06:36 crc kubenswrapper[4747]: I1205 22:06:36.019300 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Dec 05 22:06:36 crc kubenswrapper[4747]: E1205 22:06:36.019954 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12dac4d6-4bf8-465e-803a-bd2360fc3812" containerName="mariadb-client-5-default" Dec 05 22:06:36 crc kubenswrapper[4747]: I1205 22:06:36.019980 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="12dac4d6-4bf8-465e-803a-bd2360fc3812" containerName="mariadb-client-5-default" Dec 05 22:06:36 crc kubenswrapper[4747]: I1205 22:06:36.022699 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="12dac4d6-4bf8-465e-803a-bd2360fc3812" containerName="mariadb-client-5-default" Dec 05 22:06:36 crc kubenswrapper[4747]: I1205 22:06:36.023370 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 05 22:06:36 crc kubenswrapper[4747]: I1205 22:06:36.034115 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 05 22:06:36 crc kubenswrapper[4747]: I1205 22:06:36.060233 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfpmj\" (UniqueName: \"kubernetes.io/projected/96811bc4-ea56-4420-afc3-35718a6e230b-kube-api-access-sfpmj\") pod \"mariadb-client-6-default\" (UID: \"96811bc4-ea56-4420-afc3-35718a6e230b\") " pod="openstack/mariadb-client-6-default" Dec 05 22:06:36 crc kubenswrapper[4747]: I1205 22:06:36.060456 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj6hm\" (UniqueName: \"kubernetes.io/projected/12dac4d6-4bf8-465e-803a-bd2360fc3812-kube-api-access-qj6hm\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:36 crc kubenswrapper[4747]: I1205 22:06:36.161985 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfpmj\" (UniqueName: \"kubernetes.io/projected/96811bc4-ea56-4420-afc3-35718a6e230b-kube-api-access-sfpmj\") pod \"mariadb-client-6-default\" (UID: \"96811bc4-ea56-4420-afc3-35718a6e230b\") " pod="openstack/mariadb-client-6-default" Dec 05 22:06:36 crc kubenswrapper[4747]: I1205 22:06:36.185985 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfpmj\" (UniqueName: \"kubernetes.io/projected/96811bc4-ea56-4420-afc3-35718a6e230b-kube-api-access-sfpmj\") pod \"mariadb-client-6-default\" (UID: \"96811bc4-ea56-4420-afc3-35718a6e230b\") " pod="openstack/mariadb-client-6-default" Dec 05 22:06:36 crc kubenswrapper[4747]: I1205 22:06:36.344550 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 05 22:06:36 crc kubenswrapper[4747]: I1205 22:06:36.408944 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="062bf19dddacdb734da9002520f4e03b8093248c730aa100b2e04927a50518f8" Dec 05 22:06:36 crc kubenswrapper[4747]: I1205 22:06:36.409066 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 05 22:06:37 crc kubenswrapper[4747]: I1205 22:06:36.954986 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 05 22:06:37 crc kubenswrapper[4747]: W1205 22:06:36.957447 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96811bc4_ea56_4420_afc3_35718a6e230b.slice/crio-1033d7219ea1a6c876ecb7770c9f1a5f40e46271eb3e0493d3221d2453a2f0dd WatchSource:0}: Error finding container 1033d7219ea1a6c876ecb7770c9f1a5f40e46271eb3e0493d3221d2453a2f0dd: Status 404 returned error can't find the container with id 1033d7219ea1a6c876ecb7770c9f1a5f40e46271eb3e0493d3221d2453a2f0dd Dec 05 22:06:37 crc kubenswrapper[4747]: I1205 22:06:37.418683 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"96811bc4-ea56-4420-afc3-35718a6e230b","Type":"ContainerStarted","Data":"0e08484e7c1803a70fd8d969f5942b169b5d041efb726c9885cca85f7a798521"} Dec 05 22:06:37 crc kubenswrapper[4747]: I1205 22:06:37.418736 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"96811bc4-ea56-4420-afc3-35718a6e230b","Type":"ContainerStarted","Data":"1033d7219ea1a6c876ecb7770c9f1a5f40e46271eb3e0493d3221d2453a2f0dd"} Dec 05 22:06:37 crc kubenswrapper[4747]: I1205 22:06:37.449153 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=1.449127938 podStartE2EDuration="1.449127938s" podCreationTimestamp="2025-12-05 22:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:06:37.434618845 +0000 UTC m=+5067.901926343" watchObservedRunningTime="2025-12-05 22:06:37.449127938 +0000 UTC m=+5067.916435446" Dec 05 22:06:37 crc kubenswrapper[4747]: I1205 22:06:37.853103 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12dac4d6-4bf8-465e-803a-bd2360fc3812" path="/var/lib/kubelet/pods/12dac4d6-4bf8-465e-803a-bd2360fc3812/volumes" Dec 05 22:06:38 crc kubenswrapper[4747]: I1205 22:06:38.432742 4747 generic.go:334] "Generic (PLEG): container finished" podID="96811bc4-ea56-4420-afc3-35718a6e230b" containerID="0e08484e7c1803a70fd8d969f5942b169b5d041efb726c9885cca85f7a798521" exitCode=1 Dec 05 22:06:38 crc kubenswrapper[4747]: I1205 22:06:38.432847 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"96811bc4-ea56-4420-afc3-35718a6e230b","Type":"ContainerDied","Data":"0e08484e7c1803a70fd8d969f5942b169b5d041efb726c9885cca85f7a798521"} Dec 05 22:06:39 crc kubenswrapper[4747]: I1205 22:06:39.816456 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 05 22:06:39 crc kubenswrapper[4747]: I1205 22:06:39.855791 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 05 22:06:39 crc kubenswrapper[4747]: I1205 22:06:39.860343 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 05 22:06:39 crc kubenswrapper[4747]: I1205 22:06:39.925524 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfpmj\" (UniqueName: \"kubernetes.io/projected/96811bc4-ea56-4420-afc3-35718a6e230b-kube-api-access-sfpmj\") pod \"96811bc4-ea56-4420-afc3-35718a6e230b\" (UID: \"96811bc4-ea56-4420-afc3-35718a6e230b\") " Dec 05 22:06:39 crc kubenswrapper[4747]: I1205 22:06:39.932883 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96811bc4-ea56-4420-afc3-35718a6e230b-kube-api-access-sfpmj" (OuterVolumeSpecName: "kube-api-access-sfpmj") pod "96811bc4-ea56-4420-afc3-35718a6e230b" (UID: "96811bc4-ea56-4420-afc3-35718a6e230b"). InnerVolumeSpecName "kube-api-access-sfpmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:06:40 crc kubenswrapper[4747]: I1205 22:06:40.011470 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Dec 05 22:06:40 crc kubenswrapper[4747]: E1205 22:06:40.011830 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96811bc4-ea56-4420-afc3-35718a6e230b" containerName="mariadb-client-6-default" Dec 05 22:06:40 crc kubenswrapper[4747]: I1205 22:06:40.011846 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="96811bc4-ea56-4420-afc3-35718a6e230b" containerName="mariadb-client-6-default" Dec 05 22:06:40 crc kubenswrapper[4747]: I1205 22:06:40.012051 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="96811bc4-ea56-4420-afc3-35718a6e230b" containerName="mariadb-client-6-default" Dec 05 22:06:40 crc kubenswrapper[4747]: I1205 22:06:40.012628 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 05 22:06:40 crc kubenswrapper[4747]: I1205 22:06:40.018733 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 05 22:06:40 crc kubenswrapper[4747]: I1205 22:06:40.028258 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfpmj\" (UniqueName: \"kubernetes.io/projected/96811bc4-ea56-4420-afc3-35718a6e230b-kube-api-access-sfpmj\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:40 crc kubenswrapper[4747]: I1205 22:06:40.129899 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q56zr\" (UniqueName: \"kubernetes.io/projected/2d18ca0e-a9b4-43a4-8001-bc25204dc750-kube-api-access-q56zr\") pod \"mariadb-client-7-default\" (UID: \"2d18ca0e-a9b4-43a4-8001-bc25204dc750\") " pod="openstack/mariadb-client-7-default" Dec 05 22:06:40 crc kubenswrapper[4747]: I1205 22:06:40.231851 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q56zr\" (UniqueName: \"kubernetes.io/projected/2d18ca0e-a9b4-43a4-8001-bc25204dc750-kube-api-access-q56zr\") pod \"mariadb-client-7-default\" (UID: \"2d18ca0e-a9b4-43a4-8001-bc25204dc750\") " pod="openstack/mariadb-client-7-default" Dec 05 22:06:40 crc kubenswrapper[4747]: I1205 22:06:40.254419 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q56zr\" (UniqueName: \"kubernetes.io/projected/2d18ca0e-a9b4-43a4-8001-bc25204dc750-kube-api-access-q56zr\") pod \"mariadb-client-7-default\" (UID: \"2d18ca0e-a9b4-43a4-8001-bc25204dc750\") " pod="openstack/mariadb-client-7-default" Dec 05 22:06:40 crc kubenswrapper[4747]: I1205 22:06:40.331906 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 05 22:06:40 crc kubenswrapper[4747]: I1205 22:06:40.478761 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1033d7219ea1a6c876ecb7770c9f1a5f40e46271eb3e0493d3221d2453a2f0dd" Dec 05 22:06:40 crc kubenswrapper[4747]: I1205 22:06:40.478842 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 05 22:06:40 crc kubenswrapper[4747]: I1205 22:06:40.951152 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 05 22:06:40 crc kubenswrapper[4747]: W1205 22:06:40.955539 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d18ca0e_a9b4_43a4_8001_bc25204dc750.slice/crio-48c1da5a443f528d5883f733426173ad6ca93579717b1cd8979ad1aa94e861f9 WatchSource:0}: Error finding container 48c1da5a443f528d5883f733426173ad6ca93579717b1cd8979ad1aa94e861f9: Status 404 returned error can't find the container with id 48c1da5a443f528d5883f733426173ad6ca93579717b1cd8979ad1aa94e861f9 Dec 05 22:06:41 crc kubenswrapper[4747]: I1205 22:06:41.491150 4747 generic.go:334] "Generic (PLEG): container finished" podID="2d18ca0e-a9b4-43a4-8001-bc25204dc750" containerID="881fcc9e74b76ad3f351346266c23852ba30c4bccbe891c838469f0431a10650" exitCode=0 Dec 05 22:06:41 crc kubenswrapper[4747]: I1205 22:06:41.491197 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"2d18ca0e-a9b4-43a4-8001-bc25204dc750","Type":"ContainerDied","Data":"881fcc9e74b76ad3f351346266c23852ba30c4bccbe891c838469f0431a10650"} Dec 05 22:06:41 crc kubenswrapper[4747]: I1205 22:06:41.491225 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"2d18ca0e-a9b4-43a4-8001-bc25204dc750","Type":"ContainerStarted","Data":"48c1da5a443f528d5883f733426173ad6ca93579717b1cd8979ad1aa94e861f9"} Dec 05 22:06:41 crc kubenswrapper[4747]: I1205 22:06:41.857674 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96811bc4-ea56-4420-afc3-35718a6e230b" path="/var/lib/kubelet/pods/96811bc4-ea56-4420-afc3-35718a6e230b/volumes" Dec 05 22:06:42 crc kubenswrapper[4747]: I1205 22:06:42.839605 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:06:42 crc kubenswrapper[4747]: E1205 22:06:42.839949 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:06:42 crc kubenswrapper[4747]: I1205 22:06:42.913952 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 05 22:06:42 crc kubenswrapper[4747]: I1205 22:06:42.938377 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_2d18ca0e-a9b4-43a4-8001-bc25204dc750/mariadb-client-7-default/0.log" Dec 05 22:06:42 crc kubenswrapper[4747]: I1205 22:06:42.966114 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 05 22:06:42 crc kubenswrapper[4747]: I1205 22:06:42.977085 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.075986 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q56zr\" (UniqueName: \"kubernetes.io/projected/2d18ca0e-a9b4-43a4-8001-bc25204dc750-kube-api-access-q56zr\") pod \"2d18ca0e-a9b4-43a4-8001-bc25204dc750\" (UID: \"2d18ca0e-a9b4-43a4-8001-bc25204dc750\") " Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.096479 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Dec 05 22:06:43 crc kubenswrapper[4747]: E1205 22:06:43.096827 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d18ca0e-a9b4-43a4-8001-bc25204dc750" containerName="mariadb-client-7-default" Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.096847 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d18ca0e-a9b4-43a4-8001-bc25204dc750" containerName="mariadb-client-7-default" Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.097098 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d18ca0e-a9b4-43a4-8001-bc25204dc750" containerName="mariadb-client-7-default" Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.097548 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.101434 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d18ca0e-a9b4-43a4-8001-bc25204dc750-kube-api-access-q56zr" (OuterVolumeSpecName: "kube-api-access-q56zr") pod "2d18ca0e-a9b4-43a4-8001-bc25204dc750" (UID: "2d18ca0e-a9b4-43a4-8001-bc25204dc750"). InnerVolumeSpecName "kube-api-access-q56zr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.113757 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.178909 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q56zr\" (UniqueName: \"kubernetes.io/projected/2d18ca0e-a9b4-43a4-8001-bc25204dc750-kube-api-access-q56zr\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.280542 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw2rp\" (UniqueName: \"kubernetes.io/projected/beb98266-a4d9-46e1-884d-1317c4531e66-kube-api-access-tw2rp\") pod \"mariadb-client-2\" (UID: \"beb98266-a4d9-46e1-884d-1317c4531e66\") " pod="openstack/mariadb-client-2" Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.382818 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw2rp\" (UniqueName: \"kubernetes.io/projected/beb98266-a4d9-46e1-884d-1317c4531e66-kube-api-access-tw2rp\") pod \"mariadb-client-2\" (UID: \"beb98266-a4d9-46e1-884d-1317c4531e66\") " pod="openstack/mariadb-client-2" Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.419333 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw2rp\" (UniqueName: \"kubernetes.io/projected/beb98266-a4d9-46e1-884d-1317c4531e66-kube-api-access-tw2rp\") pod \"mariadb-client-2\" (UID: \"beb98266-a4d9-46e1-884d-1317c4531e66\") " pod="openstack/mariadb-client-2" Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.453019 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.511750 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48c1da5a443f528d5883f733426173ad6ca93579717b1cd8979ad1aa94e861f9" Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.511844 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.745769 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 05 22:06:43 crc kubenswrapper[4747]: W1205 22:06:43.775334 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbeb98266_a4d9_46e1_884d_1317c4531e66.slice/crio-981c309432b7e85af3dcb1a946990ad8e964477f10d409bb1720cb3708447c84 WatchSource:0}: Error finding container 981c309432b7e85af3dcb1a946990ad8e964477f10d409bb1720cb3708447c84: Status 404 returned error can't find the container with id 981c309432b7e85af3dcb1a946990ad8e964477f10d409bb1720cb3708447c84 Dec 05 22:06:43 crc kubenswrapper[4747]: I1205 22:06:43.852005 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d18ca0e-a9b4-43a4-8001-bc25204dc750" path="/var/lib/kubelet/pods/2d18ca0e-a9b4-43a4-8001-bc25204dc750/volumes" Dec 05 22:06:44 crc kubenswrapper[4747]: I1205 22:06:44.521375 4747 generic.go:334] "Generic (PLEG): container finished" podID="beb98266-a4d9-46e1-884d-1317c4531e66" containerID="2c32f3b8108412e754e83dba22532ece02fcd8dde771983f1aee6f529357d771" exitCode=0 Dec 05 22:06:44 crc kubenswrapper[4747]: I1205 22:06:44.521443 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"beb98266-a4d9-46e1-884d-1317c4531e66","Type":"ContainerDied","Data":"2c32f3b8108412e754e83dba22532ece02fcd8dde771983f1aee6f529357d771"} Dec 05 22:06:44 crc kubenswrapper[4747]: I1205 22:06:44.521516 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"beb98266-a4d9-46e1-884d-1317c4531e66","Type":"ContainerStarted","Data":"981c309432b7e85af3dcb1a946990ad8e964477f10d409bb1720cb3708447c84"} Dec 05 22:06:45 crc kubenswrapper[4747]: I1205 22:06:45.945341 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 05 22:06:45 crc kubenswrapper[4747]: I1205 22:06:45.965757 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_beb98266-a4d9-46e1-884d-1317c4531e66/mariadb-client-2/0.log" Dec 05 22:06:45 crc kubenswrapper[4747]: I1205 22:06:45.997463 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Dec 05 22:06:46 crc kubenswrapper[4747]: I1205 22:06:46.003965 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Dec 05 22:06:46 crc kubenswrapper[4747]: I1205 22:06:46.051424 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw2rp\" (UniqueName: \"kubernetes.io/projected/beb98266-a4d9-46e1-884d-1317c4531e66-kube-api-access-tw2rp\") pod \"beb98266-a4d9-46e1-884d-1317c4531e66\" (UID: \"beb98266-a4d9-46e1-884d-1317c4531e66\") " Dec 05 22:06:46 crc kubenswrapper[4747]: I1205 22:06:46.065935 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb98266-a4d9-46e1-884d-1317c4531e66-kube-api-access-tw2rp" (OuterVolumeSpecName: "kube-api-access-tw2rp") pod "beb98266-a4d9-46e1-884d-1317c4531e66" (UID: "beb98266-a4d9-46e1-884d-1317c4531e66"). InnerVolumeSpecName "kube-api-access-tw2rp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:06:46 crc kubenswrapper[4747]: I1205 22:06:46.152740 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw2rp\" (UniqueName: \"kubernetes.io/projected/beb98266-a4d9-46e1-884d-1317c4531e66-kube-api-access-tw2rp\") on node \"crc\" DevicePath \"\"" Dec 05 22:06:46 crc kubenswrapper[4747]: I1205 22:06:46.541143 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="981c309432b7e85af3dcb1a946990ad8e964477f10d409bb1720cb3708447c84" Dec 05 22:06:46 crc kubenswrapper[4747]: I1205 22:06:46.541192 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 05 22:06:47 crc kubenswrapper[4747]: I1205 22:06:47.876503 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beb98266-a4d9-46e1-884d-1317c4531e66" path="/var/lib/kubelet/pods/beb98266-a4d9-46e1-884d-1317c4531e66/volumes" Dec 05 22:06:55 crc kubenswrapper[4747]: I1205 22:06:55.840727 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:06:55 crc kubenswrapper[4747]: E1205 22:06:55.841552 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:07:06 crc kubenswrapper[4747]: I1205 22:07:06.839670 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:07:06 crc kubenswrapper[4747]: E1205 22:07:06.840500 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:07:17 crc kubenswrapper[4747]: I1205 22:07:17.839832 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:07:17 crc kubenswrapper[4747]: E1205 22:07:17.841089 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:07:29 crc kubenswrapper[4747]: I1205 22:07:29.853742 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:07:29 crc kubenswrapper[4747]: E1205 22:07:29.854654 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:07:40 crc kubenswrapper[4747]: I1205 22:07:40.840102 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:07:40 crc kubenswrapper[4747]: E1205 22:07:40.840960 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:07:52 crc kubenswrapper[4747]: I1205 22:07:52.840676 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:07:52 crc kubenswrapper[4747]: E1205 22:07:52.841680 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:08:07 crc kubenswrapper[4747]: I1205 22:08:07.840523 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:08:07 crc kubenswrapper[4747]: E1205 22:08:07.841567 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:08:18 crc kubenswrapper[4747]: I1205 22:08:18.840274 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:08:18 crc kubenswrapper[4747]: E1205 22:08:18.842172 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:08:28 crc kubenswrapper[4747]: I1205 22:08:28.850111 4747 scope.go:117] "RemoveContainer" containerID="7f3bc66e03b310691bf30ab718ddb49a687d78adcffda825bb1653b8ac7c51fe" Dec 05 22:08:31 crc kubenswrapper[4747]: I1205 22:08:31.839366 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:08:31 crc kubenswrapper[4747]: E1205 22:08:31.839830 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:08:43 crc kubenswrapper[4747]: I1205 22:08:43.841395 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:08:43 crc kubenswrapper[4747]: E1205 22:08:43.842640 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:08:56 crc kubenswrapper[4747]: I1205 22:08:56.840250 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:08:56 crc kubenswrapper[4747]: E1205 22:08:56.841063 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:09:10 crc kubenswrapper[4747]: I1205 22:09:10.839995 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:09:10 crc kubenswrapper[4747]: E1205 22:09:10.841191 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:09:25 crc kubenswrapper[4747]: I1205 22:09:25.840392 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:09:25 crc kubenswrapper[4747]: E1205 22:09:25.841500 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:09:37 crc kubenswrapper[4747]: I1205 22:09:37.840188 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:09:37 crc kubenswrapper[4747]: E1205 22:09:37.841540 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" 
podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:09:50 crc kubenswrapper[4747]: I1205 22:09:50.839644 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:09:50 crc kubenswrapper[4747]: E1205 22:09:50.840635 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:10:01 crc kubenswrapper[4747]: I1205 22:10:01.839992 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:10:01 crc kubenswrapper[4747]: E1205 22:10:01.840823 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.133040 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Dec 05 22:10:06 crc kubenswrapper[4747]: E1205 22:10:06.133385 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beb98266-a4d9-46e1-884d-1317c4531e66" containerName="mariadb-client-2" Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.133402 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb98266-a4d9-46e1-884d-1317c4531e66" containerName="mariadb-client-2" Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.133631 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="beb98266-a4d9-46e1-884d-1317c4531e66" containerName="mariadb-client-2" Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.134213 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.136651 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-7jmpv" Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.153525 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.177260 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmzng\" (UniqueName: \"kubernetes.io/projected/b7e91ce7-cd06-4b55-8544-096e74925dca-kube-api-access-mmzng\") pod \"mariadb-copy-data\" (UID: \"b7e91ce7-cd06-4b55-8544-096e74925dca\") " pod="openstack/mariadb-copy-data" Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.177531 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290\") pod \"mariadb-copy-data\" (UID: \"b7e91ce7-cd06-4b55-8544-096e74925dca\") " pod="openstack/mariadb-copy-data" Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.279264 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290\") pod \"mariadb-copy-data\" (UID: \"b7e91ce7-cd06-4b55-8544-096e74925dca\") " pod="openstack/mariadb-copy-data" Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.279551 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmzng\" (UniqueName: \"kubernetes.io/projected/b7e91ce7-cd06-4b55-8544-096e74925dca-kube-api-access-mmzng\") pod \"mariadb-copy-data\" (UID: \"b7e91ce7-cd06-4b55-8544-096e74925dca\") " pod="openstack/mariadb-copy-data" Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.283203 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.283270 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290\") pod \"mariadb-copy-data\" (UID: \"b7e91ce7-cd06-4b55-8544-096e74925dca\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/de24e65e7ec40f813a9afa7e7a04d7d273f25c3f190dfe181afa503a0c4fd56a/globalmount\"" pod="openstack/mariadb-copy-data"
Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.305901 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmzng\" (UniqueName: \"kubernetes.io/projected/b7e91ce7-cd06-4b55-8544-096e74925dca-kube-api-access-mmzng\") pod \"mariadb-copy-data\" (UID: \"b7e91ce7-cd06-4b55-8544-096e74925dca\") " pod="openstack/mariadb-copy-data"
Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.323869 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290\") pod \"mariadb-copy-data\" (UID: \"b7e91ce7-cd06-4b55-8544-096e74925dca\") " pod="openstack/mariadb-copy-data"
Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.453563 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Dec 05 22:10:06 crc kubenswrapper[4747]: I1205 22:10:06.800114 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Dec 05 22:10:07 crc kubenswrapper[4747]: I1205 22:10:07.513157 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"b7e91ce7-cd06-4b55-8544-096e74925dca","Type":"ContainerStarted","Data":"c54f925012dec54cc1d3123accfc64c0f8edc1343eb74cf1f0f2dcde19a4071d"}
Dec 05 22:10:07 crc kubenswrapper[4747]: I1205 22:10:07.513624 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"b7e91ce7-cd06-4b55-8544-096e74925dca","Type":"ContainerStarted","Data":"a3d07c686820eec840fa521463955d6781c73cfe9952da1c8e7045058955f956"}
Dec 05 22:10:07 crc kubenswrapper[4747]: I1205 22:10:07.552838 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.552809897 podStartE2EDuration="2.552809897s" podCreationTimestamp="2025-12-05 22:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:10:07.538071089 +0000 UTC m=+5278.005378617" watchObservedRunningTime="2025-12-05 22:10:07.552809897 +0000 UTC m=+5278.020117425"
Dec 05 22:10:10 crc kubenswrapper[4747]: I1205 22:10:10.225680 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Dec 05 22:10:10 crc kubenswrapper[4747]: I1205 22:10:10.227092 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 05 22:10:10 crc kubenswrapper[4747]: I1205 22:10:10.231646 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 05 22:10:10 crc kubenswrapper[4747]: I1205 22:10:10.339668 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7qtw\" (UniqueName: \"kubernetes.io/projected/f4caa599-38e2-4ffe-adca-b386d389a3c3-kube-api-access-b7qtw\") pod \"mariadb-client\" (UID: \"f4caa599-38e2-4ffe-adca-b386d389a3c3\") " pod="openstack/mariadb-client"
Dec 05 22:10:10 crc kubenswrapper[4747]: I1205 22:10:10.441327 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7qtw\" (UniqueName: \"kubernetes.io/projected/f4caa599-38e2-4ffe-adca-b386d389a3c3-kube-api-access-b7qtw\") pod \"mariadb-client\" (UID: \"f4caa599-38e2-4ffe-adca-b386d389a3c3\") " pod="openstack/mariadb-client"
Dec 05 22:10:10 crc kubenswrapper[4747]: I1205 22:10:10.474526 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7qtw\" (UniqueName: \"kubernetes.io/projected/f4caa599-38e2-4ffe-adca-b386d389a3c3-kube-api-access-b7qtw\") pod \"mariadb-client\" (UID: \"f4caa599-38e2-4ffe-adca-b386d389a3c3\") " pod="openstack/mariadb-client"
Dec 05 22:10:10 crc kubenswrapper[4747]: I1205 22:10:10.563417 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 05 22:10:10 crc kubenswrapper[4747]: I1205 22:10:10.997163 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 05 22:10:11 crc kubenswrapper[4747]: W1205 22:10:11.005668 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4caa599_38e2_4ffe_adca_b386d389a3c3.slice/crio-691a5a27eb02d9a5efe05f5953e3fa8d5b2b7c9996aa11418891a8b89a8a339a WatchSource:0}: Error finding container 691a5a27eb02d9a5efe05f5953e3fa8d5b2b7c9996aa11418891a8b89a8a339a: Status 404 returned error can't find the container with id 691a5a27eb02d9a5efe05f5953e3fa8d5b2b7c9996aa11418891a8b89a8a339a
Dec 05 22:10:11 crc kubenswrapper[4747]: I1205 22:10:11.559195 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4caa599-38e2-4ffe-adca-b386d389a3c3" containerID="d8acb77392364a05b83238efcea7f01c5f41fd9ab659abc531a47440e7e20951" exitCode=0
Dec 05 22:10:11 crc kubenswrapper[4747]: I1205 22:10:11.559305 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f4caa599-38e2-4ffe-adca-b386d389a3c3","Type":"ContainerDied","Data":"d8acb77392364a05b83238efcea7f01c5f41fd9ab659abc531a47440e7e20951"}
Dec 05 22:10:11 crc kubenswrapper[4747]: I1205 22:10:11.559525 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f4caa599-38e2-4ffe-adca-b386d389a3c3","Type":"ContainerStarted","Data":"691a5a27eb02d9a5efe05f5953e3fa8d5b2b7c9996aa11418891a8b89a8a339a"}
Dec 05 22:10:12 crc kubenswrapper[4747]: I1205 22:10:12.843068 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804"
Dec 05 22:10:12 crc kubenswrapper[4747]: E1205 22:10:12.843507 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 22:10:12 crc kubenswrapper[4747]: I1205 22:10:12.940202 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 05 22:10:12 crc kubenswrapper[4747]: I1205 22:10:12.960020 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_f4caa599-38e2-4ffe-adca-b386d389a3c3/mariadb-client/0.log"
Dec 05 22:10:12 crc kubenswrapper[4747]: I1205 22:10:12.990498 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Dec 05 22:10:12 crc kubenswrapper[4747]: I1205 22:10:12.995870 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.107891 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.108228 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7qtw\" (UniqueName: \"kubernetes.io/projected/f4caa599-38e2-4ffe-adca-b386d389a3c3-kube-api-access-b7qtw\") pod \"f4caa599-38e2-4ffe-adca-b386d389a3c3\" (UID: \"f4caa599-38e2-4ffe-adca-b386d389a3c3\") "
Dec 05 22:10:13 crc kubenswrapper[4747]: E1205 22:10:13.108344 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4caa599-38e2-4ffe-adca-b386d389a3c3" containerName="mariadb-client"
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.108366 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4caa599-38e2-4ffe-adca-b386d389a3c3" containerName="mariadb-client"
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.108572 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4caa599-38e2-4ffe-adca-b386d389a3c3" containerName="mariadb-client"
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.109227 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.118984 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.126434 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4caa599-38e2-4ffe-adca-b386d389a3c3-kube-api-access-b7qtw" (OuterVolumeSpecName: "kube-api-access-b7qtw") pod "f4caa599-38e2-4ffe-adca-b386d389a3c3" (UID: "f4caa599-38e2-4ffe-adca-b386d389a3c3"). InnerVolumeSpecName "kube-api-access-b7qtw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.210681 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7qtw\" (UniqueName: \"kubernetes.io/projected/f4caa599-38e2-4ffe-adca-b386d389a3c3-kube-api-access-b7qtw\") on node \"crc\" DevicePath \"\""
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.312434 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z8bk\" (UniqueName: \"kubernetes.io/projected/f35ff3c5-a813-42f7-840b-0bc486166b84-kube-api-access-7z8bk\") pod \"mariadb-client\" (UID: \"f35ff3c5-a813-42f7-840b-0bc486166b84\") " pod="openstack/mariadb-client"
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.414911 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z8bk\" (UniqueName: \"kubernetes.io/projected/f35ff3c5-a813-42f7-840b-0bc486166b84-kube-api-access-7z8bk\") pod \"mariadb-client\" (UID: \"f35ff3c5-a813-42f7-840b-0bc486166b84\") " pod="openstack/mariadb-client"
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.440324 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z8bk\" (UniqueName: \"kubernetes.io/projected/f35ff3c5-a813-42f7-840b-0bc486166b84-kube-api-access-7z8bk\") pod \"mariadb-client\" (UID: \"f35ff3c5-a813-42f7-840b-0bc486166b84\") " pod="openstack/mariadb-client"
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.466643 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.594293 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="691a5a27eb02d9a5efe05f5953e3fa8d5b2b7c9996aa11418891a8b89a8a339a"
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.594347 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.621280 4747 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="f4caa599-38e2-4ffe-adca-b386d389a3c3" podUID="f35ff3c5-a813-42f7-840b-0bc486166b84"
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.855775 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4caa599-38e2-4ffe-adca-b386d389a3c3" path="/var/lib/kubelet/pods/f4caa599-38e2-4ffe-adca-b386d389a3c3/volumes"
Dec 05 22:10:13 crc kubenswrapper[4747]: I1205 22:10:13.925843 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 05 22:10:14 crc kubenswrapper[4747]: I1205 22:10:14.606430 4747 generic.go:334] "Generic (PLEG): container finished" podID="f35ff3c5-a813-42f7-840b-0bc486166b84" containerID="eacb542e8f2034bdcddaec2929d18cd2cab1939319d1010cc83108837e6d726f" exitCode=0
Dec 05 22:10:14 crc kubenswrapper[4747]: I1205 22:10:14.606524 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f35ff3c5-a813-42f7-840b-0bc486166b84","Type":"ContainerDied","Data":"eacb542e8f2034bdcddaec2929d18cd2cab1939319d1010cc83108837e6d726f"}
Dec 05 22:10:14 crc kubenswrapper[4747]: I1205 22:10:14.606779 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f35ff3c5-a813-42f7-840b-0bc486166b84","Type":"ContainerStarted","Data":"2e1cfa41fff39902a6792a99ce5831b97ebb80472b37f1a6bc48b96aa5219ae4"}
Dec 05 22:10:15 crc kubenswrapper[4747]: I1205 22:10:15.940717 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 05 22:10:15 crc kubenswrapper[4747]: I1205 22:10:15.964687 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_f35ff3c5-a813-42f7-840b-0bc486166b84/mariadb-client/0.log"
Dec 05 22:10:16 crc kubenswrapper[4747]: I1205 22:10:16.003357 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Dec 05 22:10:16 crc kubenswrapper[4747]: I1205 22:10:16.015628 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Dec 05 22:10:16 crc kubenswrapper[4747]: I1205 22:10:16.054868 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z8bk\" (UniqueName: \"kubernetes.io/projected/f35ff3c5-a813-42f7-840b-0bc486166b84-kube-api-access-7z8bk\") pod \"f35ff3c5-a813-42f7-840b-0bc486166b84\" (UID: \"f35ff3c5-a813-42f7-840b-0bc486166b84\") "
Dec 05 22:10:16 crc kubenswrapper[4747]: I1205 22:10:16.060815 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f35ff3c5-a813-42f7-840b-0bc486166b84-kube-api-access-7z8bk" (OuterVolumeSpecName: "kube-api-access-7z8bk") pod "f35ff3c5-a813-42f7-840b-0bc486166b84" (UID: "f35ff3c5-a813-42f7-840b-0bc486166b84"). InnerVolumeSpecName "kube-api-access-7z8bk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 22:10:16 crc kubenswrapper[4747]: I1205 22:10:16.156711 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z8bk\" (UniqueName: \"kubernetes.io/projected/f35ff3c5-a813-42f7-840b-0bc486166b84-kube-api-access-7z8bk\") on node \"crc\" DevicePath \"\""
Dec 05 22:10:16 crc kubenswrapper[4747]: I1205 22:10:16.626389 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e1cfa41fff39902a6792a99ce5831b97ebb80472b37f1a6bc48b96aa5219ae4"
Dec 05 22:10:16 crc kubenswrapper[4747]: I1205 22:10:16.626458 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 05 22:10:17 crc kubenswrapper[4747]: I1205 22:10:17.858800 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f35ff3c5-a813-42f7-840b-0bc486166b84" path="/var/lib/kubelet/pods/f35ff3c5-a813-42f7-840b-0bc486166b84/volumes"
Dec 05 22:10:27 crc kubenswrapper[4747]: I1205 22:10:27.839729 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804"
Dec 05 22:10:27 crc kubenswrapper[4747]: E1205 22:10:27.841590 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 22:10:42 crc kubenswrapper[4747]: I1205 22:10:42.839510 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804"
Dec 05 22:10:43 crc kubenswrapper[4747]: E1205 22:10:43.571011 4747 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.22:36928->38.102.83.22:39817: read tcp 38.102.83.22:36928->38.102.83.22:39817: read: connection reset by peer
Dec 05 22:10:43 crc kubenswrapper[4747]: I1205 22:10:43.862918 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"8d02af375e89c74f19ccf61ae35fee2eef75c6582bfac65f00c356eb657c1f9d"}
Dec 05 22:10:48 crc kubenswrapper[4747]: I1205 22:10:48.967010 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 05 22:10:48 crc kubenswrapper[4747]: E1205 22:10:48.969517 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f35ff3c5-a813-42f7-840b-0bc486166b84" containerName="mariadb-client"
Dec 05 22:10:48 crc kubenswrapper[4747]: I1205 22:10:48.969631 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f35ff3c5-a813-42f7-840b-0bc486166b84" containerName="mariadb-client"
Dec 05 22:10:48 crc kubenswrapper[4747]: I1205 22:10:48.970205 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f35ff3c5-a813-42f7-840b-0bc486166b84" containerName="mariadb-client"
Dec 05 22:10:48 crc kubenswrapper[4747]: I1205 22:10:48.972159 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:48 crc kubenswrapper[4747]: I1205 22:10:48.976817 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 05 22:10:48 crc kubenswrapper[4747]: I1205 22:10:48.976968 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Dec 05 22:10:48 crc kubenswrapper[4747]: I1205 22:10:48.977097 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-m44ck"
Dec 05 22:10:48 crc kubenswrapper[4747]: I1205 22:10:48.977290 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Dec 05 22:10:48 crc kubenswrapper[4747]: I1205 22:10:48.977382 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Dec 05 22:10:48 crc kubenswrapper[4747]: I1205 22:10:48.977517 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Dec 05 22:10:48 crc kubenswrapper[4747]: I1205 22:10:48.982287 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"]
Dec 05 22:10:48 crc kubenswrapper[4747]: I1205 22:10:48.983833 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:48 crc kubenswrapper[4747]: I1205 22:10:48.996142 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"]
Dec 05 22:10:48 crc kubenswrapper[4747]: I1205 22:10:48.999279 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.006760 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.013385 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.089890 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5c17dd19-c351-4e05-b7be-ff471da69382-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.089946 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91e4355-8711-4c53-9c55-273033340752-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.089975 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f91e4355-8711-4c53-9c55-273033340752-config\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.089994 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f081a03e-60c9-41c7-8b24-255114a2998e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090132 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c17dd19-c351-4e05-b7be-ff471da69382-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090184 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f081a03e-60c9-41c7-8b24-255114a2998e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090291 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c998b3a4-4850-463a-b074-4eb133f3b890\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c998b3a4-4850-463a-b074-4eb133f3b890\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090322 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fe0e67a8-341b-4c86-b374-a14aa88e6fb6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe0e67a8-341b-4c86-b374-a14aa88e6fb6\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090345 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f91e4355-8711-4c53-9c55-273033340752-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090365 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91e4355-8711-4c53-9c55-273033340752-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090390 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f081a03e-60c9-41c7-8b24-255114a2998e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090415 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f081a03e-60c9-41c7-8b24-255114a2998e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090553 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f91e4355-8711-4c53-9c55-273033340752-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090607 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c17dd19-c351-4e05-b7be-ff471da69382-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090670 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtjdn\" (UniqueName: \"kubernetes.io/projected/f081a03e-60c9-41c7-8b24-255114a2998e-kube-api-access-wtjdn\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090697 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w7h6\" (UniqueName: \"kubernetes.io/projected/5c17dd19-c351-4e05-b7be-ff471da69382-kube-api-access-2w7h6\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090817 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c17dd19-c351-4e05-b7be-ff471da69382-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090918 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ae10fd71-5bf5-4775-93bf-e7750f30439a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae10fd71-5bf5-4775-93bf-e7750f30439a\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090965 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c17dd19-c351-4e05-b7be-ff471da69382-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.090996 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f081a03e-60c9-41c7-8b24-255114a2998e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.091027 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2bzn\" (UniqueName: \"kubernetes.io/projected/f91e4355-8711-4c53-9c55-273033340752-kube-api-access-m2bzn\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.091052 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c17dd19-c351-4e05-b7be-ff471da69382-config\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.091129 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f081a03e-60c9-41c7-8b24-255114a2998e-config\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.091169 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91e4355-8711-4c53-9c55-273033340752-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193268 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2bzn\" (UniqueName: \"kubernetes.io/projected/f91e4355-8711-4c53-9c55-273033340752-kube-api-access-m2bzn\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193326 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c17dd19-c351-4e05-b7be-ff471da69382-config\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193366 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f081a03e-60c9-41c7-8b24-255114a2998e-config\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193391 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91e4355-8711-4c53-9c55-273033340752-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193418 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5c17dd19-c351-4e05-b7be-ff471da69382-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193445 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91e4355-8711-4c53-9c55-273033340752-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193472 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f91e4355-8711-4c53-9c55-273033340752-config\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193496 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f081a03e-60c9-41c7-8b24-255114a2998e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193528 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c17dd19-c351-4e05-b7be-ff471da69382-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193549 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f081a03e-60c9-41c7-8b24-255114a2998e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193606 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c998b3a4-4850-463a-b074-4eb133f3b890\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c998b3a4-4850-463a-b074-4eb133f3b890\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193631 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fe0e67a8-341b-4c86-b374-a14aa88e6fb6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe0e67a8-341b-4c86-b374-a14aa88e6fb6\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193654 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f91e4355-8711-4c53-9c55-273033340752-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193674 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f081a03e-60c9-41c7-8b24-255114a2998e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193698 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91e4355-8711-4c53-9c55-273033340752-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193721 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f081a03e-60c9-41c7-8b24-255114a2998e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193756 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f91e4355-8711-4c53-9c55-273033340752-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193777 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c17dd19-c351-4e05-b7be-ff471da69382-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193810 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtjdn\" (UniqueName: \"kubernetes.io/projected/f081a03e-60c9-41c7-8b24-255114a2998e-kube-api-access-wtjdn\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193834 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w7h6\" (UniqueName: \"kubernetes.io/projected/5c17dd19-c351-4e05-b7be-ff471da69382-kube-api-access-2w7h6\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193875 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c17dd19-c351-4e05-b7be-ff471da69382-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193918 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ae10fd71-5bf5-4775-93bf-e7750f30439a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae10fd71-5bf5-4775-93bf-e7750f30439a\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193945 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c17dd19-c351-4e05-b7be-ff471da69382-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.193967 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f081a03e-60c9-41c7-8b24-255114a2998e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.194832 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c17dd19-c351-4e05-b7be-ff471da69382-config\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.194950 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f081a03e-60c9-41c7-8b24-255114a2998e-config\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.195424 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5c17dd19-c351-4e05-b7be-ff471da69382-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.195532 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f91e4355-8711-4c53-9c55-273033340752-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.195765 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f081a03e-60c9-41c7-8b24-255114a2998e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.198210 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f91e4355-8711-4c53-9c55-273033340752-config\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.198617 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f081a03e-60c9-41c7-8b24-255114a2998e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.198634 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f91e4355-8711-4c53-9c55-273033340752-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.200384 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c17dd19-c351-4e05-b7be-ff471da69382-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.200648 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.200711 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fe0e67a8-341b-4c86-b374-a14aa88e6fb6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe0e67a8-341b-4c86-b374-a14aa88e6fb6\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/add53879a899b64fe04d83bef405af6e4abb83fabc33119ec2c61e18144bed76/globalmount\"" pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.201224 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.201339 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c998b3a4-4850-463a-b074-4eb133f3b890\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c998b3a4-4850-463a-b074-4eb133f3b890\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/75d31c0866b3f77ad6e827787f00e61311b7d131bfa6908d17ab937beff81727/globalmount\"" pod="openstack/ovsdbserver-sb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.201725 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.201735 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f081a03e-60c9-41c7-8b24-255114a2998e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.201778 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ae10fd71-5bf5-4775-93bf-e7750f30439a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae10fd71-5bf5-4775-93bf-e7750f30439a\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e7a23eb969081800aa7959c031703d210ab5a68a99c44d9a179d8f66b5f2db7d/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.203236 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91e4355-8711-4c53-9c55-273033340752-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.205013 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c17dd19-c351-4e05-b7be-ff471da69382-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.207270 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f081a03e-60c9-41c7-8b24-255114a2998e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.208075 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91e4355-8711-4c53-9c55-273033340752-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.211051 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c17dd19-c351-4e05-b7be-ff471da69382-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " 
pod="openstack/ovsdbserver-sb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.211757 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c17dd19-c351-4e05-b7be-ff471da69382-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.213986 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f081a03e-60c9-41c7-8b24-255114a2998e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.214089 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2bzn\" (UniqueName: \"kubernetes.io/projected/f91e4355-8711-4c53-9c55-273033340752-kube-api-access-m2bzn\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.214355 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91e4355-8711-4c53-9c55-273033340752-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.215383 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w7h6\" (UniqueName: \"kubernetes.io/projected/5c17dd19-c351-4e05-b7be-ff471da69382-kube-api-access-2w7h6\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.216005 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtjdn\" (UniqueName: \"kubernetes.io/projected/f081a03e-60c9-41c7-8b24-255114a2998e-kube-api-access-wtjdn\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.236529 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fe0e67a8-341b-4c86-b374-a14aa88e6fb6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe0e67a8-341b-4c86-b374-a14aa88e6fb6\") pod \"ovsdbserver-sb-1\" (UID: \"f91e4355-8711-4c53-9c55-273033340752\") " pod="openstack/ovsdbserver-sb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.236644 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c998b3a4-4850-463a-b074-4eb133f3b890\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c998b3a4-4850-463a-b074-4eb133f3b890\") pod \"ovsdbserver-sb-2\" (UID: \"5c17dd19-c351-4e05-b7be-ff471da69382\") " pod="openstack/ovsdbserver-sb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.251049 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ae10fd71-5bf5-4775-93bf-e7750f30439a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae10fd71-5bf5-4775-93bf-e7750f30439a\") pod \"ovsdbserver-sb-0\" (UID: \"f081a03e-60c9-41c7-8b24-255114a2998e\") " pod="openstack/ovsdbserver-sb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.334320 4747 util.go:30] "No sandbox for pod can 
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.342643 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.345076 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.349408 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.349857 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-fxbc9"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.350252 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.351728 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.354327 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.360736 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.361075 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.384531 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"]
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.386107 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.393488 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"]
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.394747 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.396839 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95df327-27e4-4201-b4b9-c8f55bdff2a4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.396913 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e95df327-27e4-4201-b4b9-c8f55bdff2a4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.396941 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95df327-27e4-4201-b4b9-c8f55bdff2a4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.396958 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95df327-27e4-4201-b4b9-c8f55bdff2a4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.397034 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e95df327-27e4-4201-b4b9-c8f55bdff2a4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.397087 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7f13a0c0-91c9-4b69-aaf1-b5778d023e81\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7f13a0c0-91c9-4b69-aaf1-b5778d023e81\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.397116 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95df327-27e4-4201-b4b9-c8f55bdff2a4-config\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.397131 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hswcf\" (UniqueName: \"kubernetes.io/projected/e95df327-27e4-4201-b4b9-c8f55bdff2a4-kube-api-access-hswcf\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.407360 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.419638 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498421 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0b8e311c-1fd8-4819-b5a1-97fb5d9a317d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b8e311c-1fd8-4819-b5a1-97fb5d9a317d\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498464 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcb13537-396f-464d-8bf2-22a7cf04c8bc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498487 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cebaf6d4-544c-4d13-beda-2c5231a92d0c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498512 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxzf6\" (UniqueName: \"kubernetes.io/projected/cebaf6d4-544c-4d13-beda-2c5231a92d0c-kube-api-access-sxzf6\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498530 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cebaf6d4-544c-4d13-beda-2c5231a92d0c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498617 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cebaf6d4-544c-4d13-beda-2c5231a92d0c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498645 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cebaf6d4-544c-4d13-beda-2c5231a92d0c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498667 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bcb13537-396f-464d-8bf2-22a7cf04c8bc-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498686 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcb13537-396f-464d-8bf2-22a7cf04c8bc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498741 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cebaf6d4-544c-4d13-beda-2c5231a92d0c-config\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498767 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95df327-27e4-4201-b4b9-c8f55bdff2a4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498821 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb13537-396f-464d-8bf2-22a7cf04c8bc-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498859 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e95df327-27e4-4201-b4b9-c8f55bdff2a4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498882 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95df327-27e4-4201-b4b9-c8f55bdff2a4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498898 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95df327-27e4-4201-b4b9-c8f55bdff2a4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498920 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcb13537-396f-464d-8bf2-22a7cf04c8bc-config\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498941 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e95df327-27e4-4201-b4b9-c8f55bdff2a4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.498981 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bcb13537-396f-464d-8bf2-22a7cf04c8bc-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.499004 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7f13a0c0-91c9-4b69-aaf1-b5778d023e81\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7f13a0c0-91c9-4b69-aaf1-b5778d023e81\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0"
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7f13a0c0-91c9-4b69-aaf1-b5778d023e81\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.499028 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcrqq\" (UniqueName: \"kubernetes.io/projected/bcb13537-396f-464d-8bf2-22a7cf04c8bc-kube-api-access-gcrqq\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.499042 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cebaf6d4-544c-4d13-beda-2c5231a92d0c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.499062 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-46e1d46d-0622-4df5-9829-8c7a84858c61\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46e1d46d-0622-4df5-9829-8c7a84858c61\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.499079 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95df327-27e4-4201-b4b9-c8f55bdff2a4-config\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.499096 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hswcf\" (UniqueName: \"kubernetes.io/projected/e95df327-27e4-4201-b4b9-c8f55bdff2a4-kube-api-access-hswcf\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.500198 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e95df327-27e4-4201-b4b9-c8f55bdff2a4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.500574 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95df327-27e4-4201-b4b9-c8f55bdff2a4-config\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.501473 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e95df327-27e4-4201-b4b9-c8f55bdff2a4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.503683 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95df327-27e4-4201-b4b9-c8f55bdff2a4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:49 
crc kubenswrapper[4747]: I1205 22:10:49.503760 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.503784 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7f13a0c0-91c9-4b69-aaf1-b5778d023e81\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7f13a0c0-91c9-4b69-aaf1-b5778d023e81\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ed0804dc2ec61a2e06ee15cf20d09588661de7511548ce1e8de67f4cce0fc85d/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.505439 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95df327-27e4-4201-b4b9-c8f55bdff2a4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.506873 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95df327-27e4-4201-b4b9-c8f55bdff2a4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.523638 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hswcf\" (UniqueName: \"kubernetes.io/projected/e95df327-27e4-4201-b4b9-c8f55bdff2a4-kube-api-access-hswcf\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.550272 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7f13a0c0-91c9-4b69-aaf1-b5778d023e81\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7f13a0c0-91c9-4b69-aaf1-b5778d023e81\") pod \"ovsdbserver-nb-0\" (UID: \"e95df327-27e4-4201-b4b9-c8f55bdff2a4\") " pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.600757 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cebaf6d4-544c-4d13-beda-2c5231a92d0c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.600814 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bcb13537-396f-464d-8bf2-22a7cf04c8bc-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.600841 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcb13537-396f-464d-8bf2-22a7cf04c8bc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.600866 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cebaf6d4-544c-4d13-beda-2c5231a92d0c-config\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.600903 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb13537-396f-464d-8bf2-22a7cf04c8bc-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.600942 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcb13537-396f-464d-8bf2-22a7cf04c8bc-config\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.601000 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bcb13537-396f-464d-8bf2-22a7cf04c8bc-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.601062 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcrqq\" (UniqueName: \"kubernetes.io/projected/bcb13537-396f-464d-8bf2-22a7cf04c8bc-kube-api-access-gcrqq\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.601083 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cebaf6d4-544c-4d13-beda-2c5231a92d0c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.601166 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-46e1d46d-0622-4df5-9829-8c7a84858c61\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46e1d46d-0622-4df5-9829-8c7a84858c61\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.601200 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0b8e311c-1fd8-4819-b5a1-97fb5d9a317d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b8e311c-1fd8-4819-b5a1-97fb5d9a317d\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.601231 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcb13537-396f-464d-8bf2-22a7cf04c8bc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.601259 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cebaf6d4-544c-4d13-beda-2c5231a92d0c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " 
pod="openstack/ovsdbserver-nb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.601290 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxzf6\" (UniqueName: \"kubernetes.io/projected/cebaf6d4-544c-4d13-beda-2c5231a92d0c-kube-api-access-sxzf6\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.601312 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cebaf6d4-544c-4d13-beda-2c5231a92d0c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.601343 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cebaf6d4-544c-4d13-beda-2c5231a92d0c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.601870 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cebaf6d4-544c-4d13-beda-2c5231a92d0c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.603048 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cebaf6d4-544c-4d13-beda-2c5231a92d0c-config\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.603374 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcb13537-396f-464d-8bf2-22a7cf04c8bc-config\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.603574 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bcb13537-396f-464d-8bf2-22a7cf04c8bc-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.603593 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bcb13537-396f-464d-8bf2-22a7cf04c8bc-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.604013 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcb13537-396f-464d-8bf2-22a7cf04c8bc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.605735 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.605761 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-46e1d46d-0622-4df5-9829-8c7a84858c61\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46e1d46d-0622-4df5-9829-8c7a84858c61\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dfbed52809d342154ca24a5d9a694a9da8c14bddced4a88f23c5c81f2c354389/globalmount\"" pod="openstack/ovsdbserver-nb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.605939 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.605964 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0b8e311c-1fd8-4819-b5a1-97fb5d9a317d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b8e311c-1fd8-4819-b5a1-97fb5d9a317d\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e79d047414912d4ee591511684801711b8249c94ff4385bbc1231ebeb418798f/globalmount\"" pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.606045 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cebaf6d4-544c-4d13-beda-2c5231a92d0c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.606605 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cebaf6d4-544c-4d13-beda-2c5231a92d0c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.607052 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb13537-396f-464d-8bf2-22a7cf04c8bc-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.607198 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cebaf6d4-544c-4d13-beda-2c5231a92d0c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.607951 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcb13537-396f-464d-8bf2-22a7cf04c8bc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.611093 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cebaf6d4-544c-4d13-beda-2c5231a92d0c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.620921 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcrqq\" (UniqueName: \"kubernetes.io/projected/bcb13537-396f-464d-8bf2-22a7cf04c8bc-kube-api-access-gcrqq\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.625082 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxzf6\" (UniqueName: \"kubernetes.io/projected/cebaf6d4-544c-4d13-beda-2c5231a92d0c-kube-api-access-sxzf6\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.630777 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0b8e311c-1fd8-4819-b5a1-97fb5d9a317d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b8e311c-1fd8-4819-b5a1-97fb5d9a317d\") pod \"ovsdbserver-nb-2\" (UID: \"cebaf6d4-544c-4d13-beda-2c5231a92d0c\") " pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.631329 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-46e1d46d-0622-4df5-9829-8c7a84858c61\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46e1d46d-0622-4df5-9829-8c7a84858c61\") pod \"ovsdbserver-nb-1\" (UID: \"bcb13537-396f-464d-8bf2-22a7cf04c8bc\") " pod="openstack/ovsdbserver-nb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.683996 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.835022 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.846386 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Dec 05 22:10:49 crc kubenswrapper[4747]: I1205 22:10:49.918377 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.033413 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.161012 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 05 22:10:50 crc kubenswrapper[4747]: W1205 22:10:50.166144 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode95df327_27e4_4201_b4b9_c8f55bdff2a4.slice/crio-5aada3c19c96c4a8524ec9331cae9d64a4e713b7cb0ce534053b7bc6e3eb53da WatchSource:0}: Error finding container 5aada3c19c96c4a8524ec9331cae9d64a4e713b7cb0ce534053b7bc6e3eb53da: Status 404 returned error can't find the container with id 5aada3c19c96c4a8524ec9331cae9d64a4e713b7cb0ce534053b7bc6e3eb53da
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.417491 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.696534 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.922920 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"bcb13537-396f-464d-8bf2-22a7cf04c8bc","Type":"ContainerStarted","Data":"a9fb816715a52d98bd791813c6dfd49ecdd458bd634e7d2d56ce9011d49c8070"}
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.923167 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"bcb13537-396f-464d-8bf2-22a7cf04c8bc","Type":"ContainerStarted","Data":"da35c9ae5c7a7f1335140a6710012b1a490504d2915e2dce5a86c5091a448864"}
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.923177 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"bcb13537-396f-464d-8bf2-22a7cf04c8bc","Type":"ContainerStarted","Data":"06dcced4b64e52ee773764258f9f5ccbd75f991d809bcd94b29022a9bc146205"}
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.925190 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"5c17dd19-c351-4e05-b7be-ff471da69382","Type":"ContainerStarted","Data":"d8560f5cf48e080a7bae6431b32c7962adac4894d12ac23f04d8b935e10102f6"}
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.925218 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"5c17dd19-c351-4e05-b7be-ff471da69382","Type":"ContainerStarted","Data":"ed3ce110e30a121bf08ec7dfa33afe981406194b2a8a633487dd95ed8ce38e1a"}
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.925227 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"5c17dd19-c351-4e05-b7be-ff471da69382","Type":"ContainerStarted","Data":"524ba01bfdf60d74ddb6a5fb3df3ef3f63dd8973e252d7f45b36656ea2a17ed6"}
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.928684 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e95df327-27e4-4201-b4b9-c8f55bdff2a4","Type":"ContainerStarted","Data":"1094045eee936a886d87edcb513fd56c9421786771037d4c23e98c17afe1e525"}
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.928725 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e95df327-27e4-4201-b4b9-c8f55bdff2a4","Type":"ContainerStarted","Data":"62025d56a11db2d32d4deaa674c5ab6f417584cb441d6cb1b70b06d360b9dda3"}
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.928733 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e95df327-27e4-4201-b4b9-c8f55bdff2a4","Type":"ContainerStarted","Data":"5aada3c19c96c4a8524ec9331cae9d64a4e713b7cb0ce534053b7bc6e3eb53da"}
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.931336 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"f91e4355-8711-4c53-9c55-273033340752","Type":"ContainerStarted","Data":"1b6413f6d9341fdcf6e24cbee69b54431b4c44088dff3f7eaddb3a77a405b691"}
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.931383 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"f91e4355-8711-4c53-9c55-273033340752","Type":"ContainerStarted","Data":"8953b5bacb8475a611a5856dbaa8ea3cc0ea7702c56c8a79051915c93be58e83"}
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.931399 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"f91e4355-8711-4c53-9c55-273033340752","Type":"ContainerStarted","Data":"7791a1c3fb248b123fe1b6a80b9380b194e1dd580e6ec757510b8cb9b78d7e48"}
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.932560 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f081a03e-60c9-41c7-8b24-255114a2998e","Type":"ContainerStarted","Data":"89a06536a91130300aa4d4aeac2ad364a27a6421e8d229998ecdcbfdb06f269a"}
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.932656 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f081a03e-60c9-41c7-8b24-255114a2998e","Type":"ContainerStarted","Data":"0f6cf65410362bd88f1acc975aba7a4f8a68552b6043830a344b744ec7882167"}
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.940333 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=2.940313778 podStartE2EDuration="2.940313778s" podCreationTimestamp="2025-12-05 22:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:10:50.938217865 +0000 UTC m=+5321.405525363" watchObservedRunningTime="2025-12-05 22:10:50.940313778 +0000 UTC m=+5321.407621266"
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.957175 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.957154778 podStartE2EDuration="3.957154778s" podCreationTimestamp="2025-12-05 22:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:10:50.956657236 +0000 UTC m=+5321.423964744" watchObservedRunningTime="2025-12-05 22:10:50.957154778 +0000 UTC m=+5321.424462266"
Dec 05 22:10:50 crc kubenswrapper[4747]: I1205 22:10:50.976494 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.9764766209999998 podStartE2EDuration="3.976476621s" podCreationTimestamp="2025-12-05 22:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:10:50.971216159 +0000 UTC m=+5321.438523657" watchObservedRunningTime="2025-12-05 22:10:50.976476621 +0000 UTC m=+5321.443784119"
00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:10:50.971216159 +0000 UTC m=+5321.438523657" watchObservedRunningTime="2025-12-05 22:10:50.976476621 +0000 UTC m=+5321.443784119" Dec 05 22:10:51 crc kubenswrapper[4747]: I1205 22:10:51.374271 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.374249296 podStartE2EDuration="3.374249296s" podCreationTimestamp="2025-12-05 22:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:10:50.989882346 +0000 UTC m=+5321.457189844" watchObservedRunningTime="2025-12-05 22:10:51.374249296 +0000 UTC m=+5321.841556794" Dec 05 22:10:51 crc kubenswrapper[4747]: I1205 22:10:51.378992 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 05 22:10:51 crc kubenswrapper[4747]: W1205 22:10:51.386673 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcebaf6d4_544c_4d13_beda_2c5231a92d0c.slice/crio-6f20d4ceab9ab280c8ece9315173872ab5d0f272db5f1844901f6c6e261f509d WatchSource:0}: Error finding container 6f20d4ceab9ab280c8ece9315173872ab5d0f272db5f1844901f6c6e261f509d: Status 404 returned error can't find the container with id 6f20d4ceab9ab280c8ece9315173872ab5d0f272db5f1844901f6c6e261f509d Dec 05 22:10:51 crc kubenswrapper[4747]: I1205 22:10:51.941146 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"cebaf6d4-544c-4d13-beda-2c5231a92d0c","Type":"ContainerStarted","Data":"b9790c364c7e2e46e407c615fb1690d2f295f56fabefe7815c171862a02c4449"} Dec 05 22:10:51 crc kubenswrapper[4747]: I1205 22:10:51.941441 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"cebaf6d4-544c-4d13-beda-2c5231a92d0c","Type":"ContainerStarted","Data":"fe75b3df0b64e6da7467995ae522fc64296488e44d8b6a9ac34945302528fdf7"} Dec 05 22:10:51 crc kubenswrapper[4747]: I1205 22:10:51.941451 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"cebaf6d4-544c-4d13-beda-2c5231a92d0c","Type":"ContainerStarted","Data":"6f20d4ceab9ab280c8ece9315173872ab5d0f272db5f1844901f6c6e261f509d"} Dec 05 22:10:51 crc kubenswrapper[4747]: I1205 22:10:51.943970 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f081a03e-60c9-41c7-8b24-255114a2998e","Type":"ContainerStarted","Data":"cf84c07022d06c9ab6493389758cea9cce1ef8acc78f59475e3dccb363163d00"} Dec 05 22:10:51 crc kubenswrapper[4747]: I1205 22:10:51.967209 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.967192426 podStartE2EDuration="3.967192426s" podCreationTimestamp="2025-12-05 22:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:10:51.960636342 +0000 UTC m=+5322.427943920" watchObservedRunningTime="2025-12-05 22:10:51.967192426 +0000 UTC m=+5322.434499924" Dec 05 22:10:51 crc kubenswrapper[4747]: I1205 22:10:51.986792 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.986777505 podStartE2EDuration="4.986777505s" podCreationTimestamp="2025-12-05 22:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:10:51.984610861 +0000 UTC m=+5322.451918369" watchObservedRunningTime="2025-12-05 22:10:51.986777505 +0000 UTC m=+5322.454084983" Dec 05 22:10:52 crc kubenswrapper[4747]: I1205 22:10:52.335098 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 05 22:10:52 crc kubenswrapper[4747]: I1205 22:10:52.350856 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Dec 05 22:10:52 crc kubenswrapper[4747]: I1205 22:10:52.362019 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Dec 05 22:10:52 crc kubenswrapper[4747]: I1205 22:10:52.684932 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:52 crc kubenswrapper[4747]: I1205 22:10:52.835436 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Dec 05 22:10:52 crc kubenswrapper[4747]: I1205 22:10:52.847316 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:54 crc kubenswrapper[4747]: I1205 22:10:54.335303 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 05 22:10:54 crc kubenswrapper[4747]: I1205 22:10:54.351516 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Dec 05 22:10:54 crc kubenswrapper[4747]: I1205 22:10:54.362288 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Dec 05 22:10:54 crc kubenswrapper[4747]: I1205 22:10:54.685195 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:54 crc kubenswrapper[4747]: I1205 22:10:54.835222 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Dec 05 22:10:54 crc kubenswrapper[4747]: I1205 22:10:54.847489 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.412720 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.415006 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.430462 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.502141 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.502442 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.759644 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.828537 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.832250 4747 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-785c454b4c-kdj5b"] Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.833960 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.840950 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.858792 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785c454b4c-kdj5b"] Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.888921 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.904976 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.933088 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-dns-svc\") pod \"dnsmasq-dns-785c454b4c-kdj5b\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.933140 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-ovsdbserver-sb\") pod \"dnsmasq-dns-785c454b4c-kdj5b\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.933193 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-config\") pod \"dnsmasq-dns-785c454b4c-kdj5b\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.933324 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqcdz\" (UniqueName: \"kubernetes.io/projected/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-kube-api-access-mqcdz\") pod \"dnsmasq-dns-785c454b4c-kdj5b\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:55 crc kubenswrapper[4747]: I1205 22:10:55.960334 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.035160 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-dns-svc\") pod \"dnsmasq-dns-785c454b4c-kdj5b\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.035201 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-ovsdbserver-sb\") pod \"dnsmasq-dns-785c454b4c-kdj5b\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.035230 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-config\") pod \"dnsmasq-dns-785c454b4c-kdj5b\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.035301 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqcdz\" (UniqueName: \"kubernetes.io/projected/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-kube-api-access-mqcdz\") pod \"dnsmasq-dns-785c454b4c-kdj5b\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.037398 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-dns-svc\") pod \"dnsmasq-dns-785c454b4c-kdj5b\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.038077 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-ovsdbserver-sb\") pod \"dnsmasq-dns-785c454b4c-kdj5b\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.038901 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-config\") pod \"dnsmasq-dns-785c454b4c-kdj5b\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.051803 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.079394 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqcdz\" (UniqueName: \"kubernetes.io/projected/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-kube-api-access-mqcdz\") pod \"dnsmasq-dns-785c454b4c-kdj5b\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.169486 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785c454b4c-kdj5b"] Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.170043 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.191226 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d99ccddc9-xvc75"] Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.192762 4747 util.go:30] "No sandbox for pod can be found. 
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.194988 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.211136 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d99ccddc9-xvc75"]
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.241776 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-ovsdbserver-sb\") pod \"dnsmasq-dns-5d99ccddc9-xvc75\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.241834 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-config\") pod \"dnsmasq-dns-5d99ccddc9-xvc75\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.241868 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-ovsdbserver-nb\") pod \"dnsmasq-dns-5d99ccddc9-xvc75\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.241916 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz8vs\" (UniqueName: \"kubernetes.io/projected/720cb589-2deb-499c-a051-0af6e76968cf-kube-api-access-cz8vs\") pod \"dnsmasq-dns-5d99ccddc9-xvc75\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.241993 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-dns-svc\") pod \"dnsmasq-dns-5d99ccddc9-xvc75\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.354739 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-dns-svc\") pod \"dnsmasq-dns-5d99ccddc9-xvc75\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.354996 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-ovsdbserver-sb\") pod \"dnsmasq-dns-5d99ccddc9-xvc75\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.355022 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-config\") pod \"dnsmasq-dns-5d99ccddc9-xvc75\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.355037 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-ovsdbserver-nb\") pod \"dnsmasq-dns-5d99ccddc9-xvc75\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.355071 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz8vs\" (UniqueName: \"kubernetes.io/projected/720cb589-2deb-499c-a051-0af6e76968cf-kube-api-access-cz8vs\") pod \"dnsmasq-dns-5d99ccddc9-xvc75\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.357701 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-ovsdbserver-sb\") pod \"dnsmasq-dns-5d99ccddc9-xvc75\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.357724 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-dns-svc\") pod \"dnsmasq-dns-5d99ccddc9-xvc75\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.358125 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-config\") pod \"dnsmasq-dns-5d99ccddc9-xvc75\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.358222 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-ovsdbserver-nb\") pod \"dnsmasq-dns-5d99ccddc9-xvc75\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.373204 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz8vs\" (UniqueName: \"kubernetes.io/projected/720cb589-2deb-499c-a051-0af6e76968cf-kube-api-access-cz8vs\") pod \"dnsmasq-dns-5d99ccddc9-xvc75\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.581014 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75"
Need to start a new one" pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75" Dec 05 22:10:56 crc kubenswrapper[4747]: W1205 22:10:56.711397 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd450e5e6_4a2a_49f0_8a24_e637087c6fe6.slice/crio-b6cae265e11d1024bd065c237738aaea3c99c26af160e52b2dff67e4a068da3f WatchSource:0}: Error finding container b6cae265e11d1024bd065c237738aaea3c99c26af160e52b2dff67e4a068da3f: Status 404 returned error can't find the container with id b6cae265e11d1024bd065c237738aaea3c99c26af160e52b2dff67e4a068da3f Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.716888 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785c454b4c-kdj5b"] Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.988008 4747 generic.go:334] "Generic (PLEG): container finished" podID="d450e5e6-4a2a-49f0-8a24-e637087c6fe6" containerID="ea870b4c57dde696873a603ed7348d00aac1d3263f3c70253a8b98ba47152039" exitCode=0 Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.988108 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" event={"ID":"d450e5e6-4a2a-49f0-8a24-e637087c6fe6","Type":"ContainerDied","Data":"ea870b4c57dde696873a603ed7348d00aac1d3263f3c70253a8b98ba47152039"} Dec 05 22:10:56 crc kubenswrapper[4747]: I1205 22:10:56.988351 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" event={"ID":"d450e5e6-4a2a-49f0-8a24-e637087c6fe6","Type":"ContainerStarted","Data":"b6cae265e11d1024bd065c237738aaea3c99c26af160e52b2dff67e4a068da3f"} Dec 05 22:10:57 crc kubenswrapper[4747]: I1205 22:10:57.066693 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d99ccddc9-xvc75"] Dec 05 22:10:57 crc kubenswrapper[4747]: W1205 22:10:57.068943 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod720cb589_2deb_499c_a051_0af6e76968cf.slice/crio-a2d4b89b825fb84b5f86c7ef24896f6516f24c3937a8cd24e565a75a349c1bae WatchSource:0}: Error finding container a2d4b89b825fb84b5f86c7ef24896f6516f24c3937a8cd24e565a75a349c1bae: Status 404 returned error can't find the container with id a2d4b89b825fb84b5f86c7ef24896f6516f24c3937a8cd24e565a75a349c1bae Dec 05 22:10:57 crc kubenswrapper[4747]: I1205 22:10:57.292844 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:57 crc kubenswrapper[4747]: I1205 22:10:57.371279 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-ovsdbserver-sb\") pod \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " Dec 05 22:10:57 crc kubenswrapper[4747]: I1205 22:10:57.371972 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-config\") pod \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " Dec 05 22:10:57 crc kubenswrapper[4747]: I1205 22:10:57.372201 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqcdz\" (UniqueName: \"kubernetes.io/projected/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-kube-api-access-mqcdz\") pod \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " Dec 05 22:10:57 crc kubenswrapper[4747]: I1205 22:10:57.372266 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-dns-svc\") pod \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\" (UID: \"d450e5e6-4a2a-49f0-8a24-e637087c6fe6\") " Dec 05 22:10:57 crc kubenswrapper[4747]: I1205 22:10:57.376521 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-kube-api-access-mqcdz" (OuterVolumeSpecName: "kube-api-access-mqcdz") pod "d450e5e6-4a2a-49f0-8a24-e637087c6fe6" (UID: "d450e5e6-4a2a-49f0-8a24-e637087c6fe6"). InnerVolumeSpecName "kube-api-access-mqcdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:10:57 crc kubenswrapper[4747]: I1205 22:10:57.395009 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d450e5e6-4a2a-49f0-8a24-e637087c6fe6" (UID: "d450e5e6-4a2a-49f0-8a24-e637087c6fe6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:10:57 crc kubenswrapper[4747]: I1205 22:10:57.407515 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-config" (OuterVolumeSpecName: "config") pod "d450e5e6-4a2a-49f0-8a24-e637087c6fe6" (UID: "d450e5e6-4a2a-49f0-8a24-e637087c6fe6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:10:57 crc kubenswrapper[4747]: I1205 22:10:57.418072 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d450e5e6-4a2a-49f0-8a24-e637087c6fe6" (UID: "d450e5e6-4a2a-49f0-8a24-e637087c6fe6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:10:57 crc kubenswrapper[4747]: I1205 22:10:57.474382 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 22:10:57 crc kubenswrapper[4747]: I1205 22:10:57.474655 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:10:57 crc kubenswrapper[4747]: I1205 22:10:57.474731 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqcdz\" (UniqueName: \"kubernetes.io/projected/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-kube-api-access-mqcdz\") on node \"crc\" DevicePath \"\"" Dec 05 22:10:57 crc kubenswrapper[4747]: I1205 22:10:57.474839 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d450e5e6-4a2a-49f0-8a24-e637087c6fe6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 22:10:58 crc kubenswrapper[4747]: I1205 22:10:58.000997 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" event={"ID":"d450e5e6-4a2a-49f0-8a24-e637087c6fe6","Type":"ContainerDied","Data":"b6cae265e11d1024bd065c237738aaea3c99c26af160e52b2dff67e4a068da3f"} Dec 05 22:10:58 crc kubenswrapper[4747]: I1205 22:10:58.001070 4747 scope.go:117] "RemoveContainer" containerID="ea870b4c57dde696873a603ed7348d00aac1d3263f3c70253a8b98ba47152039" Dec 05 22:10:58 crc kubenswrapper[4747]: I1205 22:10:58.001135 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785c454b4c-kdj5b" Dec 05 22:10:58 crc kubenswrapper[4747]: I1205 22:10:58.003578 4747 generic.go:334] "Generic (PLEG): container finished" podID="720cb589-2deb-499c-a051-0af6e76968cf" containerID="161c608d2053d658c6b390fdefaccbbfe67acbd87b02225ea7d053c4feca1bee" exitCode=0 Dec 05 22:10:58 crc kubenswrapper[4747]: I1205 22:10:58.003733 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75" event={"ID":"720cb589-2deb-499c-a051-0af6e76968cf","Type":"ContainerDied","Data":"161c608d2053d658c6b390fdefaccbbfe67acbd87b02225ea7d053c4feca1bee"} Dec 05 22:10:58 crc kubenswrapper[4747]: I1205 22:10:58.003767 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75" event={"ID":"720cb589-2deb-499c-a051-0af6e76968cf","Type":"ContainerStarted","Data":"a2d4b89b825fb84b5f86c7ef24896f6516f24c3937a8cd24e565a75a349c1bae"} Dec 05 22:10:58 crc kubenswrapper[4747]: I1205 22:10:58.114660 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785c454b4c-kdj5b"] Dec 05 22:10:58 crc kubenswrapper[4747]: I1205 22:10:58.137876 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785c454b4c-kdj5b"] Dec 05 22:10:59 crc kubenswrapper[4747]: I1205 22:10:59.018606 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75" event={"ID":"720cb589-2deb-499c-a051-0af6e76968cf","Type":"ContainerStarted","Data":"7ee36529f21c0e0076099b1764b97b8f815392cf1691acaf10eb07ca5a1f20af"} Dec 05 22:10:59 crc kubenswrapper[4747]: I1205 22:10:59.019020 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75" Dec 05 22:10:59 crc kubenswrapper[4747]: I1205 
22:10:59.041320 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75" podStartSLOduration=3.041300773 podStartE2EDuration="3.041300773s" podCreationTimestamp="2025-12-05 22:10:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:10:59.040523734 +0000 UTC m=+5329.507831322" watchObservedRunningTime="2025-12-05 22:10:59.041300773 +0000 UTC m=+5329.508608271" Dec 05 22:10:59 crc kubenswrapper[4747]: I1205 22:10:59.861526 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d450e5e6-4a2a-49f0-8a24-e637087c6fe6" path="/var/lib/kubelet/pods/d450e5e6-4a2a-49f0-8a24-e637087c6fe6/volumes" Dec 05 22:10:59 crc kubenswrapper[4747]: I1205 22:10:59.913845 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.718374 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Dec 05 22:11:02 crc kubenswrapper[4747]: E1205 22:11:02.719659 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d450e5e6-4a2a-49f0-8a24-e637087c6fe6" containerName="init" Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.719679 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d450e5e6-4a2a-49f0-8a24-e637087c6fe6" containerName="init" Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.720085 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d450e5e6-4a2a-49f0-8a24-e637087c6fe6" containerName="init" Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.721057 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.723814 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.727551 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.780528 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf\") pod \"ovn-copy-data\" (UID: \"d8133d84-9dd2-4780-9414-ae4e7b78884d\") " pod="openstack/ovn-copy-data" Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.780701 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/d8133d84-9dd2-4780-9414-ae4e7b78884d-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"d8133d84-9dd2-4780-9414-ae4e7b78884d\") " pod="openstack/ovn-copy-data" Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.780912 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdfxk\" (UniqueName: \"kubernetes.io/projected/d8133d84-9dd2-4780-9414-ae4e7b78884d-kube-api-access-kdfxk\") pod \"ovn-copy-data\" (UID: \"d8133d84-9dd2-4780-9414-ae4e7b78884d\") " pod="openstack/ovn-copy-data" Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.882237 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: 
\"kubernetes.io/secret/d8133d84-9dd2-4780-9414-ae4e7b78884d-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"d8133d84-9dd2-4780-9414-ae4e7b78884d\") " pod="openstack/ovn-copy-data" Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.882331 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdfxk\" (UniqueName: \"kubernetes.io/projected/d8133d84-9dd2-4780-9414-ae4e7b78884d-kube-api-access-kdfxk\") pod \"ovn-copy-data\" (UID: \"d8133d84-9dd2-4780-9414-ae4e7b78884d\") " pod="openstack/ovn-copy-data" Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.882357 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf\") pod \"ovn-copy-data\" (UID: \"d8133d84-9dd2-4780-9414-ae4e7b78884d\") " pod="openstack/ovn-copy-data" Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.886010 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.886252 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf\") pod \"ovn-copy-data\" (UID: \"d8133d84-9dd2-4780-9414-ae4e7b78884d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bf8597dd5a164755751a94e2e653a8d820895a06b62de37542c0b5dd6e3a0b33/globalmount\"" pod="openstack/ovn-copy-data" Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.888395 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/d8133d84-9dd2-4780-9414-ae4e7b78884d-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"d8133d84-9dd2-4780-9414-ae4e7b78884d\") " pod="openstack/ovn-copy-data" Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.910793 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdfxk\" (UniqueName: \"kubernetes.io/projected/d8133d84-9dd2-4780-9414-ae4e7b78884d-kube-api-access-kdfxk\") pod \"ovn-copy-data\" (UID: \"d8133d84-9dd2-4780-9414-ae4e7b78884d\") " pod="openstack/ovn-copy-data" Dec 05 22:11:02 crc kubenswrapper[4747]: I1205 22:11:02.920899 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf\") pod \"ovn-copy-data\" (UID: \"d8133d84-9dd2-4780-9414-ae4e7b78884d\") " pod="openstack/ovn-copy-data" Dec 05 22:11:03 crc kubenswrapper[4747]: I1205 22:11:03.050040 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 05 22:11:03 crc kubenswrapper[4747]: W1205 22:11:03.642817 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8133d84_9dd2_4780_9414_ae4e7b78884d.slice/crio-dba6ecf5702fd22446b3056fd9564f54c4443e9830a6f0aea4b2a2df23c606c4 WatchSource:0}: Error finding container dba6ecf5702fd22446b3056fd9564f54c4443e9830a6f0aea4b2a2df23c606c4: Status 404 returned error can't find the container with id dba6ecf5702fd22446b3056fd9564f54c4443e9830a6f0aea4b2a2df23c606c4 Dec 05 22:11:03 crc kubenswrapper[4747]: I1205 22:11:03.646128 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 05 22:11:03 crc kubenswrapper[4747]: I1205 22:11:03.649035 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 22:11:04 crc kubenswrapper[4747]: I1205 22:11:04.067647 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"d8133d84-9dd2-4780-9414-ae4e7b78884d","Type":"ContainerStarted","Data":"dba6ecf5702fd22446b3056fd9564f54c4443e9830a6f0aea4b2a2df23c606c4"} Dec 05 22:11:05 crc kubenswrapper[4747]: I1205 22:11:05.079248 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"d8133d84-9dd2-4780-9414-ae4e7b78884d","Type":"ContainerStarted","Data":"69fedae6faa4293d62267877273722542dfaffcdf5d29143032a646e359ee267"} Dec 05 22:11:05 crc kubenswrapper[4747]: I1205 22:11:05.107901 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.553074919 podStartE2EDuration="4.107876456s" podCreationTimestamp="2025-12-05 22:11:01 +0000 UTC" firstStartedPulling="2025-12-05 22:11:03.648548886 +0000 UTC m=+5334.115856404" lastFinishedPulling="2025-12-05 22:11:04.203350413 +0000 UTC m=+5334.670657941" observedRunningTime="2025-12-05 22:11:05.095672361 +0000 UTC m=+5335.562979929" watchObservedRunningTime="2025-12-05 22:11:05.107876456 +0000 UTC m=+5335.575183954" Dec 05 22:11:06 crc kubenswrapper[4747]: I1205 22:11:06.582872 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75" Dec 05 22:11:06 crc kubenswrapper[4747]: I1205 22:11:06.670176 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-sd6tf"] Dec 05 22:11:06 crc kubenswrapper[4747]: I1205 22:11:06.670776 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" podUID="7be5844e-65a8-4a1a-b453-31fe2bae1a6c" containerName="dnsmasq-dns" containerID="cri-o://aad08731ff35ef3e9d19f6fe4363906dd55bdd048bee9ce9710d472b8cd7bf6d" gracePeriod=10 Dec 05 22:11:07 crc kubenswrapper[4747]: I1205 22:11:07.114669 4747 generic.go:334] "Generic (PLEG): container finished" podID="7be5844e-65a8-4a1a-b453-31fe2bae1a6c" containerID="aad08731ff35ef3e9d19f6fe4363906dd55bdd048bee9ce9710d472b8cd7bf6d" exitCode=0 Dec 05 22:11:07 crc kubenswrapper[4747]: I1205 22:11:07.114711 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" event={"ID":"7be5844e-65a8-4a1a-b453-31fe2bae1a6c","Type":"ContainerDied","Data":"aad08731ff35ef3e9d19f6fe4363906dd55bdd048bee9ce9710d472b8cd7bf6d"} Dec 05 22:11:07 crc kubenswrapper[4747]: I1205 22:11:07.114747 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" 
event={"ID":"7be5844e-65a8-4a1a-b453-31fe2bae1a6c","Type":"ContainerDied","Data":"f4e34ee706954c917f328ef97755c3f31610cbcd3b92a7f243ebfed8cd49e9ff"} Dec 05 22:11:07 crc kubenswrapper[4747]: I1205 22:11:07.114760 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4e34ee706954c917f328ef97755c3f31610cbcd3b92a7f243ebfed8cd49e9ff" Dec 05 22:11:07 crc kubenswrapper[4747]: I1205 22:11:07.142861 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" Dec 05 22:11:07 crc kubenswrapper[4747]: I1205 22:11:07.261947 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkw9x\" (UniqueName: \"kubernetes.io/projected/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-kube-api-access-hkw9x\") pod \"7be5844e-65a8-4a1a-b453-31fe2bae1a6c\" (UID: \"7be5844e-65a8-4a1a-b453-31fe2bae1a6c\") " Dec 05 22:11:07 crc kubenswrapper[4747]: I1205 22:11:07.262082 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-dns-svc\") pod \"7be5844e-65a8-4a1a-b453-31fe2bae1a6c\" (UID: \"7be5844e-65a8-4a1a-b453-31fe2bae1a6c\") " Dec 05 22:11:07 crc kubenswrapper[4747]: I1205 22:11:07.262134 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-config\") pod \"7be5844e-65a8-4a1a-b453-31fe2bae1a6c\" (UID: \"7be5844e-65a8-4a1a-b453-31fe2bae1a6c\") " Dec 05 22:11:07 crc kubenswrapper[4747]: I1205 22:11:07.269821 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-kube-api-access-hkw9x" (OuterVolumeSpecName: "kube-api-access-hkw9x") pod "7be5844e-65a8-4a1a-b453-31fe2bae1a6c" (UID: "7be5844e-65a8-4a1a-b453-31fe2bae1a6c"). InnerVolumeSpecName "kube-api-access-hkw9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:11:07 crc kubenswrapper[4747]: I1205 22:11:07.303546 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-config" (OuterVolumeSpecName: "config") pod "7be5844e-65a8-4a1a-b453-31fe2bae1a6c" (UID: "7be5844e-65a8-4a1a-b453-31fe2bae1a6c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:11:07 crc kubenswrapper[4747]: I1205 22:11:07.329535 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7be5844e-65a8-4a1a-b453-31fe2bae1a6c" (UID: "7be5844e-65a8-4a1a-b453-31fe2bae1a6c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:11:07 crc kubenswrapper[4747]: I1205 22:11:07.364177 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkw9x\" (UniqueName: \"kubernetes.io/projected/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-kube-api-access-hkw9x\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:07 crc kubenswrapper[4747]: I1205 22:11:07.364216 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:07 crc kubenswrapper[4747]: I1205 22:11:07.364234 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7be5844e-65a8-4a1a-b453-31fe2bae1a6c-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:08 crc kubenswrapper[4747]: I1205 22:11:08.125041 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f79bf7859-sd6tf" Dec 05 22:11:08 crc kubenswrapper[4747]: I1205 22:11:08.160931 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-sd6tf"] Dec 05 22:11:08 crc kubenswrapper[4747]: I1205 22:11:08.174205 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-sd6tf"] Dec 05 22:11:09 crc kubenswrapper[4747]: I1205 22:11:09.859773 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7be5844e-65a8-4a1a-b453-31fe2bae1a6c" path="/var/lib/kubelet/pods/7be5844e-65a8-4a1a-b453-31fe2bae1a6c/volumes" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.433627 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 05 22:11:10 crc kubenswrapper[4747]: E1205 22:11:10.433988 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be5844e-65a8-4a1a-b453-31fe2bae1a6c" containerName="dnsmasq-dns" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.434010 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be5844e-65a8-4a1a-b453-31fe2bae1a6c" containerName="dnsmasq-dns" Dec 05 22:11:10 crc kubenswrapper[4747]: E1205 22:11:10.434044 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be5844e-65a8-4a1a-b453-31fe2bae1a6c" containerName="init" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.434052 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be5844e-65a8-4a1a-b453-31fe2bae1a6c" containerName="init" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.434233 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7be5844e-65a8-4a1a-b453-31fe2bae1a6c" containerName="dnsmasq-dns" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.435251 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.436892 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.437427 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.437609 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-pmsfm" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.437744 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.533204 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.627245 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2c2179-ea52-4d02-b1de-9972369db4c2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.627430 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d2c2179-ea52-4d02-b1de-9972369db4c2-scripts\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.627494 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2c2179-ea52-4d02-b1de-9972369db4c2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.627547 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d2c2179-ea52-4d02-b1de-9972369db4c2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.627757 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2c2179-ea52-4d02-b1de-9972369db4c2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.627994 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d2c2179-ea52-4d02-b1de-9972369db4c2-config\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.628036 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t8jh\" (UniqueName: \"kubernetes.io/projected/1d2c2179-ea52-4d02-b1de-9972369db4c2-kube-api-access-8t8jh\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: 
I1205 22:11:10.730157 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2c2179-ea52-4d02-b1de-9972369db4c2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.730299 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d2c2179-ea52-4d02-b1de-9972369db4c2-config\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.730333 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t8jh\" (UniqueName: \"kubernetes.io/projected/1d2c2179-ea52-4d02-b1de-9972369db4c2-kube-api-access-8t8jh\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.730362 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2c2179-ea52-4d02-b1de-9972369db4c2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.730432 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d2c2179-ea52-4d02-b1de-9972369db4c2-scripts\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.730463 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2c2179-ea52-4d02-b1de-9972369db4c2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.730493 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d2c2179-ea52-4d02-b1de-9972369db4c2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.731282 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d2c2179-ea52-4d02-b1de-9972369db4c2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.731471 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d2c2179-ea52-4d02-b1de-9972369db4c2-config\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.731472 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d2c2179-ea52-4d02-b1de-9972369db4c2-scripts\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.736871 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2c2179-ea52-4d02-b1de-9972369db4c2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.740459 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2c2179-ea52-4d02-b1de-9972369db4c2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.754327 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2c2179-ea52-4d02-b1de-9972369db4c2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.765383 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t8jh\" (UniqueName: \"kubernetes.io/projected/1d2c2179-ea52-4d02-b1de-9972369db4c2-kube-api-access-8t8jh\") pod \"ovn-northd-0\" (UID: \"1d2c2179-ea52-4d02-b1de-9972369db4c2\") " pod="openstack/ovn-northd-0" Dec 05 22:11:10 crc kubenswrapper[4747]: I1205 22:11:10.772941 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 22:11:11 crc kubenswrapper[4747]: I1205 22:11:11.268124 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 22:11:12 crc kubenswrapper[4747]: I1205 22:11:12.166926 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1d2c2179-ea52-4d02-b1de-9972369db4c2","Type":"ContainerStarted","Data":"2841f1fca35f1e0d48ac607903ef555e03ce6d9047771c58296d632869c12ef6"} Dec 05 22:11:12 crc kubenswrapper[4747]: I1205 22:11:12.167756 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 05 22:11:12 crc kubenswrapper[4747]: I1205 22:11:12.167796 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1d2c2179-ea52-4d02-b1de-9972369db4c2","Type":"ContainerStarted","Data":"07b2be435902e2d3106bf4362ad2de70dc42c6695772af717e847d4b8d360327"} Dec 05 22:11:12 crc kubenswrapper[4747]: I1205 22:11:12.167825 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1d2c2179-ea52-4d02-b1de-9972369db4c2","Type":"ContainerStarted","Data":"07f36926235042b3692ca061f718e1fd091639a14e7f0a254c314b868aae173c"} Dec 05 22:11:12 crc kubenswrapper[4747]: I1205 22:11:12.200202 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.200145956 podStartE2EDuration="2.200145956s" podCreationTimestamp="2025-12-05 22:11:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:11:12.190314001 +0000 UTC m=+5342.657621509" watchObservedRunningTime="2025-12-05 22:11:12.200145956 +0000 UTC m=+5342.667453484" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.195999 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-gf5qt"] Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.198248 4747 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gf5qt" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.207846 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gf5qt"] Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.216732 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-afce-account-create-update-psv26"] Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.216811 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4048a8d-ce85-43ff-89f9-684d376444ad-operator-scripts\") pod \"keystone-db-create-gf5qt\" (UID: \"b4048a8d-ce85-43ff-89f9-684d376444ad\") " pod="openstack/keystone-db-create-gf5qt" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.216907 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj644\" (UniqueName: \"kubernetes.io/projected/b4048a8d-ce85-43ff-89f9-684d376444ad-kube-api-access-tj644\") pod \"keystone-db-create-gf5qt\" (UID: \"b4048a8d-ce85-43ff-89f9-684d376444ad\") " pod="openstack/keystone-db-create-gf5qt" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.217811 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-afce-account-create-update-psv26" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.222363 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.234134 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-afce-account-create-update-psv26"] Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.318859 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4048a8d-ce85-43ff-89f9-684d376444ad-operator-scripts\") pod \"keystone-db-create-gf5qt\" (UID: \"b4048a8d-ce85-43ff-89f9-684d376444ad\") " pod="openstack/keystone-db-create-gf5qt" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.318916 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb674708-f83c-4126-93bd-f88a3a322002-operator-scripts\") pod \"keystone-afce-account-create-update-psv26\" (UID: \"bb674708-f83c-4126-93bd-f88a3a322002\") " pod="openstack/keystone-afce-account-create-update-psv26" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.318966 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj644\" (UniqueName: \"kubernetes.io/projected/b4048a8d-ce85-43ff-89f9-684d376444ad-kube-api-access-tj644\") pod \"keystone-db-create-gf5qt\" (UID: \"b4048a8d-ce85-43ff-89f9-684d376444ad\") " pod="openstack/keystone-db-create-gf5qt" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.319164 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkwxb\" (UniqueName: \"kubernetes.io/projected/bb674708-f83c-4126-93bd-f88a3a322002-kube-api-access-pkwxb\") pod \"keystone-afce-account-create-update-psv26\" (UID: \"bb674708-f83c-4126-93bd-f88a3a322002\") " pod="openstack/keystone-afce-account-create-update-psv26" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.320178 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4048a8d-ce85-43ff-89f9-684d376444ad-operator-scripts\") pod \"keystone-db-create-gf5qt\" (UID: \"b4048a8d-ce85-43ff-89f9-684d376444ad\") " pod="openstack/keystone-db-create-gf5qt" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.338102 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj644\" (UniqueName: \"kubernetes.io/projected/b4048a8d-ce85-43ff-89f9-684d376444ad-kube-api-access-tj644\") pod \"keystone-db-create-gf5qt\" (UID: \"b4048a8d-ce85-43ff-89f9-684d376444ad\") " pod="openstack/keystone-db-create-gf5qt" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.419734 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkwxb\" (UniqueName: \"kubernetes.io/projected/bb674708-f83c-4126-93bd-f88a3a322002-kube-api-access-pkwxb\") pod \"keystone-afce-account-create-update-psv26\" (UID: \"bb674708-f83c-4126-93bd-f88a3a322002\") " pod="openstack/keystone-afce-account-create-update-psv26" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.419792 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb674708-f83c-4126-93bd-f88a3a322002-operator-scripts\") pod \"keystone-afce-account-create-update-psv26\" (UID: \"bb674708-f83c-4126-93bd-f88a3a322002\") " pod="openstack/keystone-afce-account-create-update-psv26" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.420993 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb674708-f83c-4126-93bd-f88a3a322002-operator-scripts\") pod \"keystone-afce-account-create-update-psv26\" (UID: \"bb674708-f83c-4126-93bd-f88a3a322002\") " pod="openstack/keystone-afce-account-create-update-psv26" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.441384 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkwxb\" (UniqueName: \"kubernetes.io/projected/bb674708-f83c-4126-93bd-f88a3a322002-kube-api-access-pkwxb\") pod \"keystone-afce-account-create-update-psv26\" (UID: \"bb674708-f83c-4126-93bd-f88a3a322002\") " pod="openstack/keystone-afce-account-create-update-psv26" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.517072 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gf5qt" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.534029 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-afce-account-create-update-psv26" Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.970721 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gf5qt"] Dec 05 22:11:15 crc kubenswrapper[4747]: I1205 22:11:15.983638 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-afce-account-create-update-psv26"] Dec 05 22:11:15 crc kubenswrapper[4747]: W1205 22:11:15.987752 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb674708_f83c_4126_93bd_f88a3a322002.slice/crio-b9360c459501e938d88cec751ff416b0ea40b454431c33c7009a8210a52c2097 WatchSource:0}: Error finding container b9360c459501e938d88cec751ff416b0ea40b454431c33c7009a8210a52c2097: Status 404 returned error can't find the container with id b9360c459501e938d88cec751ff416b0ea40b454431c33c7009a8210a52c2097 Dec 05 22:11:16 crc kubenswrapper[4747]: I1205 22:11:16.210886 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gf5qt" event={"ID":"b4048a8d-ce85-43ff-89f9-684d376444ad","Type":"ContainerStarted","Data":"9081f3b6fe644f529bb78a3002403a76a3bdf2a37d7e71bf6a6a5ccd05808621"} Dec 05 22:11:16 crc kubenswrapper[4747]: I1205 22:11:16.211249 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gf5qt" event={"ID":"b4048a8d-ce85-43ff-89f9-684d376444ad","Type":"ContainerStarted","Data":"4b81186de3a6b289eab4374f4999acf20c9ae1ee68f5e62ff4ccae709607a015"} Dec 05 22:11:16 crc kubenswrapper[4747]: I1205 22:11:16.212986 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-afce-account-create-update-psv26" event={"ID":"bb674708-f83c-4126-93bd-f88a3a322002","Type":"ContainerStarted","Data":"b43b3b09f682d9f3e640ddac0018992e99a7eb131932256e30dc3fbc82390016"} Dec 05 22:11:16 crc kubenswrapper[4747]: I1205 22:11:16.213017 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-afce-account-create-update-psv26" event={"ID":"bb674708-f83c-4126-93bd-f88a3a322002","Type":"ContainerStarted","Data":"b9360c459501e938d88cec751ff416b0ea40b454431c33c7009a8210a52c2097"} Dec 05 22:11:16 crc kubenswrapper[4747]: I1205 22:11:16.231658 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-gf5qt" podStartSLOduration=1.231635479 podStartE2EDuration="1.231635479s" podCreationTimestamp="2025-12-05 22:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:11:16.229220769 +0000 UTC m=+5346.696528297" watchObservedRunningTime="2025-12-05 22:11:16.231635479 +0000 UTC m=+5346.698942967" Dec 05 22:11:16 crc kubenswrapper[4747]: I1205 22:11:16.254897 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-afce-account-create-update-psv26" podStartSLOduration=1.25487533 podStartE2EDuration="1.25487533s" podCreationTimestamp="2025-12-05 22:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:11:16.242677805 +0000 UTC m=+5346.709985323" watchObservedRunningTime="2025-12-05 22:11:16.25487533 +0000 UTC m=+5346.722182818" Dec 05 22:11:17 crc kubenswrapper[4747]: I1205 22:11:17.223134 4747 generic.go:334] "Generic (PLEG): container finished" podID="b4048a8d-ce85-43ff-89f9-684d376444ad" 
containerID="9081f3b6fe644f529bb78a3002403a76a3bdf2a37d7e71bf6a6a5ccd05808621" exitCode=0 Dec 05 22:11:17 crc kubenswrapper[4747]: I1205 22:11:17.223220 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gf5qt" event={"ID":"b4048a8d-ce85-43ff-89f9-684d376444ad","Type":"ContainerDied","Data":"9081f3b6fe644f529bb78a3002403a76a3bdf2a37d7e71bf6a6a5ccd05808621"} Dec 05 22:11:17 crc kubenswrapper[4747]: I1205 22:11:17.226051 4747 generic.go:334] "Generic (PLEG): container finished" podID="bb674708-f83c-4126-93bd-f88a3a322002" containerID="b43b3b09f682d9f3e640ddac0018992e99a7eb131932256e30dc3fbc82390016" exitCode=0 Dec 05 22:11:17 crc kubenswrapper[4747]: I1205 22:11:17.226077 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-afce-account-create-update-psv26" event={"ID":"bb674708-f83c-4126-93bd-f88a3a322002","Type":"ContainerDied","Data":"b43b3b09f682d9f3e640ddac0018992e99a7eb131932256e30dc3fbc82390016"} Dec 05 22:11:18 crc kubenswrapper[4747]: I1205 22:11:18.780380 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-afce-account-create-update-psv26" Dec 05 22:11:18 crc kubenswrapper[4747]: I1205 22:11:18.789269 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gf5qt" Dec 05 22:11:18 crc kubenswrapper[4747]: I1205 22:11:18.875022 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkwxb\" (UniqueName: \"kubernetes.io/projected/bb674708-f83c-4126-93bd-f88a3a322002-kube-api-access-pkwxb\") pod \"bb674708-f83c-4126-93bd-f88a3a322002\" (UID: \"bb674708-f83c-4126-93bd-f88a3a322002\") " Dec 05 22:11:18 crc kubenswrapper[4747]: I1205 22:11:18.875097 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb674708-f83c-4126-93bd-f88a3a322002-operator-scripts\") pod \"bb674708-f83c-4126-93bd-f88a3a322002\" (UID: \"bb674708-f83c-4126-93bd-f88a3a322002\") " Dec 05 22:11:18 crc kubenswrapper[4747]: I1205 22:11:18.875527 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb674708-f83c-4126-93bd-f88a3a322002-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb674708-f83c-4126-93bd-f88a3a322002" (UID: "bb674708-f83c-4126-93bd-f88a3a322002"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:11:18 crc kubenswrapper[4747]: I1205 22:11:18.875885 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb674708-f83c-4126-93bd-f88a3a322002-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:18 crc kubenswrapper[4747]: I1205 22:11:18.879929 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb674708-f83c-4126-93bd-f88a3a322002-kube-api-access-pkwxb" (OuterVolumeSpecName: "kube-api-access-pkwxb") pod "bb674708-f83c-4126-93bd-f88a3a322002" (UID: "bb674708-f83c-4126-93bd-f88a3a322002"). InnerVolumeSpecName "kube-api-access-pkwxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:11:18 crc kubenswrapper[4747]: I1205 22:11:18.977023 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj644\" (UniqueName: \"kubernetes.io/projected/b4048a8d-ce85-43ff-89f9-684d376444ad-kube-api-access-tj644\") pod \"b4048a8d-ce85-43ff-89f9-684d376444ad\" (UID: \"b4048a8d-ce85-43ff-89f9-684d376444ad\") " Dec 05 22:11:18 crc kubenswrapper[4747]: I1205 22:11:18.977495 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4048a8d-ce85-43ff-89f9-684d376444ad-operator-scripts\") pod \"b4048a8d-ce85-43ff-89f9-684d376444ad\" (UID: \"b4048a8d-ce85-43ff-89f9-684d376444ad\") " Dec 05 22:11:18 crc kubenswrapper[4747]: I1205 22:11:18.978009 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4048a8d-ce85-43ff-89f9-684d376444ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b4048a8d-ce85-43ff-89f9-684d376444ad" (UID: "b4048a8d-ce85-43ff-89f9-684d376444ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:11:18 crc kubenswrapper[4747]: I1205 22:11:18.978767 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkwxb\" (UniqueName: \"kubernetes.io/projected/bb674708-f83c-4126-93bd-f88a3a322002-kube-api-access-pkwxb\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:18 crc kubenswrapper[4747]: I1205 22:11:18.979039 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4048a8d-ce85-43ff-89f9-684d376444ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:18 crc kubenswrapper[4747]: I1205 22:11:18.981812 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4048a8d-ce85-43ff-89f9-684d376444ad-kube-api-access-tj644" (OuterVolumeSpecName: "kube-api-access-tj644") pod "b4048a8d-ce85-43ff-89f9-684d376444ad" (UID: "b4048a8d-ce85-43ff-89f9-684d376444ad"). InnerVolumeSpecName "kube-api-access-tj644". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:11:19 crc kubenswrapper[4747]: I1205 22:11:19.081007 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj644\" (UniqueName: \"kubernetes.io/projected/b4048a8d-ce85-43ff-89f9-684d376444ad-kube-api-access-tj644\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:19 crc kubenswrapper[4747]: I1205 22:11:19.251743 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gf5qt" event={"ID":"b4048a8d-ce85-43ff-89f9-684d376444ad","Type":"ContainerDied","Data":"4b81186de3a6b289eab4374f4999acf20c9ae1ee68f5e62ff4ccae709607a015"} Dec 05 22:11:19 crc kubenswrapper[4747]: I1205 22:11:19.251794 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b81186de3a6b289eab4374f4999acf20c9ae1ee68f5e62ff4ccae709607a015" Dec 05 22:11:19 crc kubenswrapper[4747]: I1205 22:11:19.251808 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-gf5qt" Dec 05 22:11:19 crc kubenswrapper[4747]: I1205 22:11:19.254685 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-afce-account-create-update-psv26" event={"ID":"bb674708-f83c-4126-93bd-f88a3a322002","Type":"ContainerDied","Data":"b9360c459501e938d88cec751ff416b0ea40b454431c33c7009a8210a52c2097"} Dec 05 22:11:19 crc kubenswrapper[4747]: I1205 22:11:19.254716 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9360c459501e938d88cec751ff416b0ea40b454431c33c7009a8210a52c2097" Dec 05 22:11:19 crc kubenswrapper[4747]: I1205 22:11:19.254835 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-afce-account-create-update-psv26" Dec 05 22:11:20 crc kubenswrapper[4747]: I1205 22:11:20.788250 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-cvk9x"] Dec 05 22:11:20 crc kubenswrapper[4747]: E1205 22:11:20.788921 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb674708-f83c-4126-93bd-f88a3a322002" containerName="mariadb-account-create-update" Dec 05 22:11:20 crc kubenswrapper[4747]: I1205 22:11:20.788937 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb674708-f83c-4126-93bd-f88a3a322002" containerName="mariadb-account-create-update" Dec 05 22:11:20 crc kubenswrapper[4747]: E1205 22:11:20.788963 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4048a8d-ce85-43ff-89f9-684d376444ad" containerName="mariadb-database-create" Dec 05 22:11:20 crc kubenswrapper[4747]: I1205 22:11:20.788972 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4048a8d-ce85-43ff-89f9-684d376444ad" containerName="mariadb-database-create" Dec 05 22:11:20 crc kubenswrapper[4747]: I1205 22:11:20.789158 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb674708-f83c-4126-93bd-f88a3a322002" containerName="mariadb-account-create-update" Dec 05 22:11:20 crc kubenswrapper[4747]: I1205 22:11:20.789174 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4048a8d-ce85-43ff-89f9-684d376444ad" containerName="mariadb-database-create" Dec 05 22:11:20 crc kubenswrapper[4747]: I1205 22:11:20.789796 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cvk9x" Dec 05 22:11:20 crc kubenswrapper[4747]: I1205 22:11:20.791465 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 22:11:20 crc kubenswrapper[4747]: I1205 22:11:20.791541 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4tt76" Dec 05 22:11:20 crc kubenswrapper[4747]: I1205 22:11:20.792294 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 22:11:20 crc kubenswrapper[4747]: I1205 22:11:20.792483 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 22:11:20 crc kubenswrapper[4747]: I1205 22:11:20.808342 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cvk9x"] Dec 05 22:11:20 crc kubenswrapper[4747]: I1205 22:11:20.912380 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62add4b5-2b83-4753-a553-fd36ec012e1b-combined-ca-bundle\") pod \"keystone-db-sync-cvk9x\" (UID: \"62add4b5-2b83-4753-a553-fd36ec012e1b\") " pod="openstack/keystone-db-sync-cvk9x" Dec 05 22:11:20 crc kubenswrapper[4747]: I1205 22:11:20.912447 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx4hv\" (UniqueName: \"kubernetes.io/projected/62add4b5-2b83-4753-a553-fd36ec012e1b-kube-api-access-lx4hv\") pod \"keystone-db-sync-cvk9x\" (UID: \"62add4b5-2b83-4753-a553-fd36ec012e1b\") " pod="openstack/keystone-db-sync-cvk9x" Dec 05 22:11:20 crc kubenswrapper[4747]: I1205 22:11:20.912541 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62add4b5-2b83-4753-a553-fd36ec012e1b-config-data\") pod \"keystone-db-sync-cvk9x\" (UID: \"62add4b5-2b83-4753-a553-fd36ec012e1b\") " pod="openstack/keystone-db-sync-cvk9x" Dec 05 22:11:21 crc kubenswrapper[4747]: I1205 22:11:21.013496 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62add4b5-2b83-4753-a553-fd36ec012e1b-combined-ca-bundle\") pod \"keystone-db-sync-cvk9x\" (UID: \"62add4b5-2b83-4753-a553-fd36ec012e1b\") " pod="openstack/keystone-db-sync-cvk9x" Dec 05 22:11:21 crc kubenswrapper[4747]: I1205 22:11:21.013771 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx4hv\" (UniqueName: \"kubernetes.io/projected/62add4b5-2b83-4753-a553-fd36ec012e1b-kube-api-access-lx4hv\") pod \"keystone-db-sync-cvk9x\" (UID: \"62add4b5-2b83-4753-a553-fd36ec012e1b\") " pod="openstack/keystone-db-sync-cvk9x" Dec 05 22:11:21 crc kubenswrapper[4747]: I1205 22:11:21.013886 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62add4b5-2b83-4753-a553-fd36ec012e1b-config-data\") pod \"keystone-db-sync-cvk9x\" (UID: \"62add4b5-2b83-4753-a553-fd36ec012e1b\") " pod="openstack/keystone-db-sync-cvk9x" Dec 05 22:11:21 crc kubenswrapper[4747]: I1205 22:11:21.018799 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62add4b5-2b83-4753-a553-fd36ec012e1b-combined-ca-bundle\") pod \"keystone-db-sync-cvk9x\" (UID: \"62add4b5-2b83-4753-a553-fd36ec012e1b\") " 
pod="openstack/keystone-db-sync-cvk9x" Dec 05 22:11:21 crc kubenswrapper[4747]: I1205 22:11:21.026018 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62add4b5-2b83-4753-a553-fd36ec012e1b-config-data\") pod \"keystone-db-sync-cvk9x\" (UID: \"62add4b5-2b83-4753-a553-fd36ec012e1b\") " pod="openstack/keystone-db-sync-cvk9x" Dec 05 22:11:21 crc kubenswrapper[4747]: I1205 22:11:21.030038 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx4hv\" (UniqueName: \"kubernetes.io/projected/62add4b5-2b83-4753-a553-fd36ec012e1b-kube-api-access-lx4hv\") pod \"keystone-db-sync-cvk9x\" (UID: \"62add4b5-2b83-4753-a553-fd36ec012e1b\") " pod="openstack/keystone-db-sync-cvk9x" Dec 05 22:11:21 crc kubenswrapper[4747]: I1205 22:11:21.109433 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cvk9x" Dec 05 22:11:21 crc kubenswrapper[4747]: I1205 22:11:21.377421 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cvk9x"] Dec 05 22:11:21 crc kubenswrapper[4747]: W1205 22:11:21.387903 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62add4b5_2b83_4753_a553_fd36ec012e1b.slice/crio-c0857a34ef95937dd4518cf43c7c902f85cee2ef49554f6957b5bf36a058de1e WatchSource:0}: Error finding container c0857a34ef95937dd4518cf43c7c902f85cee2ef49554f6957b5bf36a058de1e: Status 404 returned error can't find the container with id c0857a34ef95937dd4518cf43c7c902f85cee2ef49554f6957b5bf36a058de1e Dec 05 22:11:22 crc kubenswrapper[4747]: I1205 22:11:22.282194 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cvk9x" event={"ID":"62add4b5-2b83-4753-a553-fd36ec012e1b","Type":"ContainerStarted","Data":"b1ee109fe7e53a7c9f6c64dbce3e2b385eb261b5d4f6fa8cdfc47ad2cd962a9d"} Dec 05 22:11:22 crc kubenswrapper[4747]: I1205 22:11:22.284634 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cvk9x" event={"ID":"62add4b5-2b83-4753-a553-fd36ec012e1b","Type":"ContainerStarted","Data":"c0857a34ef95937dd4518cf43c7c902f85cee2ef49554f6957b5bf36a058de1e"} Dec 05 22:11:22 crc kubenswrapper[4747]: I1205 22:11:22.309790 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-cvk9x" podStartSLOduration=2.30976505 podStartE2EDuration="2.30976505s" podCreationTimestamp="2025-12-05 22:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:11:22.302079648 +0000 UTC m=+5352.769387146" watchObservedRunningTime="2025-12-05 22:11:22.30976505 +0000 UTC m=+5352.777072548" Dec 05 22:11:24 crc kubenswrapper[4747]: I1205 22:11:24.306014 4747 generic.go:334] "Generic (PLEG): container finished" podID="62add4b5-2b83-4753-a553-fd36ec012e1b" containerID="b1ee109fe7e53a7c9f6c64dbce3e2b385eb261b5d4f6fa8cdfc47ad2cd962a9d" exitCode=0 Dec 05 22:11:24 crc kubenswrapper[4747]: I1205 22:11:24.306184 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cvk9x" event={"ID":"62add4b5-2b83-4753-a553-fd36ec012e1b","Type":"ContainerDied","Data":"b1ee109fe7e53a7c9f6c64dbce3e2b385eb261b5d4f6fa8cdfc47ad2cd962a9d"} Dec 05 22:11:25 crc kubenswrapper[4747]: I1205 22:11:25.646708 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cvk9x" Dec 05 22:11:25 crc kubenswrapper[4747]: I1205 22:11:25.797488 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62add4b5-2b83-4753-a553-fd36ec012e1b-combined-ca-bundle\") pod \"62add4b5-2b83-4753-a553-fd36ec012e1b\" (UID: \"62add4b5-2b83-4753-a553-fd36ec012e1b\") " Dec 05 22:11:25 crc kubenswrapper[4747]: I1205 22:11:25.797591 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx4hv\" (UniqueName: \"kubernetes.io/projected/62add4b5-2b83-4753-a553-fd36ec012e1b-kube-api-access-lx4hv\") pod \"62add4b5-2b83-4753-a553-fd36ec012e1b\" (UID: \"62add4b5-2b83-4753-a553-fd36ec012e1b\") " Dec 05 22:11:25 crc kubenswrapper[4747]: I1205 22:11:25.797616 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62add4b5-2b83-4753-a553-fd36ec012e1b-config-data\") pod \"62add4b5-2b83-4753-a553-fd36ec012e1b\" (UID: \"62add4b5-2b83-4753-a553-fd36ec012e1b\") " Dec 05 22:11:25 crc kubenswrapper[4747]: I1205 22:11:25.804048 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62add4b5-2b83-4753-a553-fd36ec012e1b-kube-api-access-lx4hv" (OuterVolumeSpecName: "kube-api-access-lx4hv") pod "62add4b5-2b83-4753-a553-fd36ec012e1b" (UID: "62add4b5-2b83-4753-a553-fd36ec012e1b"). InnerVolumeSpecName "kube-api-access-lx4hv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:11:25 crc kubenswrapper[4747]: I1205 22:11:25.822720 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62add4b5-2b83-4753-a553-fd36ec012e1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62add4b5-2b83-4753-a553-fd36ec012e1b" (UID: "62add4b5-2b83-4753-a553-fd36ec012e1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:11:25 crc kubenswrapper[4747]: I1205 22:11:25.869861 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62add4b5-2b83-4753-a553-fd36ec012e1b-config-data" (OuterVolumeSpecName: "config-data") pod "62add4b5-2b83-4753-a553-fd36ec012e1b" (UID: "62add4b5-2b83-4753-a553-fd36ec012e1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:11:25 crc kubenswrapper[4747]: I1205 22:11:25.900036 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62add4b5-2b83-4753-a553-fd36ec012e1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:25 crc kubenswrapper[4747]: I1205 22:11:25.900066 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx4hv\" (UniqueName: \"kubernetes.io/projected/62add4b5-2b83-4753-a553-fd36ec012e1b-kube-api-access-lx4hv\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:25 crc kubenswrapper[4747]: I1205 22:11:25.900076 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62add4b5-2b83-4753-a553-fd36ec012e1b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:25 crc kubenswrapper[4747]: I1205 22:11:25.923867 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.326511 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cvk9x" event={"ID":"62add4b5-2b83-4753-a553-fd36ec012e1b","Type":"ContainerDied","Data":"c0857a34ef95937dd4518cf43c7c902f85cee2ef49554f6957b5bf36a058de1e"} Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.326575 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cvk9x" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.326654 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0857a34ef95937dd4518cf43c7c902f85cee2ef49554f6957b5bf36a058de1e" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.555787 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-756477f68c-vr5gt"] Dec 05 22:11:26 crc kubenswrapper[4747]: E1205 22:11:26.556308 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62add4b5-2b83-4753-a553-fd36ec012e1b" containerName="keystone-db-sync" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.556327 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="62add4b5-2b83-4753-a553-fd36ec012e1b" containerName="keystone-db-sync" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.556480 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="62add4b5-2b83-4753-a553-fd36ec012e1b" containerName="keystone-db-sync" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.557304 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.598080 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wx9t2"] Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.599119 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.607079 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.607310 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.608062 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.608239 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.610753 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4tt76" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.621776 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wx9t2"] Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.632271 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-756477f68c-vr5gt"] Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.715684 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-dns-svc\") pod \"dnsmasq-dns-756477f68c-vr5gt\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.715747 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-ovsdbserver-sb\") pod \"dnsmasq-dns-756477f68c-vr5gt\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.715786 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-combined-ca-bundle\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.715803 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-ovsdbserver-nb\") pod \"dnsmasq-dns-756477f68c-vr5gt\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.715827 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-config\") pod \"dnsmasq-dns-756477f68c-vr5gt\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.715861 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-fernet-keys\") pod \"keystone-bootstrap-wx9t2\" (UID: 
\"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.715884 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-scripts\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.715910 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjtfv\" (UniqueName: \"kubernetes.io/projected/87950a32-a688-49e7-ab86-7057f156e114-kube-api-access-gjtfv\") pod \"dnsmasq-dns-756477f68c-vr5gt\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.715958 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-credential-keys\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.716091 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plpvf\" (UniqueName: \"kubernetes.io/projected/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-kube-api-access-plpvf\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.716129 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-config-data\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.818036 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-ovsdbserver-sb\") pod \"dnsmasq-dns-756477f68c-vr5gt\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.818093 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-combined-ca-bundle\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.818112 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-ovsdbserver-nb\") pod \"dnsmasq-dns-756477f68c-vr5gt\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.818137 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-config\") pod \"dnsmasq-dns-756477f68c-vr5gt\" (UID: 
\"87950a32-a688-49e7-ab86-7057f156e114\") " pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.818157 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-fernet-keys\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.818176 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-scripts\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.818196 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjtfv\" (UniqueName: \"kubernetes.io/projected/87950a32-a688-49e7-ab86-7057f156e114-kube-api-access-gjtfv\") pod \"dnsmasq-dns-756477f68c-vr5gt\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.818232 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-credential-keys\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.818272 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plpvf\" (UniqueName: \"kubernetes.io/projected/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-kube-api-access-plpvf\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.818289 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-config-data\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.818320 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-dns-svc\") pod \"dnsmasq-dns-756477f68c-vr5gt\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.819816 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-ovsdbserver-sb\") pod \"dnsmasq-dns-756477f68c-vr5gt\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.820687 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-ovsdbserver-nb\") pod \"dnsmasq-dns-756477f68c-vr5gt\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: 
I1205 22:11:26.821659 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-combined-ca-bundle\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.821920 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-config\") pod \"dnsmasq-dns-756477f68c-vr5gt\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.821993 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-credential-keys\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.822013 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-dns-svc\") pod \"dnsmasq-dns-756477f68c-vr5gt\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.822087 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-config-data\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.830929 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-scripts\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.831396 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-fernet-keys\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.836523 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plpvf\" (UniqueName: \"kubernetes.io/projected/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-kube-api-access-plpvf\") pod \"keystone-bootstrap-wx9t2\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.837153 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjtfv\" (UniqueName: \"kubernetes.io/projected/87950a32-a688-49e7-ab86-7057f156e114-kube-api-access-gjtfv\") pod \"dnsmasq-dns-756477f68c-vr5gt\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.877850 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:26 crc kubenswrapper[4747]: I1205 22:11:26.915973 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:27 crc kubenswrapper[4747]: I1205 22:11:27.370011 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-756477f68c-vr5gt"] Dec 05 22:11:27 crc kubenswrapper[4747]: I1205 22:11:27.521371 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wx9t2"] Dec 05 22:11:27 crc kubenswrapper[4747]: W1205 22:11:27.523353 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1ddf7cf_afb7_4eb0_ac52_1ea5ed80c7e3.slice/crio-ac7b4a1282d1a331ed10582d5cb82bbe128eb15b04589f3686c179dd77f1c45c WatchSource:0}: Error finding container ac7b4a1282d1a331ed10582d5cb82bbe128eb15b04589f3686c179dd77f1c45c: Status 404 returned error can't find the container with id ac7b4a1282d1a331ed10582d5cb82bbe128eb15b04589f3686c179dd77f1c45c Dec 05 22:11:28 crc kubenswrapper[4747]: I1205 22:11:28.347872 4747 generic.go:334] "Generic (PLEG): container finished" podID="87950a32-a688-49e7-ab86-7057f156e114" containerID="6f0585d4ef7cf485c90752abeceddf0155acef00ddba985c84b90719e504ffb6" exitCode=0 Dec 05 22:11:28 crc kubenswrapper[4747]: I1205 22:11:28.347977 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-756477f68c-vr5gt" event={"ID":"87950a32-a688-49e7-ab86-7057f156e114","Type":"ContainerDied","Data":"6f0585d4ef7cf485c90752abeceddf0155acef00ddba985c84b90719e504ffb6"} Dec 05 22:11:28 crc kubenswrapper[4747]: I1205 22:11:28.348304 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-756477f68c-vr5gt" event={"ID":"87950a32-a688-49e7-ab86-7057f156e114","Type":"ContainerStarted","Data":"034692be94372df2a43a288de1d4ecf3d89b345f9e90ebeaf07d42182210327e"} Dec 05 22:11:28 crc kubenswrapper[4747]: I1205 22:11:28.364670 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wx9t2" event={"ID":"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3","Type":"ContainerStarted","Data":"9f021ef4e23e1b338cc303cdafd313e622f6613e22b8426b355b12ed35aa410d"} Dec 05 22:11:28 crc kubenswrapper[4747]: I1205 22:11:28.364716 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wx9t2" event={"ID":"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3","Type":"ContainerStarted","Data":"ac7b4a1282d1a331ed10582d5cb82bbe128eb15b04589f3686c179dd77f1c45c"} Dec 05 22:11:28 crc kubenswrapper[4747]: I1205 22:11:28.410392 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wx9t2" podStartSLOduration=2.410372433 podStartE2EDuration="2.410372433s" podCreationTimestamp="2025-12-05 22:11:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:11:28.40026769 +0000 UTC m=+5358.867575168" watchObservedRunningTime="2025-12-05 22:11:28.410372433 +0000 UTC m=+5358.877679921" Dec 05 22:11:28 crc kubenswrapper[4747]: I1205 22:11:28.944869 4747 scope.go:117] "RemoveContainer" containerID="cf7c138ad99b436ad40dc321c3992dba39134f080ac8549124369f8b3d39d4df" Dec 05 22:11:28 crc kubenswrapper[4747]: I1205 22:11:28.976255 4747 scope.go:117] "RemoveContainer" containerID="aad08731ff35ef3e9d19f6fe4363906dd55bdd048bee9ce9710d472b8cd7bf6d" Dec 05 
22:11:29 crc kubenswrapper[4747]: I1205 22:11:29.381736 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-756477f68c-vr5gt" event={"ID":"87950a32-a688-49e7-ab86-7057f156e114","Type":"ContainerStarted","Data":"0043b254c9707faa81363b30e91989db2c3b151d82a61880d5fec69870e71294"} Dec 05 22:11:29 crc kubenswrapper[4747]: I1205 22:11:29.381819 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:29 crc kubenswrapper[4747]: I1205 22:11:29.407425 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-756477f68c-vr5gt" podStartSLOduration=3.407407334 podStartE2EDuration="3.407407334s" podCreationTimestamp="2025-12-05 22:11:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:11:29.404685596 +0000 UTC m=+5359.871993094" watchObservedRunningTime="2025-12-05 22:11:29.407407334 +0000 UTC m=+5359.874714822" Dec 05 22:11:31 crc kubenswrapper[4747]: I1205 22:11:31.401653 4747 generic.go:334] "Generic (PLEG): container finished" podID="d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3" containerID="9f021ef4e23e1b338cc303cdafd313e622f6613e22b8426b355b12ed35aa410d" exitCode=0 Dec 05 22:11:31 crc kubenswrapper[4747]: I1205 22:11:31.401735 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wx9t2" event={"ID":"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3","Type":"ContainerDied","Data":"9f021ef4e23e1b338cc303cdafd313e622f6613e22b8426b355b12ed35aa410d"} Dec 05 22:11:32 crc kubenswrapper[4747]: I1205 22:11:32.788502 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:32 crc kubenswrapper[4747]: I1205 22:11:32.937063 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plpvf\" (UniqueName: \"kubernetes.io/projected/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-kube-api-access-plpvf\") pod \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " Dec 05 22:11:32 crc kubenswrapper[4747]: I1205 22:11:32.937475 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-fernet-keys\") pod \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " Dec 05 22:11:32 crc kubenswrapper[4747]: I1205 22:11:32.937531 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-scripts\") pod \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " Dec 05 22:11:32 crc kubenswrapper[4747]: I1205 22:11:32.937610 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-credential-keys\") pod \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " Dec 05 22:11:32 crc kubenswrapper[4747]: I1205 22:11:32.937658 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-config-data\") pod \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " 
Dec 05 22:11:32 crc kubenswrapper[4747]: I1205 22:11:32.937801 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-combined-ca-bundle\") pod \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\" (UID: \"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3\") " Dec 05 22:11:32 crc kubenswrapper[4747]: I1205 22:11:32.942658 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3" (UID: "d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:11:32 crc kubenswrapper[4747]: I1205 22:11:32.942691 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3" (UID: "d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:11:32 crc kubenswrapper[4747]: I1205 22:11:32.942906 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-kube-api-access-plpvf" (OuterVolumeSpecName: "kube-api-access-plpvf") pod "d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3" (UID: "d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3"). InnerVolumeSpecName "kube-api-access-plpvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:11:32 crc kubenswrapper[4747]: I1205 22:11:32.943673 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-scripts" (OuterVolumeSpecName: "scripts") pod "d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3" (UID: "d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:11:32 crc kubenswrapper[4747]: I1205 22:11:32.958932 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-config-data" (OuterVolumeSpecName: "config-data") pod "d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3" (UID: "d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:11:32 crc kubenswrapper[4747]: I1205 22:11:32.977475 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3" (UID: "d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.039710 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.039745 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.039757 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.039768 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.039779 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.039789 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plpvf\" (UniqueName: \"kubernetes.io/projected/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3-kube-api-access-plpvf\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.423275 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wx9t2" event={"ID":"d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3","Type":"ContainerDied","Data":"ac7b4a1282d1a331ed10582d5cb82bbe128eb15b04589f3686c179dd77f1c45c"} Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.423323 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac7b4a1282d1a331ed10582d5cb82bbe128eb15b04589f3686c179dd77f1c45c" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.423334 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wx9t2" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.623955 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wx9t2"] Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.630043 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wx9t2"] Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.732745 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-czqjs"] Dec 05 22:11:33 crc kubenswrapper[4747]: E1205 22:11:33.733127 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3" containerName="keystone-bootstrap" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.733142 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3" containerName="keystone-bootstrap" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.733325 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3" containerName="keystone-bootstrap" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.734000 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.735841 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.736126 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.739636 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.741788 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4tt76" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.742355 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.767146 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-czqjs"] Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.855635 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3" path="/var/lib/kubelet/pods/d1ddf7cf-afb7-4eb0-ac52-1ea5ed80c7e3/volumes" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.856548 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmvc7\" (UniqueName: \"kubernetes.io/projected/b5638839-25e0-4009-9cca-59dea6eb0612-kube-api-access-qmvc7\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.856606 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-config-data\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.856650 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-combined-ca-bundle\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.856695 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-scripts\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.856834 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-credential-keys\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.856988 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-fernet-keys\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.959466 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmvc7\" (UniqueName: \"kubernetes.io/projected/b5638839-25e0-4009-9cca-59dea6eb0612-kube-api-access-qmvc7\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.959524 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-config-data\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.959630 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-combined-ca-bundle\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.959713 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-scripts\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.959772 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-credential-keys\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.960049 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-fernet-keys\") pod 
\"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.968234 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-fernet-keys\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.969343 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-combined-ca-bundle\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.970230 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-config-data\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.971299 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-credential-keys\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.973125 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-scripts\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:33 crc kubenswrapper[4747]: I1205 22:11:33.981304 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmvc7\" (UniqueName: \"kubernetes.io/projected/b5638839-25e0-4009-9cca-59dea6eb0612-kube-api-access-qmvc7\") pod \"keystone-bootstrap-czqjs\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") " pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:34 crc kubenswrapper[4747]: I1205 22:11:34.060387 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:34 crc kubenswrapper[4747]: I1205 22:11:34.544163 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-czqjs"] Dec 05 22:11:34 crc kubenswrapper[4747]: W1205 22:11:34.551210 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5638839_25e0_4009_9cca_59dea6eb0612.slice/crio-2194cd2ff8d533dcb2a14459a0bcb0afcf1cfd807a7b5563bc65a3a88f98ce36 WatchSource:0}: Error finding container 2194cd2ff8d533dcb2a14459a0bcb0afcf1cfd807a7b5563bc65a3a88f98ce36: Status 404 returned error can't find the container with id 2194cd2ff8d533dcb2a14459a0bcb0afcf1cfd807a7b5563bc65a3a88f98ce36 Dec 05 22:11:35 crc kubenswrapper[4747]: I1205 22:11:35.443284 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-czqjs" event={"ID":"b5638839-25e0-4009-9cca-59dea6eb0612","Type":"ContainerStarted","Data":"cb063a1338d2fd5004661f21a5dabdc316bb0bc7d295752f985634260faefd47"} Dec 05 22:11:35 crc kubenswrapper[4747]: I1205 22:11:35.444815 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-czqjs" event={"ID":"b5638839-25e0-4009-9cca-59dea6eb0612","Type":"ContainerStarted","Data":"2194cd2ff8d533dcb2a14459a0bcb0afcf1cfd807a7b5563bc65a3a88f98ce36"} Dec 05 22:11:35 crc kubenswrapper[4747]: I1205 22:11:35.474442 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-czqjs" podStartSLOduration=2.474412369 podStartE2EDuration="2.474412369s" podCreationTimestamp="2025-12-05 22:11:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:11:35.465741932 +0000 UTC m=+5365.933049490" watchObservedRunningTime="2025-12-05 22:11:35.474412369 +0000 UTC m=+5365.941719897" Dec 05 22:11:36 crc kubenswrapper[4747]: I1205 22:11:36.879680 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:11:36 crc kubenswrapper[4747]: I1205 22:11:36.942463 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d99ccddc9-xvc75"] Dec 05 22:11:36 crc kubenswrapper[4747]: I1205 22:11:36.942750 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75" podUID="720cb589-2deb-499c-a051-0af6e76968cf" containerName="dnsmasq-dns" containerID="cri-o://7ee36529f21c0e0076099b1764b97b8f815392cf1691acaf10eb07ca5a1f20af" gracePeriod=10 Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.455794 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.459634 4747 generic.go:334] "Generic (PLEG): container finished" podID="720cb589-2deb-499c-a051-0af6e76968cf" containerID="7ee36529f21c0e0076099b1764b97b8f815392cf1691acaf10eb07ca5a1f20af" exitCode=0 Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.459701 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.459733 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75" event={"ID":"720cb589-2deb-499c-a051-0af6e76968cf","Type":"ContainerDied","Data":"7ee36529f21c0e0076099b1764b97b8f815392cf1691acaf10eb07ca5a1f20af"} Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.459781 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d99ccddc9-xvc75" event={"ID":"720cb589-2deb-499c-a051-0af6e76968cf","Type":"ContainerDied","Data":"a2d4b89b825fb84b5f86c7ef24896f6516f24c3937a8cd24e565a75a349c1bae"} Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.459848 4747 scope.go:117] "RemoveContainer" containerID="7ee36529f21c0e0076099b1764b97b8f815392cf1691acaf10eb07ca5a1f20af" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.461569 4747 generic.go:334] "Generic (PLEG): container finished" podID="b5638839-25e0-4009-9cca-59dea6eb0612" containerID="cb063a1338d2fd5004661f21a5dabdc316bb0bc7d295752f985634260faefd47" exitCode=0 Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.461620 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-czqjs" event={"ID":"b5638839-25e0-4009-9cca-59dea6eb0612","Type":"ContainerDied","Data":"cb063a1338d2fd5004661f21a5dabdc316bb0bc7d295752f985634260faefd47"} Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.479876 4747 scope.go:117] "RemoveContainer" containerID="161c608d2053d658c6b390fdefaccbbfe67acbd87b02225ea7d053c4feca1bee" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.522905 4747 scope.go:117] "RemoveContainer" containerID="7ee36529f21c0e0076099b1764b97b8f815392cf1691acaf10eb07ca5a1f20af" Dec 05 22:11:37 crc kubenswrapper[4747]: E1205 22:11:37.524029 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ee36529f21c0e0076099b1764b97b8f815392cf1691acaf10eb07ca5a1f20af\": container with ID starting with 7ee36529f21c0e0076099b1764b97b8f815392cf1691acaf10eb07ca5a1f20af not found: ID does not exist" containerID="7ee36529f21c0e0076099b1764b97b8f815392cf1691acaf10eb07ca5a1f20af" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.524075 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ee36529f21c0e0076099b1764b97b8f815392cf1691acaf10eb07ca5a1f20af"} err="failed to get container status \"7ee36529f21c0e0076099b1764b97b8f815392cf1691acaf10eb07ca5a1f20af\": rpc error: code = NotFound desc = could not find container \"7ee36529f21c0e0076099b1764b97b8f815392cf1691acaf10eb07ca5a1f20af\": container with ID starting with 7ee36529f21c0e0076099b1764b97b8f815392cf1691acaf10eb07ca5a1f20af not found: ID does not exist" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.524095 4747 scope.go:117] "RemoveContainer" containerID="161c608d2053d658c6b390fdefaccbbfe67acbd87b02225ea7d053c4feca1bee" Dec 05 22:11:37 crc kubenswrapper[4747]: E1205 22:11:37.524561 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"161c608d2053d658c6b390fdefaccbbfe67acbd87b02225ea7d053c4feca1bee\": container with ID starting with 161c608d2053d658c6b390fdefaccbbfe67acbd87b02225ea7d053c4feca1bee not found: ID does not exist" containerID="161c608d2053d658c6b390fdefaccbbfe67acbd87b02225ea7d053c4feca1bee" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 
22:11:37.524664 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"161c608d2053d658c6b390fdefaccbbfe67acbd87b02225ea7d053c4feca1bee"} err="failed to get container status \"161c608d2053d658c6b390fdefaccbbfe67acbd87b02225ea7d053c4feca1bee\": rpc error: code = NotFound desc = could not find container \"161c608d2053d658c6b390fdefaccbbfe67acbd87b02225ea7d053c4feca1bee\": container with ID starting with 161c608d2053d658c6b390fdefaccbbfe67acbd87b02225ea7d053c4feca1bee not found: ID does not exist" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.624729 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-ovsdbserver-nb\") pod \"720cb589-2deb-499c-a051-0af6e76968cf\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.624805 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz8vs\" (UniqueName: \"kubernetes.io/projected/720cb589-2deb-499c-a051-0af6e76968cf-kube-api-access-cz8vs\") pod \"720cb589-2deb-499c-a051-0af6e76968cf\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.624879 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-ovsdbserver-sb\") pod \"720cb589-2deb-499c-a051-0af6e76968cf\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.624914 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-config\") pod \"720cb589-2deb-499c-a051-0af6e76968cf\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.624982 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-dns-svc\") pod \"720cb589-2deb-499c-a051-0af6e76968cf\" (UID: \"720cb589-2deb-499c-a051-0af6e76968cf\") " Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.631407 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720cb589-2deb-499c-a051-0af6e76968cf-kube-api-access-cz8vs" (OuterVolumeSpecName: "kube-api-access-cz8vs") pod "720cb589-2deb-499c-a051-0af6e76968cf" (UID: "720cb589-2deb-499c-a051-0af6e76968cf"). InnerVolumeSpecName "kube-api-access-cz8vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.673266 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "720cb589-2deb-499c-a051-0af6e76968cf" (UID: "720cb589-2deb-499c-a051-0af6e76968cf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.678098 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "720cb589-2deb-499c-a051-0af6e76968cf" (UID: "720cb589-2deb-499c-a051-0af6e76968cf"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.679384 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-config" (OuterVolumeSpecName: "config") pod "720cb589-2deb-499c-a051-0af6e76968cf" (UID: "720cb589-2deb-499c-a051-0af6e76968cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.681682 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "720cb589-2deb-499c-a051-0af6e76968cf" (UID: "720cb589-2deb-499c-a051-0af6e76968cf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.727029 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.727327 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.727503 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.727665 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/720cb589-2deb-499c-a051-0af6e76968cf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.727787 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz8vs\" (UniqueName: \"kubernetes.io/projected/720cb589-2deb-499c-a051-0af6e76968cf-kube-api-access-cz8vs\") on node \"crc\" DevicePath \"\"" Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.808825 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d99ccddc9-xvc75"] Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.816010 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d99ccddc9-xvc75"] Dec 05 22:11:37 crc kubenswrapper[4747]: I1205 22:11:37.861338 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="720cb589-2deb-499c-a051-0af6e76968cf" path="/var/lib/kubelet/pods/720cb589-2deb-499c-a051-0af6e76968cf/volumes" Dec 05 22:11:38 crc kubenswrapper[4747]: I1205 22:11:38.862477 4747 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 22:11:38 crc kubenswrapper[4747]: I1205 22:11:38.947137 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-credential-keys\") pod \"b5638839-25e0-4009-9cca-59dea6eb0612\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") "
Dec 05 22:11:38 crc kubenswrapper[4747]: I1205 22:11:38.947259 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmvc7\" (UniqueName: \"kubernetes.io/projected/b5638839-25e0-4009-9cca-59dea6eb0612-kube-api-access-qmvc7\") pod \"b5638839-25e0-4009-9cca-59dea6eb0612\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") "
Dec 05 22:11:38 crc kubenswrapper[4747]: I1205 22:11:38.947318 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-config-data\") pod \"b5638839-25e0-4009-9cca-59dea6eb0612\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") "
Dec 05 22:11:38 crc kubenswrapper[4747]: I1205 22:11:38.947466 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-fernet-keys\") pod \"b5638839-25e0-4009-9cca-59dea6eb0612\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") "
Dec 05 22:11:38 crc kubenswrapper[4747]: I1205 22:11:38.947715 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-scripts\") pod \"b5638839-25e0-4009-9cca-59dea6eb0612\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") "
Dec 05 22:11:38 crc kubenswrapper[4747]: I1205 22:11:38.948355 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-combined-ca-bundle\") pod \"b5638839-25e0-4009-9cca-59dea6eb0612\" (UID: \"b5638839-25e0-4009-9cca-59dea6eb0612\") "
Dec 05 22:11:38 crc kubenswrapper[4747]: I1205 22:11:38.952619 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-scripts" (OuterVolumeSpecName: "scripts") pod "b5638839-25e0-4009-9cca-59dea6eb0612" (UID: "b5638839-25e0-4009-9cca-59dea6eb0612"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:11:38 crc kubenswrapper[4747]: I1205 22:11:38.952865 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b5638839-25e0-4009-9cca-59dea6eb0612" (UID: "b5638839-25e0-4009-9cca-59dea6eb0612"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:11:38 crc kubenswrapper[4747]: I1205 22:11:38.953147 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5638839-25e0-4009-9cca-59dea6eb0612-kube-api-access-qmvc7" (OuterVolumeSpecName: "kube-api-access-qmvc7") pod "b5638839-25e0-4009-9cca-59dea6eb0612" (UID: "b5638839-25e0-4009-9cca-59dea6eb0612"). InnerVolumeSpecName "kube-api-access-qmvc7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 22:11:38 crc kubenswrapper[4747]: I1205 22:11:38.954407 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b5638839-25e0-4009-9cca-59dea6eb0612" (UID: "b5638839-25e0-4009-9cca-59dea6eb0612"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:11:38 crc kubenswrapper[4747]: I1205 22:11:38.979932 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5638839-25e0-4009-9cca-59dea6eb0612" (UID: "b5638839-25e0-4009-9cca-59dea6eb0612"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:11:38 crc kubenswrapper[4747]: I1205 22:11:38.992139 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-config-data" (OuterVolumeSpecName: "config-data") pod "b5638839-25e0-4009-9cca-59dea6eb0612" (UID: "b5638839-25e0-4009-9cca-59dea6eb0612"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.050258 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.050305 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.050320 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.050334 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.050346 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmvc7\" (UniqueName: \"kubernetes.io/projected/b5638839-25e0-4009-9cca-59dea6eb0612-kube-api-access-qmvc7\") on node \"crc\" DevicePath \"\""
Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.050357 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5638839-25e0-4009-9cca-59dea6eb0612-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.485142 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-czqjs" event={"ID":"b5638839-25e0-4009-9cca-59dea6eb0612","Type":"ContainerDied","Data":"2194cd2ff8d533dcb2a14459a0bcb0afcf1cfd807a7b5563bc65a3a88f98ce36"}
Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.485295 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2194cd2ff8d533dcb2a14459a0bcb0afcf1cfd807a7b5563bc65a3a88f98ce36"
Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.485400 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-czqjs"
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-czqjs" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.585498 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-55748bc777-pzh2t"] Dec 05 22:11:39 crc kubenswrapper[4747]: E1205 22:11:39.585916 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720cb589-2deb-499c-a051-0af6e76968cf" containerName="dnsmasq-dns" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.585940 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="720cb589-2deb-499c-a051-0af6e76968cf" containerName="dnsmasq-dns" Dec 05 22:11:39 crc kubenswrapper[4747]: E1205 22:11:39.585961 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5638839-25e0-4009-9cca-59dea6eb0612" containerName="keystone-bootstrap" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.585971 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5638839-25e0-4009-9cca-59dea6eb0612" containerName="keystone-bootstrap" Dec 05 22:11:39 crc kubenswrapper[4747]: E1205 22:11:39.586005 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720cb589-2deb-499c-a051-0af6e76968cf" containerName="init" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.586015 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="720cb589-2deb-499c-a051-0af6e76968cf" containerName="init" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.586205 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="720cb589-2deb-499c-a051-0af6e76968cf" containerName="dnsmasq-dns" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.586236 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5638839-25e0-4009-9cca-59dea6eb0612" containerName="keystone-bootstrap" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.586926 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.593293 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.593379 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.593393 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.593456 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.593468 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4tt76" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.593580 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.595466 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55748bc777-pzh2t"] Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.658351 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-public-tls-certs\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.658419 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-scripts\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.658511 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-internal-tls-certs\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.658542 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-credential-keys\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.658686 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-config-data\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.658713 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-combined-ca-bundle\") pod \"keystone-55748bc777-pzh2t\" (UID: 
\"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.658746 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-fernet-keys\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.658774 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcp2d\" (UniqueName: \"kubernetes.io/projected/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-kube-api-access-hcp2d\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.760253 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-public-tls-certs\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.760310 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-scripts\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.760418 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-internal-tls-certs\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.760457 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-credential-keys\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.760527 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-config-data\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.760560 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-combined-ca-bundle\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.760619 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-fernet-keys\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 
05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.760644 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcp2d\" (UniqueName: \"kubernetes.io/projected/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-kube-api-access-hcp2d\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.764529 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-internal-tls-certs\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.764703 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-combined-ca-bundle\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.764754 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-credential-keys\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.765333 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-fernet-keys\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.765820 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-public-tls-certs\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.766054 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-scripts\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.766069 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-config-data\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.783027 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcp2d\" (UniqueName: \"kubernetes.io/projected/13d471a9-2ca8-46d2-8be1-3b2354fe8fbb-kube-api-access-hcp2d\") pod \"keystone-55748bc777-pzh2t\" (UID: \"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb\") " pod="openstack/keystone-55748bc777-pzh2t" Dec 05 22:11:39 crc kubenswrapper[4747]: I1205 22:11:39.907856 4747 util.go:30] "No sandbox for pod can be found. 
Dec 05 22:11:40 crc kubenswrapper[4747]: I1205 22:11:40.366537 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55748bc777-pzh2t"]
Dec 05 22:11:40 crc kubenswrapper[4747]: W1205 22:11:40.389064 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13d471a9_2ca8_46d2_8be1_3b2354fe8fbb.slice/crio-2d91eb0b568dd5992d973ef3df69d04339372b7dfe04782df670a93deea223d5 WatchSource:0}: Error finding container 2d91eb0b568dd5992d973ef3df69d04339372b7dfe04782df670a93deea223d5: Status 404 returned error can't find the container with id 2d91eb0b568dd5992d973ef3df69d04339372b7dfe04782df670a93deea223d5
Dec 05 22:11:40 crc kubenswrapper[4747]: I1205 22:11:40.495442 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55748bc777-pzh2t" event={"ID":"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb","Type":"ContainerStarted","Data":"2d91eb0b568dd5992d973ef3df69d04339372b7dfe04782df670a93deea223d5"}
Dec 05 22:11:41 crc kubenswrapper[4747]: I1205 22:11:41.505414 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55748bc777-pzh2t" event={"ID":"13d471a9-2ca8-46d2-8be1-3b2354fe8fbb","Type":"ContainerStarted","Data":"53edf2e99881374b506691d63b6ea1591deff7c2fc6f1822886b3ef4f485ca1b"}
Dec 05 22:11:41 crc kubenswrapper[4747]: I1205 22:11:41.505835 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-55748bc777-pzh2t"
Dec 05 22:11:41 crc kubenswrapper[4747]: I1205 22:11:41.534548 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-55748bc777-pzh2t" podStartSLOduration=2.534525279 podStartE2EDuration="2.534525279s" podCreationTimestamp="2025-12-05 22:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:11:41.529126015 +0000 UTC m=+5371.996433503" watchObservedRunningTime="2025-12-05 22:11:41.534525279 +0000 UTC m=+5372.001832807"
Dec 05 22:12:11 crc kubenswrapper[4747]: I1205 22:12:11.651906 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-55748bc777-pzh2t"
Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.164447 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.167110 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.173074 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-fzqvh"
Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.173556 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.173962 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.182123 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.239290 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b0df9ca5-9db8-4c89-b160-154db180c7cc-openstack-config\") pod \"openstackclient\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " pod="openstack/openstackclient"
Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.239446 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0df9ca5-9db8-4c89-b160-154db180c7cc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " pod="openstack/openstackclient"
Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.239524 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qblg\" (UniqueName: \"kubernetes.io/projected/b0df9ca5-9db8-4c89-b160-154db180c7cc-kube-api-access-9qblg\") pod \"openstackclient\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " pod="openstack/openstackclient"
Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.239633 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b0df9ca5-9db8-4c89-b160-154db180c7cc-openstack-config-secret\") pod \"openstackclient\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " pod="openstack/openstackclient"
Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.341289 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0df9ca5-9db8-4c89-b160-154db180c7cc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " pod="openstack/openstackclient"
Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.341354 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qblg\" (UniqueName: \"kubernetes.io/projected/b0df9ca5-9db8-4c89-b160-154db180c7cc-kube-api-access-9qblg\") pod \"openstackclient\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " pod="openstack/openstackclient"
Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.341414 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b0df9ca5-9db8-4c89-b160-154db180c7cc-openstack-config-secret\") pod \"openstackclient\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " pod="openstack/openstackclient"
Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.341509 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b0df9ca5-9db8-4c89-b160-154db180c7cc-openstack-config\") pod \"openstackclient\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " pod="openstack/openstackclient"
for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b0df9ca5-9db8-4c89-b160-154db180c7cc-openstack-config\") pod \"openstackclient\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " pod="openstack/openstackclient" Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.342635 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b0df9ca5-9db8-4c89-b160-154db180c7cc-openstack-config\") pod \"openstackclient\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " pod="openstack/openstackclient" Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.350436 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0df9ca5-9db8-4c89-b160-154db180c7cc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " pod="openstack/openstackclient" Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.350481 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b0df9ca5-9db8-4c89-b160-154db180c7cc-openstack-config-secret\") pod \"openstackclient\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " pod="openstack/openstackclient" Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.365793 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qblg\" (UniqueName: \"kubernetes.io/projected/b0df9ca5-9db8-4c89-b160-154db180c7cc-kube-api-access-9qblg\") pod \"openstackclient\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " pod="openstack/openstackclient" Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.511469 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 22:12:15 crc kubenswrapper[4747]: I1205 22:12:15.953866 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 22:12:16 crc kubenswrapper[4747]: I1205 22:12:16.872643 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b0df9ca5-9db8-4c89-b160-154db180c7cc","Type":"ContainerStarted","Data":"4b35cd217d29af685a661724277f692ab3128f1e51e80878f2b33fb91ac7b314"} Dec 05 22:12:16 crc kubenswrapper[4747]: I1205 22:12:16.873028 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b0df9ca5-9db8-4c89-b160-154db180c7cc","Type":"ContainerStarted","Data":"50e62c43e081a7afc984b0fa3a7ca2a36f1b57efb34b9295c4e5e06ebb75567c"} Dec 05 22:12:16 crc kubenswrapper[4747]: I1205 22:12:16.896575 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.8965551349999998 podStartE2EDuration="1.896555135s" podCreationTimestamp="2025-12-05 22:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:12:16.891440637 +0000 UTC m=+5407.358748185" watchObservedRunningTime="2025-12-05 22:12:16.896555135 +0000 UTC m=+5407.363862633" Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.100731 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wsbtv"] Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.104789 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.109104 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wsbtv"] Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.206742 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-catalog-content\") pod \"community-operators-wsbtv\" (UID: \"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0\") " pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.206812 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pln6t\" (UniqueName: \"kubernetes.io/projected/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-kube-api-access-pln6t\") pod \"community-operators-wsbtv\" (UID: \"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0\") " pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.206881 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-utilities\") pod \"community-operators-wsbtv\" (UID: \"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0\") " pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.308713 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-catalog-content\") pod \"community-operators-wsbtv\" (UID: \"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0\") " pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.308789 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pln6t\" (UniqueName: \"kubernetes.io/projected/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-kube-api-access-pln6t\") pod \"community-operators-wsbtv\" (UID: \"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0\") " pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.308840 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-utilities\") pod \"community-operators-wsbtv\" (UID: \"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0\") " pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.309160 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-catalog-content\") pod \"community-operators-wsbtv\" (UID: \"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0\") " pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.309269 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-utilities\") pod \"community-operators-wsbtv\" (UID: \"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0\") " pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.328811 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pln6t\" (UniqueName: \"kubernetes.io/projected/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-kube-api-access-pln6t\") pod \"community-operators-wsbtv\" (UID: \"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0\") " pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.428789 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.755938 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wsbtv"] Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.976282 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsbtv" event={"ID":"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0","Type":"ContainerStarted","Data":"ff1f570e7b47718713dc43fdb15919d433ec6c88d8c45cada06729366eeccfb3"} Dec 05 22:12:24 crc kubenswrapper[4747]: I1205 22:12:24.976374 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsbtv" event={"ID":"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0","Type":"ContainerStarted","Data":"12cf93d2f0fb3dff7f1f4dc1f506e29350af504a182a753e697fb0d1d2cc5b17"} Dec 05 22:12:25 crc kubenswrapper[4747]: I1205 22:12:25.989743 4747 generic.go:334] "Generic (PLEG): container finished" podID="46c97a11-4052-49e5-9ee6-a11ea6bbf4d0" containerID="ff1f570e7b47718713dc43fdb15919d433ec6c88d8c45cada06729366eeccfb3" exitCode=0 Dec 05 22:12:25 crc kubenswrapper[4747]: I1205 22:12:25.990017 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsbtv" event={"ID":"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0","Type":"ContainerDied","Data":"ff1f570e7b47718713dc43fdb15919d433ec6c88d8c45cada06729366eeccfb3"} Dec 05 22:12:27 crc kubenswrapper[4747]: I1205 22:12:27.001121 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsbtv" event={"ID":"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0","Type":"ContainerStarted","Data":"9cc2b5cded7a0d65f2a35ed8c2be2f5e758ea95835e6e1960aef363ae94a64e1"} Dec 05 22:12:28 crc kubenswrapper[4747]: I1205 22:12:28.013551 4747 generic.go:334] "Generic (PLEG): container finished" podID="46c97a11-4052-49e5-9ee6-a11ea6bbf4d0" containerID="9cc2b5cded7a0d65f2a35ed8c2be2f5e758ea95835e6e1960aef363ae94a64e1" exitCode=0 Dec 05 22:12:28 crc kubenswrapper[4747]: I1205 22:12:28.013642 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsbtv" event={"ID":"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0","Type":"ContainerDied","Data":"9cc2b5cded7a0d65f2a35ed8c2be2f5e758ea95835e6e1960aef363ae94a64e1"} Dec 05 22:12:29 crc kubenswrapper[4747]: I1205 22:12:29.024448 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsbtv" event={"ID":"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0","Type":"ContainerStarted","Data":"91361d8d775704c8ab42300400cabc108f2d62602bea0e2ebebd5783d03ec0c0"} Dec 05 22:12:29 crc kubenswrapper[4747]: I1205 22:12:29.049927 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wsbtv" podStartSLOduration=2.644023713 podStartE2EDuration="5.049911815s" podCreationTimestamp="2025-12-05 22:12:24 +0000 UTC" firstStartedPulling="2025-12-05 22:12:26.011125105 +0000 UTC m=+5416.478432593" lastFinishedPulling="2025-12-05 
22:12:28.417013217 +0000 UTC m=+5418.884320695" observedRunningTime="2025-12-05 22:12:29.046419418 +0000 UTC m=+5419.513726916" watchObservedRunningTime="2025-12-05 22:12:29.049911815 +0000 UTC m=+5419.517219303" Dec 05 22:12:29 crc kubenswrapper[4747]: I1205 22:12:29.073999 4747 scope.go:117] "RemoveContainer" containerID="261c0542851e335016bf25fef1ec464158ad43e301561f58b5dd12573d8fca9f" Dec 05 22:12:29 crc kubenswrapper[4747]: I1205 22:12:29.101987 4747 scope.go:117] "RemoveContainer" containerID="e8fd5fcb110707081f3ec156a060c9832a3f67461b11fa636f22384afddc4870" Dec 05 22:12:34 crc kubenswrapper[4747]: I1205 22:12:34.429094 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:34 crc kubenswrapper[4747]: I1205 22:12:34.429739 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:34 crc kubenswrapper[4747]: I1205 22:12:34.474957 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:35 crc kubenswrapper[4747]: I1205 22:12:35.166803 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:35 crc kubenswrapper[4747]: I1205 22:12:35.226463 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wsbtv"] Dec 05 22:12:37 crc kubenswrapper[4747]: I1205 22:12:37.108729 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wsbtv" podUID="46c97a11-4052-49e5-9ee6-a11ea6bbf4d0" containerName="registry-server" containerID="cri-o://91361d8d775704c8ab42300400cabc108f2d62602bea0e2ebebd5783d03ec0c0" gracePeriod=2 Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.118191 4747 generic.go:334] "Generic (PLEG): container finished" podID="46c97a11-4052-49e5-9ee6-a11ea6bbf4d0" containerID="91361d8d775704c8ab42300400cabc108f2d62602bea0e2ebebd5783d03ec0c0" exitCode=0 Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.118270 4747 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.118275 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsbtv" event={"ID":"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0","Type":"ContainerDied","Data":"91361d8d775704c8ab42300400cabc108f2d62602bea0e2ebebd5783d03ec0c0"}
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.118628 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsbtv" event={"ID":"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0","Type":"ContainerDied","Data":"12cf93d2f0fb3dff7f1f4dc1f506e29350af504a182a753e697fb0d1d2cc5b17"}
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.118650 4747 scope.go:117] "RemoveContainer" containerID="91361d8d775704c8ab42300400cabc108f2d62602bea0e2ebebd5783d03ec0c0"
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.160751 4747 scope.go:117] "RemoveContainer" containerID="9cc2b5cded7a0d65f2a35ed8c2be2f5e758ea95835e6e1960aef363ae94a64e1"
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.184929 4747 scope.go:117] "RemoveContainer" containerID="ff1f570e7b47718713dc43fdb15919d433ec6c88d8c45cada06729366eeccfb3"
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.212720 4747 scope.go:117] "RemoveContainer" containerID="91361d8d775704c8ab42300400cabc108f2d62602bea0e2ebebd5783d03ec0c0"
Dec 05 22:12:38 crc kubenswrapper[4747]: E1205 22:12:38.213107 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91361d8d775704c8ab42300400cabc108f2d62602bea0e2ebebd5783d03ec0c0\": container with ID starting with 91361d8d775704c8ab42300400cabc108f2d62602bea0e2ebebd5783d03ec0c0 not found: ID does not exist" containerID="91361d8d775704c8ab42300400cabc108f2d62602bea0e2ebebd5783d03ec0c0"
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.213162 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91361d8d775704c8ab42300400cabc108f2d62602bea0e2ebebd5783d03ec0c0"} err="failed to get container status \"91361d8d775704c8ab42300400cabc108f2d62602bea0e2ebebd5783d03ec0c0\": rpc error: code = NotFound desc = could not find container \"91361d8d775704c8ab42300400cabc108f2d62602bea0e2ebebd5783d03ec0c0\": container with ID starting with 91361d8d775704c8ab42300400cabc108f2d62602bea0e2ebebd5783d03ec0c0 not found: ID does not exist"
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.213191 4747 scope.go:117] "RemoveContainer" containerID="9cc2b5cded7a0d65f2a35ed8c2be2f5e758ea95835e6e1960aef363ae94a64e1"
Dec 05 22:12:38 crc kubenswrapper[4747]: E1205 22:12:38.213772 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cc2b5cded7a0d65f2a35ed8c2be2f5e758ea95835e6e1960aef363ae94a64e1\": container with ID starting with 9cc2b5cded7a0d65f2a35ed8c2be2f5e758ea95835e6e1960aef363ae94a64e1 not found: ID does not exist" containerID="9cc2b5cded7a0d65f2a35ed8c2be2f5e758ea95835e6e1960aef363ae94a64e1"
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.213803 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc2b5cded7a0d65f2a35ed8c2be2f5e758ea95835e6e1960aef363ae94a64e1"} err="failed to get container status \"9cc2b5cded7a0d65f2a35ed8c2be2f5e758ea95835e6e1960aef363ae94a64e1\": rpc error: code = NotFound desc = could not find container \"9cc2b5cded7a0d65f2a35ed8c2be2f5e758ea95835e6e1960aef363ae94a64e1\": container with ID starting with 9cc2b5cded7a0d65f2a35ed8c2be2f5e758ea95835e6e1960aef363ae94a64e1 not found: ID does not exist"
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.213820 4747 scope.go:117] "RemoveContainer" containerID="ff1f570e7b47718713dc43fdb15919d433ec6c88d8c45cada06729366eeccfb3"
Dec 05 22:12:38 crc kubenswrapper[4747]: E1205 22:12:38.214113 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff1f570e7b47718713dc43fdb15919d433ec6c88d8c45cada06729366eeccfb3\": container with ID starting with ff1f570e7b47718713dc43fdb15919d433ec6c88d8c45cada06729366eeccfb3 not found: ID does not exist" containerID="ff1f570e7b47718713dc43fdb15919d433ec6c88d8c45cada06729366eeccfb3"
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.214145 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff1f570e7b47718713dc43fdb15919d433ec6c88d8c45cada06729366eeccfb3"} err="failed to get container status \"ff1f570e7b47718713dc43fdb15919d433ec6c88d8c45cada06729366eeccfb3\": rpc error: code = NotFound desc = could not find container \"ff1f570e7b47718713dc43fdb15919d433ec6c88d8c45cada06729366eeccfb3\": container with ID starting with ff1f570e7b47718713dc43fdb15919d433ec6c88d8c45cada06729366eeccfb3 not found: ID does not exist"
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.287259 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pln6t\" (UniqueName: \"kubernetes.io/projected/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-kube-api-access-pln6t\") pod \"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0\" (UID: \"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0\") "
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.287333 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-catalog-content\") pod \"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0\" (UID: \"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0\") "
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.287379 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-utilities\") pod \"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0\" (UID: \"46c97a11-4052-49e5-9ee6-a11ea6bbf4d0\") "
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.288831 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-utilities" (OuterVolumeSpecName: "utilities") pod "46c97a11-4052-49e5-9ee6-a11ea6bbf4d0" (UID: "46c97a11-4052-49e5-9ee6-a11ea6bbf4d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.294775 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-kube-api-access-pln6t" (OuterVolumeSpecName: "kube-api-access-pln6t") pod "46c97a11-4052-49e5-9ee6-a11ea6bbf4d0" (UID: "46c97a11-4052-49e5-9ee6-a11ea6bbf4d0"). InnerVolumeSpecName "kube-api-access-pln6t". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.335977 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46c97a11-4052-49e5-9ee6-a11ea6bbf4d0" (UID: "46c97a11-4052-49e5-9ee6-a11ea6bbf4d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.389183 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.389222 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pln6t\" (UniqueName: \"kubernetes.io/projected/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-kube-api-access-pln6t\") on node \"crc\" DevicePath \"\"" Dec 05 22:12:38 crc kubenswrapper[4747]: I1205 22:12:38.389232 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:12:39 crc kubenswrapper[4747]: I1205 22:12:39.130501 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wsbtv" Dec 05 22:12:39 crc kubenswrapper[4747]: I1205 22:12:39.188302 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wsbtv"] Dec 05 22:12:39 crc kubenswrapper[4747]: I1205 22:12:39.197485 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wsbtv"] Dec 05 22:12:39 crc kubenswrapper[4747]: I1205 22:12:39.853132 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46c97a11-4052-49e5-9ee6-a11ea6bbf4d0" path="/var/lib/kubelet/pods/46c97a11-4052-49e5-9ee6-a11ea6bbf4d0/volumes" Dec 05 22:13:06 crc kubenswrapper[4747]: I1205 22:13:06.222177 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:13:06 crc kubenswrapper[4747]: I1205 22:13:06.222814 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:13:29 crc kubenswrapper[4747]: I1205 22:13:29.224378 4747 scope.go:117] "RemoveContainer" containerID="881fcc9e74b76ad3f351346266c23852ba30c4bccbe891c838469f0431a10650" Dec 05 22:13:29 crc kubenswrapper[4747]: I1205 22:13:29.256122 4747 scope.go:117] "RemoveContainer" containerID="2c32f3b8108412e754e83dba22532ece02fcd8dde771983f1aee6f529357d771" Dec 05 22:13:29 crc kubenswrapper[4747]: I1205 22:13:29.307002 4747 scope.go:117] "RemoveContainer" containerID="941b2eb52a94b9dde863dddd6ef688781d652e7564516f60e67db23a1e556337" Dec 05 22:13:29 crc kubenswrapper[4747]: I1205 22:13:29.343653 4747 scope.go:117] "RemoveContainer" 
containerID="0e08484e7c1803a70fd8d969f5942b169b5d041efb726c9885cca85f7a798521" Dec 05 22:13:36 crc kubenswrapper[4747]: I1205 22:13:36.222045 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:13:36 crc kubenswrapper[4747]: I1205 22:13:36.222716 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.267390 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6t6lp"] Dec 05 22:13:51 crc kubenswrapper[4747]: E1205 22:13:51.268125 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c97a11-4052-49e5-9ee6-a11ea6bbf4d0" containerName="extract-utilities" Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.268137 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c97a11-4052-49e5-9ee6-a11ea6bbf4d0" containerName="extract-utilities" Dec 05 22:13:51 crc kubenswrapper[4747]: E1205 22:13:51.268152 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c97a11-4052-49e5-9ee6-a11ea6bbf4d0" containerName="registry-server" Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.268157 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c97a11-4052-49e5-9ee6-a11ea6bbf4d0" containerName="registry-server" Dec 05 22:13:51 crc kubenswrapper[4747]: E1205 22:13:51.268183 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c97a11-4052-49e5-9ee6-a11ea6bbf4d0" containerName="extract-content" Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.268190 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c97a11-4052-49e5-9ee6-a11ea6bbf4d0" containerName="extract-content" Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.268334 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c97a11-4052-49e5-9ee6-a11ea6bbf4d0" containerName="registry-server" Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.268845 4747 util.go:30] "No sandbox for pod can be found. 
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.279430 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6t6lp"]
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.331821 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c15daa-ad48-4db6-bf6d-99de93a266f6-operator-scripts\") pod \"barbican-db-create-6t6lp\" (UID: \"a1c15daa-ad48-4db6-bf6d-99de93a266f6\") " pod="openstack/barbican-db-create-6t6lp"
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.332104 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rww69\" (UniqueName: \"kubernetes.io/projected/a1c15daa-ad48-4db6-bf6d-99de93a266f6-kube-api-access-rww69\") pod \"barbican-db-create-6t6lp\" (UID: \"a1c15daa-ad48-4db6-bf6d-99de93a266f6\") " pod="openstack/barbican-db-create-6t6lp"
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.367036 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-f0b7-account-create-update-pjq57"]
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.368091 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f0b7-account-create-update-pjq57"
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.370035 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.377821 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f0b7-account-create-update-pjq57"]
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.433810 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c15daa-ad48-4db6-bf6d-99de93a266f6-operator-scripts\") pod \"barbican-db-create-6t6lp\" (UID: \"a1c15daa-ad48-4db6-bf6d-99de93a266f6\") " pod="openstack/barbican-db-create-6t6lp"
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.433858 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rww69\" (UniqueName: \"kubernetes.io/projected/a1c15daa-ad48-4db6-bf6d-99de93a266f6-kube-api-access-rww69\") pod \"barbican-db-create-6t6lp\" (UID: \"a1c15daa-ad48-4db6-bf6d-99de93a266f6\") " pod="openstack/barbican-db-create-6t6lp"
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.433914 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvr7n\" (UniqueName: \"kubernetes.io/projected/498693e6-74d0-4f8c-bfe7-4c1b1d4167f7-kube-api-access-tvr7n\") pod \"barbican-f0b7-account-create-update-pjq57\" (UID: \"498693e6-74d0-4f8c-bfe7-4c1b1d4167f7\") " pod="openstack/barbican-f0b7-account-create-update-pjq57"
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.433940 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/498693e6-74d0-4f8c-bfe7-4c1b1d4167f7-operator-scripts\") pod \"barbican-f0b7-account-create-update-pjq57\" (UID: \"498693e6-74d0-4f8c-bfe7-4c1b1d4167f7\") " pod="openstack/barbican-f0b7-account-create-update-pjq57"
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.434837 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c15daa-ad48-4db6-bf6d-99de93a266f6-operator-scripts\") pod \"barbican-db-create-6t6lp\" (UID: \"a1c15daa-ad48-4db6-bf6d-99de93a266f6\") " pod="openstack/barbican-db-create-6t6lp"
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.455498 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rww69\" (UniqueName: \"kubernetes.io/projected/a1c15daa-ad48-4db6-bf6d-99de93a266f6-kube-api-access-rww69\") pod \"barbican-db-create-6t6lp\" (UID: \"a1c15daa-ad48-4db6-bf6d-99de93a266f6\") " pod="openstack/barbican-db-create-6t6lp"
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.536035 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvr7n\" (UniqueName: \"kubernetes.io/projected/498693e6-74d0-4f8c-bfe7-4c1b1d4167f7-kube-api-access-tvr7n\") pod \"barbican-f0b7-account-create-update-pjq57\" (UID: \"498693e6-74d0-4f8c-bfe7-4c1b1d4167f7\") " pod="openstack/barbican-f0b7-account-create-update-pjq57"
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.536131 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/498693e6-74d0-4f8c-bfe7-4c1b1d4167f7-operator-scripts\") pod \"barbican-f0b7-account-create-update-pjq57\" (UID: \"498693e6-74d0-4f8c-bfe7-4c1b1d4167f7\") " pod="openstack/barbican-f0b7-account-create-update-pjq57"
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.537216 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/498693e6-74d0-4f8c-bfe7-4c1b1d4167f7-operator-scripts\") pod \"barbican-f0b7-account-create-update-pjq57\" (UID: \"498693e6-74d0-4f8c-bfe7-4c1b1d4167f7\") " pod="openstack/barbican-f0b7-account-create-update-pjq57"
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.553622 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvr7n\" (UniqueName: \"kubernetes.io/projected/498693e6-74d0-4f8c-bfe7-4c1b1d4167f7-kube-api-access-tvr7n\") pod \"barbican-f0b7-account-create-update-pjq57\" (UID: \"498693e6-74d0-4f8c-bfe7-4c1b1d4167f7\") " pod="openstack/barbican-f0b7-account-create-update-pjq57"
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.587302 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6t6lp"
Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.684485 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f0b7-account-create-update-pjq57"
Need to start a new one" pod="openstack/barbican-f0b7-account-create-update-pjq57" Dec 05 22:13:51 crc kubenswrapper[4747]: I1205 22:13:51.938137 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6t6lp"] Dec 05 22:13:52 crc kubenswrapper[4747]: W1205 22:13:52.201404 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod498693e6_74d0_4f8c_bfe7_4c1b1d4167f7.slice/crio-65c90e2557b3fe4df7ad677442a9d7a9b100e96943843001c8ddbe4ed4414745 WatchSource:0}: Error finding container 65c90e2557b3fe4df7ad677442a9d7a9b100e96943843001c8ddbe4ed4414745: Status 404 returned error can't find the container with id 65c90e2557b3fe4df7ad677442a9d7a9b100e96943843001c8ddbe4ed4414745 Dec 05 22:13:52 crc kubenswrapper[4747]: I1205 22:13:52.201650 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f0b7-account-create-update-pjq57"] Dec 05 22:13:52 crc kubenswrapper[4747]: I1205 22:13:52.214761 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6t6lp" event={"ID":"a1c15daa-ad48-4db6-bf6d-99de93a266f6","Type":"ContainerStarted","Data":"3605a048a1629283de8e12903ac60bad0b59b43abe5ccdd63cbd741599ce5ce1"} Dec 05 22:13:52 crc kubenswrapper[4747]: I1205 22:13:52.214805 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6t6lp" event={"ID":"a1c15daa-ad48-4db6-bf6d-99de93a266f6","Type":"ContainerStarted","Data":"d5f2e4c9aa699a80a5289e475e2b69eb244926f23735ee2a1336903e262fb53e"} Dec 05 22:13:53 crc kubenswrapper[4747]: I1205 22:13:53.228615 4747 generic.go:334] "Generic (PLEG): container finished" podID="a1c15daa-ad48-4db6-bf6d-99de93a266f6" containerID="3605a048a1629283de8e12903ac60bad0b59b43abe5ccdd63cbd741599ce5ce1" exitCode=0 Dec 05 22:13:53 crc kubenswrapper[4747]: I1205 22:13:53.228707 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6t6lp" event={"ID":"a1c15daa-ad48-4db6-bf6d-99de93a266f6","Type":"ContainerDied","Data":"3605a048a1629283de8e12903ac60bad0b59b43abe5ccdd63cbd741599ce5ce1"} Dec 05 22:13:53 crc kubenswrapper[4747]: I1205 22:13:53.232381 4747 generic.go:334] "Generic (PLEG): container finished" podID="498693e6-74d0-4f8c-bfe7-4c1b1d4167f7" containerID="7d3a172e63dd2f7ad37bd272427476c676ea65cc7fa822e4ece272edbef8773a" exitCode=0 Dec 05 22:13:53 crc kubenswrapper[4747]: I1205 22:13:53.232429 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f0b7-account-create-update-pjq57" event={"ID":"498693e6-74d0-4f8c-bfe7-4c1b1d4167f7","Type":"ContainerDied","Data":"7d3a172e63dd2f7ad37bd272427476c676ea65cc7fa822e4ece272edbef8773a"} Dec 05 22:13:53 crc kubenswrapper[4747]: I1205 22:13:53.232460 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f0b7-account-create-update-pjq57" event={"ID":"498693e6-74d0-4f8c-bfe7-4c1b1d4167f7","Type":"ContainerStarted","Data":"65c90e2557b3fe4df7ad677442a9d7a9b100e96943843001c8ddbe4ed4414745"} Dec 05 22:13:53 crc kubenswrapper[4747]: I1205 22:13:53.571941 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6t6lp" Dec 05 22:13:53 crc kubenswrapper[4747]: I1205 22:13:53.670744 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rww69\" (UniqueName: \"kubernetes.io/projected/a1c15daa-ad48-4db6-bf6d-99de93a266f6-kube-api-access-rww69\") pod \"a1c15daa-ad48-4db6-bf6d-99de93a266f6\" (UID: \"a1c15daa-ad48-4db6-bf6d-99de93a266f6\") " Dec 05 22:13:53 crc kubenswrapper[4747]: I1205 22:13:53.671006 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c15daa-ad48-4db6-bf6d-99de93a266f6-operator-scripts\") pod \"a1c15daa-ad48-4db6-bf6d-99de93a266f6\" (UID: \"a1c15daa-ad48-4db6-bf6d-99de93a266f6\") " Dec 05 22:13:53 crc kubenswrapper[4747]: I1205 22:13:53.672370 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1c15daa-ad48-4db6-bf6d-99de93a266f6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a1c15daa-ad48-4db6-bf6d-99de93a266f6" (UID: "a1c15daa-ad48-4db6-bf6d-99de93a266f6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:13:53 crc kubenswrapper[4747]: I1205 22:13:53.678381 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1c15daa-ad48-4db6-bf6d-99de93a266f6-kube-api-access-rww69" (OuterVolumeSpecName: "kube-api-access-rww69") pod "a1c15daa-ad48-4db6-bf6d-99de93a266f6" (UID: "a1c15daa-ad48-4db6-bf6d-99de93a266f6"). InnerVolumeSpecName "kube-api-access-rww69". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:13:53 crc kubenswrapper[4747]: I1205 22:13:53.772944 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c15daa-ad48-4db6-bf6d-99de93a266f6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:13:53 crc kubenswrapper[4747]: I1205 22:13:53.773006 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rww69\" (UniqueName: \"kubernetes.io/projected/a1c15daa-ad48-4db6-bf6d-99de93a266f6-kube-api-access-rww69\") on node \"crc\" DevicePath \"\"" Dec 05 22:13:54 crc kubenswrapper[4747]: I1205 22:13:54.245219 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6t6lp" event={"ID":"a1c15daa-ad48-4db6-bf6d-99de93a266f6","Type":"ContainerDied","Data":"d5f2e4c9aa699a80a5289e475e2b69eb244926f23735ee2a1336903e262fb53e"} Dec 05 22:13:54 crc kubenswrapper[4747]: I1205 22:13:54.248200 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5f2e4c9aa699a80a5289e475e2b69eb244926f23735ee2a1336903e262fb53e" Dec 05 22:13:54 crc kubenswrapper[4747]: I1205 22:13:54.245256 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6t6lp" Dec 05 22:13:54 crc kubenswrapper[4747]: I1205 22:13:54.639539 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-f0b7-account-create-update-pjq57" Dec 05 22:13:54 crc kubenswrapper[4747]: I1205 22:13:54.687607 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/498693e6-74d0-4f8c-bfe7-4c1b1d4167f7-operator-scripts\") pod \"498693e6-74d0-4f8c-bfe7-4c1b1d4167f7\" (UID: \"498693e6-74d0-4f8c-bfe7-4c1b1d4167f7\") " Dec 05 22:13:54 crc kubenswrapper[4747]: I1205 22:13:54.687776 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvr7n\" (UniqueName: \"kubernetes.io/projected/498693e6-74d0-4f8c-bfe7-4c1b1d4167f7-kube-api-access-tvr7n\") pod \"498693e6-74d0-4f8c-bfe7-4c1b1d4167f7\" (UID: \"498693e6-74d0-4f8c-bfe7-4c1b1d4167f7\") " Dec 05 22:13:54 crc kubenswrapper[4747]: I1205 22:13:54.688397 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/498693e6-74d0-4f8c-bfe7-4c1b1d4167f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "498693e6-74d0-4f8c-bfe7-4c1b1d4167f7" (UID: "498693e6-74d0-4f8c-bfe7-4c1b1d4167f7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:13:54 crc kubenswrapper[4747]: I1205 22:13:54.688525 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/498693e6-74d0-4f8c-bfe7-4c1b1d4167f7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:13:54 crc kubenswrapper[4747]: I1205 22:13:54.693895 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/498693e6-74d0-4f8c-bfe7-4c1b1d4167f7-kube-api-access-tvr7n" (OuterVolumeSpecName: "kube-api-access-tvr7n") pod "498693e6-74d0-4f8c-bfe7-4c1b1d4167f7" (UID: "498693e6-74d0-4f8c-bfe7-4c1b1d4167f7"). InnerVolumeSpecName "kube-api-access-tvr7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:13:54 crc kubenswrapper[4747]: I1205 22:13:54.789510 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvr7n\" (UniqueName: \"kubernetes.io/projected/498693e6-74d0-4f8c-bfe7-4c1b1d4167f7-kube-api-access-tvr7n\") on node \"crc\" DevicePath \"\"" Dec 05 22:13:55 crc kubenswrapper[4747]: I1205 22:13:55.254640 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f0b7-account-create-update-pjq57" event={"ID":"498693e6-74d0-4f8c-bfe7-4c1b1d4167f7","Type":"ContainerDied","Data":"65c90e2557b3fe4df7ad677442a9d7a9b100e96943843001c8ddbe4ed4414745"} Dec 05 22:13:55 crc kubenswrapper[4747]: I1205 22:13:55.254966 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65c90e2557b3fe4df7ad677442a9d7a9b100e96943843001c8ddbe4ed4414745" Dec 05 22:13:55 crc kubenswrapper[4747]: I1205 22:13:55.254699 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-f0b7-account-create-update-pjq57" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.694869 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-6crtc"] Dec 05 22:13:56 crc kubenswrapper[4747]: E1205 22:13:56.695258 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498693e6-74d0-4f8c-bfe7-4c1b1d4167f7" containerName="mariadb-account-create-update" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.695271 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="498693e6-74d0-4f8c-bfe7-4c1b1d4167f7" containerName="mariadb-account-create-update" Dec 05 22:13:56 crc kubenswrapper[4747]: E1205 22:13:56.695295 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c15daa-ad48-4db6-bf6d-99de93a266f6" containerName="mariadb-database-create" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.695303 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c15daa-ad48-4db6-bf6d-99de93a266f6" containerName="mariadb-database-create" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.695480 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="498693e6-74d0-4f8c-bfe7-4c1b1d4167f7" containerName="mariadb-account-create-update" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.695498 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1c15daa-ad48-4db6-bf6d-99de93a266f6" containerName="mariadb-database-create" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.696172 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6crtc" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.699153 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.703810 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-mfdm6" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.708355 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6crtc"] Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.830165 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/81e875d4-1d9d-4245-90fc-900eae487b29-db-sync-config-data\") pod \"barbican-db-sync-6crtc\" (UID: \"81e875d4-1d9d-4245-90fc-900eae487b29\") " pod="openstack/barbican-db-sync-6crtc" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.830217 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e875d4-1d9d-4245-90fc-900eae487b29-combined-ca-bundle\") pod \"barbican-db-sync-6crtc\" (UID: \"81e875d4-1d9d-4245-90fc-900eae487b29\") " pod="openstack/barbican-db-sync-6crtc" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.830236 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vztwj\" (UniqueName: \"kubernetes.io/projected/81e875d4-1d9d-4245-90fc-900eae487b29-kube-api-access-vztwj\") pod \"barbican-db-sync-6crtc\" (UID: \"81e875d4-1d9d-4245-90fc-900eae487b29\") " pod="openstack/barbican-db-sync-6crtc" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.932064 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/81e875d4-1d9d-4245-90fc-900eae487b29-db-sync-config-data\") pod \"barbican-db-sync-6crtc\" (UID: \"81e875d4-1d9d-4245-90fc-900eae487b29\") " pod="openstack/barbican-db-sync-6crtc" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.932151 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e875d4-1d9d-4245-90fc-900eae487b29-combined-ca-bundle\") pod \"barbican-db-sync-6crtc\" (UID: \"81e875d4-1d9d-4245-90fc-900eae487b29\") " pod="openstack/barbican-db-sync-6crtc" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.932182 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vztwj\" (UniqueName: \"kubernetes.io/projected/81e875d4-1d9d-4245-90fc-900eae487b29-kube-api-access-vztwj\") pod \"barbican-db-sync-6crtc\" (UID: \"81e875d4-1d9d-4245-90fc-900eae487b29\") " pod="openstack/barbican-db-sync-6crtc" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.944120 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/81e875d4-1d9d-4245-90fc-900eae487b29-db-sync-config-data\") pod \"barbican-db-sync-6crtc\" (UID: \"81e875d4-1d9d-4245-90fc-900eae487b29\") " pod="openstack/barbican-db-sync-6crtc" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.947903 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e875d4-1d9d-4245-90fc-900eae487b29-combined-ca-bundle\") pod \"barbican-db-sync-6crtc\" (UID: \"81e875d4-1d9d-4245-90fc-900eae487b29\") " pod="openstack/barbican-db-sync-6crtc" Dec 05 22:13:56 crc kubenswrapper[4747]: I1205 22:13:56.948316 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vztwj\" (UniqueName: \"kubernetes.io/projected/81e875d4-1d9d-4245-90fc-900eae487b29-kube-api-access-vztwj\") pod \"barbican-db-sync-6crtc\" (UID: \"81e875d4-1d9d-4245-90fc-900eae487b29\") " pod="openstack/barbican-db-sync-6crtc" Dec 05 22:13:57 crc kubenswrapper[4747]: I1205 22:13:57.027738 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6crtc" Dec 05 22:13:57 crc kubenswrapper[4747]: I1205 22:13:57.471904 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6crtc"] Dec 05 22:13:58 crc kubenswrapper[4747]: I1205 22:13:58.283936 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6crtc" event={"ID":"81e875d4-1d9d-4245-90fc-900eae487b29","Type":"ContainerStarted","Data":"d54c581b75a7ed90484a5bda424c12287dea94aad342b043bbcedbbc2807a614"} Dec 05 22:13:58 crc kubenswrapper[4747]: I1205 22:13:58.283989 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6crtc" event={"ID":"81e875d4-1d9d-4245-90fc-900eae487b29","Type":"ContainerStarted","Data":"4d21cca9bd17dd48067e1bdc074a46026d5ead4391f3ca36e10727294fb1091c"} Dec 05 22:13:58 crc kubenswrapper[4747]: I1205 22:13:58.310655 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-6crtc" podStartSLOduration=2.310631483 podStartE2EDuration="2.310631483s" podCreationTimestamp="2025-12-05 22:13:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:13:58.297756332 +0000 UTC m=+5508.765063880" watchObservedRunningTime="2025-12-05 22:13:58.310631483 +0000 UTC m=+5508.777939001" Dec 05 22:14:00 crc kubenswrapper[4747]: I1205 22:14:00.303935 4747 generic.go:334] "Generic (PLEG): container finished" podID="81e875d4-1d9d-4245-90fc-900eae487b29" containerID="d54c581b75a7ed90484a5bda424c12287dea94aad342b043bbcedbbc2807a614" exitCode=0 Dec 05 22:14:00 crc kubenswrapper[4747]: I1205 22:14:00.303973 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6crtc" event={"ID":"81e875d4-1d9d-4245-90fc-900eae487b29","Type":"ContainerDied","Data":"d54c581b75a7ed90484a5bda424c12287dea94aad342b043bbcedbbc2807a614"} Dec 05 22:14:01 crc kubenswrapper[4747]: I1205 22:14:01.684155 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6crtc" Dec 05 22:14:01 crc kubenswrapper[4747]: I1205 22:14:01.735922 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e875d4-1d9d-4245-90fc-900eae487b29-combined-ca-bundle\") pod \"81e875d4-1d9d-4245-90fc-900eae487b29\" (UID: \"81e875d4-1d9d-4245-90fc-900eae487b29\") " Dec 05 22:14:01 crc kubenswrapper[4747]: I1205 22:14:01.736067 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/81e875d4-1d9d-4245-90fc-900eae487b29-db-sync-config-data\") pod \"81e875d4-1d9d-4245-90fc-900eae487b29\" (UID: \"81e875d4-1d9d-4245-90fc-900eae487b29\") " Dec 05 22:14:01 crc kubenswrapper[4747]: I1205 22:14:01.736163 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vztwj\" (UniqueName: \"kubernetes.io/projected/81e875d4-1d9d-4245-90fc-900eae487b29-kube-api-access-vztwj\") pod \"81e875d4-1d9d-4245-90fc-900eae487b29\" (UID: \"81e875d4-1d9d-4245-90fc-900eae487b29\") " Dec 05 22:14:01 crc kubenswrapper[4747]: I1205 22:14:01.742543 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e875d4-1d9d-4245-90fc-900eae487b29-kube-api-access-vztwj" (OuterVolumeSpecName: "kube-api-access-vztwj") pod "81e875d4-1d9d-4245-90fc-900eae487b29" (UID: "81e875d4-1d9d-4245-90fc-900eae487b29"). InnerVolumeSpecName "kube-api-access-vztwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:14:01 crc kubenswrapper[4747]: I1205 22:14:01.744826 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e875d4-1d9d-4245-90fc-900eae487b29-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "81e875d4-1d9d-4245-90fc-900eae487b29" (UID: "81e875d4-1d9d-4245-90fc-900eae487b29"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:14:01 crc kubenswrapper[4747]: I1205 22:14:01.759024 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e875d4-1d9d-4245-90fc-900eae487b29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81e875d4-1d9d-4245-90fc-900eae487b29" (UID: "81e875d4-1d9d-4245-90fc-900eae487b29"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:14:01 crc kubenswrapper[4747]: I1205 22:14:01.837786 4747 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/81e875d4-1d9d-4245-90fc-900eae487b29-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:01 crc kubenswrapper[4747]: I1205 22:14:01.837824 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vztwj\" (UniqueName: \"kubernetes.io/projected/81e875d4-1d9d-4245-90fc-900eae487b29-kube-api-access-vztwj\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:01 crc kubenswrapper[4747]: I1205 22:14:01.837834 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e875d4-1d9d-4245-90fc-900eae487b29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.330257 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6crtc" event={"ID":"81e875d4-1d9d-4245-90fc-900eae487b29","Type":"ContainerDied","Data":"4d21cca9bd17dd48067e1bdc074a46026d5ead4391f3ca36e10727294fb1091c"} Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.330322 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d21cca9bd17dd48067e1bdc074a46026d5ead4391f3ca36e10727294fb1091c" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.330337 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6crtc" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.613315 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-849c4659df-skj5z"] Dec 05 22:14:02 crc kubenswrapper[4747]: E1205 22:14:02.613923 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e875d4-1d9d-4245-90fc-900eae487b29" containerName="barbican-db-sync" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.613944 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e875d4-1d9d-4245-90fc-900eae487b29" containerName="barbican-db-sync" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.614165 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e875d4-1d9d-4245-90fc-900eae487b29" containerName="barbican-db-sync" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.615164 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.622534 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.625593 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.625806 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-mfdm6" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.635198 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-849c4659df-skj5z"] Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.653410 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a905f2c0-0f3d-41ed-b796-caff00a1a314-config-data\") pod \"barbican-worker-849c4659df-skj5z\" (UID: \"a905f2c0-0f3d-41ed-b796-caff00a1a314\") " pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.653459 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s86j5\" (UniqueName: \"kubernetes.io/projected/a905f2c0-0f3d-41ed-b796-caff00a1a314-kube-api-access-s86j5\") pod \"barbican-worker-849c4659df-skj5z\" (UID: \"a905f2c0-0f3d-41ed-b796-caff00a1a314\") " pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.653489 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a905f2c0-0f3d-41ed-b796-caff00a1a314-logs\") pod \"barbican-worker-849c4659df-skj5z\" (UID: \"a905f2c0-0f3d-41ed-b796-caff00a1a314\") " pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.653575 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a905f2c0-0f3d-41ed-b796-caff00a1a314-config-data-custom\") pod \"barbican-worker-849c4659df-skj5z\" (UID: \"a905f2c0-0f3d-41ed-b796-caff00a1a314\") " pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.656312 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a905f2c0-0f3d-41ed-b796-caff00a1a314-combined-ca-bundle\") pod \"barbican-worker-849c4659df-skj5z\" (UID: \"a905f2c0-0f3d-41ed-b796-caff00a1a314\") " pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.702821 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv"] Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.771061 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.771217 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a905f2c0-0f3d-41ed-b796-caff00a1a314-config-data-custom\") pod \"barbican-worker-849c4659df-skj5z\" (UID: \"a905f2c0-0f3d-41ed-b796-caff00a1a314\") " pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.771263 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a905f2c0-0f3d-41ed-b796-caff00a1a314-combined-ca-bundle\") pod \"barbican-worker-849c4659df-skj5z\" (UID: \"a905f2c0-0f3d-41ed-b796-caff00a1a314\") " pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.771349 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a905f2c0-0f3d-41ed-b796-caff00a1a314-config-data\") pod \"barbican-worker-849c4659df-skj5z\" (UID: \"a905f2c0-0f3d-41ed-b796-caff00a1a314\") " pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.771365 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s86j5\" (UniqueName: \"kubernetes.io/projected/a905f2c0-0f3d-41ed-b796-caff00a1a314-kube-api-access-s86j5\") pod \"barbican-worker-849c4659df-skj5z\" (UID: \"a905f2c0-0f3d-41ed-b796-caff00a1a314\") " pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.771384 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a905f2c0-0f3d-41ed-b796-caff00a1a314-logs\") pod \"barbican-worker-849c4659df-skj5z\" (UID: \"a905f2c0-0f3d-41ed-b796-caff00a1a314\") " pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.772925 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a905f2c0-0f3d-41ed-b796-caff00a1a314-logs\") pod \"barbican-worker-849c4659df-skj5z\" (UID: \"a905f2c0-0f3d-41ed-b796-caff00a1a314\") " pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.773865 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.775461 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv"] Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.782266 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a905f2c0-0f3d-41ed-b796-caff00a1a314-combined-ca-bundle\") pod \"barbican-worker-849c4659df-skj5z\" (UID: \"a905f2c0-0f3d-41ed-b796-caff00a1a314\") " pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.784675 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a905f2c0-0f3d-41ed-b796-caff00a1a314-config-data-custom\") pod \"barbican-worker-849c4659df-skj5z\" (UID: \"a905f2c0-0f3d-41ed-b796-caff00a1a314\") " 
pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.785772 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a905f2c0-0f3d-41ed-b796-caff00a1a314-config-data\") pod \"barbican-worker-849c4659df-skj5z\" (UID: \"a905f2c0-0f3d-41ed-b796-caff00a1a314\") " pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.804850 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s86j5\" (UniqueName: \"kubernetes.io/projected/a905f2c0-0f3d-41ed-b796-caff00a1a314-kube-api-access-s86j5\") pod \"barbican-worker-849c4659df-skj5z\" (UID: \"a905f2c0-0f3d-41ed-b796-caff00a1a314\") " pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.828194 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj"] Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.829682 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.839493 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj"] Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.858612 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-548c587db8-8qnvf"] Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.860076 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.862284 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.867658 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-548c587db8-8qnvf"] Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.872438 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-dns-svc\") pod \"dnsmasq-dns-6f5bdf6dc5-fmqpj\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.872741 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88k98\" (UniqueName: \"kubernetes.io/projected/93fb10f2-a4d6-44c7-b14a-7fd3275f9f87-kube-api-access-88k98\") pod \"barbican-keystone-listener-6b6cffb4d8-fhwxv\" (UID: \"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87\") " pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.875887 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-combined-ca-bundle\") pod \"barbican-api-548c587db8-8qnvf\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.876008 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-config-data\") pod 
\"barbican-api-548c587db8-8qnvf\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.876109 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93fb10f2-a4d6-44c7-b14a-7fd3275f9f87-config-data-custom\") pod \"barbican-keystone-listener-6b6cffb4d8-fhwxv\" (UID: \"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87\") " pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.876184 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93fb10f2-a4d6-44c7-b14a-7fd3275f9f87-logs\") pod \"barbican-keystone-listener-6b6cffb4d8-fhwxv\" (UID: \"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87\") " pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.876256 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fb10f2-a4d6-44c7-b14a-7fd3275f9f87-combined-ca-bundle\") pod \"barbican-keystone-listener-6b6cffb4d8-fhwxv\" (UID: \"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87\") " pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.876324 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fb10f2-a4d6-44c7-b14a-7fd3275f9f87-config-data\") pod \"barbican-keystone-listener-6b6cffb4d8-fhwxv\" (UID: \"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87\") " pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.876391 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29lzs\" (UniqueName: \"kubernetes.io/projected/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-kube-api-access-29lzs\") pod \"barbican-api-548c587db8-8qnvf\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.876546 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-ovsdbserver-sb\") pod \"dnsmasq-dns-6f5bdf6dc5-fmqpj\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.876665 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-ovsdbserver-nb\") pod \"dnsmasq-dns-6f5bdf6dc5-fmqpj\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.876740 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-logs\") pod \"barbican-api-548c587db8-8qnvf\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 
22:14:02.876814 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp2rp\" (UniqueName: \"kubernetes.io/projected/cced490b-938d-45dd-a1fb-3e4f05c0bc83-kube-api-access-qp2rp\") pod \"dnsmasq-dns-6f5bdf6dc5-fmqpj\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.876969 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-config-data-custom\") pod \"barbican-api-548c587db8-8qnvf\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.877228 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-config\") pod \"dnsmasq-dns-6f5bdf6dc5-fmqpj\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.943984 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-849c4659df-skj5z" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.983638 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-config-data-custom\") pod \"barbican-api-548c587db8-8qnvf\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.983708 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-config\") pod \"dnsmasq-dns-6f5bdf6dc5-fmqpj\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.983738 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-dns-svc\") pod \"dnsmasq-dns-6f5bdf6dc5-fmqpj\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.983784 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88k98\" (UniqueName: \"kubernetes.io/projected/93fb10f2-a4d6-44c7-b14a-7fd3275f9f87-kube-api-access-88k98\") pod \"barbican-keystone-listener-6b6cffb4d8-fhwxv\" (UID: \"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87\") " pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.983818 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-combined-ca-bundle\") pod \"barbican-api-548c587db8-8qnvf\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.983852 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-config-data\") pod \"barbican-api-548c587db8-8qnvf\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.983884 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93fb10f2-a4d6-44c7-b14a-7fd3275f9f87-config-data-custom\") pod \"barbican-keystone-listener-6b6cffb4d8-fhwxv\" (UID: \"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87\") " pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.983905 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93fb10f2-a4d6-44c7-b14a-7fd3275f9f87-logs\") pod \"barbican-keystone-listener-6b6cffb4d8-fhwxv\" (UID: \"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87\") " pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.983927 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fb10f2-a4d6-44c7-b14a-7fd3275f9f87-combined-ca-bundle\") pod \"barbican-keystone-listener-6b6cffb4d8-fhwxv\" (UID: \"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87\") " pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.983947 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fb10f2-a4d6-44c7-b14a-7fd3275f9f87-config-data\") pod \"barbican-keystone-listener-6b6cffb4d8-fhwxv\" (UID: \"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87\") " pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.983967 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29lzs\" (UniqueName: \"kubernetes.io/projected/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-kube-api-access-29lzs\") pod \"barbican-api-548c587db8-8qnvf\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.984008 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-ovsdbserver-sb\") pod \"dnsmasq-dns-6f5bdf6dc5-fmqpj\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.984040 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-logs\") pod \"barbican-api-548c587db8-8qnvf\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.984063 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-ovsdbserver-nb\") pod \"dnsmasq-dns-6f5bdf6dc5-fmqpj\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.984088 4747 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-qp2rp\" (UniqueName: \"kubernetes.io/projected/cced490b-938d-45dd-a1fb-3e4f05c0bc83-kube-api-access-qp2rp\") pod \"dnsmasq-dns-6f5bdf6dc5-fmqpj\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.984875 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93fb10f2-a4d6-44c7-b14a-7fd3275f9f87-logs\") pod \"barbican-keystone-listener-6b6cffb4d8-fhwxv\" (UID: \"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87\") " pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.985611 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-dns-svc\") pod \"dnsmasq-dns-6f5bdf6dc5-fmqpj\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.990515 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-ovsdbserver-sb\") pod \"dnsmasq-dns-6f5bdf6dc5-fmqpj\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.990906 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-logs\") pod \"barbican-api-548c587db8-8qnvf\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.991442 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-ovsdbserver-nb\") pod \"dnsmasq-dns-6f5bdf6dc5-fmqpj\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.995039 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-config\") pod \"dnsmasq-dns-6f5bdf6dc5-fmqpj\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.998125 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fb10f2-a4d6-44c7-b14a-7fd3275f9f87-combined-ca-bundle\") pod \"barbican-keystone-listener-6b6cffb4d8-fhwxv\" (UID: \"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87\") " pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.998370 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-config-data-custom\") pod \"barbican-api-548c587db8-8qnvf\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:02 crc kubenswrapper[4747]: I1205 22:14:02.999312 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/93fb10f2-a4d6-44c7-b14a-7fd3275f9f87-config-data-custom\") pod \"barbican-keystone-listener-6b6cffb4d8-fhwxv\" (UID: \"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87\") " pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:03 crc kubenswrapper[4747]: I1205 22:14:03.002990 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-config-data\") pod \"barbican-api-548c587db8-8qnvf\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:03 crc kubenswrapper[4747]: I1205 22:14:03.005657 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fb10f2-a4d6-44c7-b14a-7fd3275f9f87-config-data\") pod \"barbican-keystone-listener-6b6cffb4d8-fhwxv\" (UID: \"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87\") " pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:03 crc kubenswrapper[4747]: I1205 22:14:03.006241 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-combined-ca-bundle\") pod \"barbican-api-548c587db8-8qnvf\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:03 crc kubenswrapper[4747]: I1205 22:14:03.016942 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp2rp\" (UniqueName: \"kubernetes.io/projected/cced490b-938d-45dd-a1fb-3e4f05c0bc83-kube-api-access-qp2rp\") pod \"dnsmasq-dns-6f5bdf6dc5-fmqpj\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:03 crc kubenswrapper[4747]: I1205 22:14:03.030748 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29lzs\" (UniqueName: \"kubernetes.io/projected/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-kube-api-access-29lzs\") pod \"barbican-api-548c587db8-8qnvf\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:03 crc kubenswrapper[4747]: I1205 22:14:03.038261 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88k98\" (UniqueName: \"kubernetes.io/projected/93fb10f2-a4d6-44c7-b14a-7fd3275f9f87-kube-api-access-88k98\") pod \"barbican-keystone-listener-6b6cffb4d8-fhwxv\" (UID: \"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87\") " pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:03 crc kubenswrapper[4747]: I1205 22:14:03.196721 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" Dec 05 22:14:03 crc kubenswrapper[4747]: I1205 22:14:03.210306 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:03 crc kubenswrapper[4747]: I1205 22:14:03.239147 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:03 crc kubenswrapper[4747]: I1205 22:14:03.579284 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-849c4659df-skj5z"] Dec 05 22:14:03 crc kubenswrapper[4747]: I1205 22:14:03.695905 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj"] Dec 05 22:14:03 crc kubenswrapper[4747]: W1205 22:14:03.699185 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcced490b_938d_45dd_a1fb_3e4f05c0bc83.slice/crio-abc2289277d4a3ca5d61bb0a3a48f9fdee9cd4c7fabea08573270ed2f3bf6c0d WatchSource:0}: Error finding container abc2289277d4a3ca5d61bb0a3a48f9fdee9cd4c7fabea08573270ed2f3bf6c0d: Status 404 returned error can't find the container with id abc2289277d4a3ca5d61bb0a3a48f9fdee9cd4c7fabea08573270ed2f3bf6c0d Dec 05 22:14:03 crc kubenswrapper[4747]: I1205 22:14:03.792145 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv"] Dec 05 22:14:03 crc kubenswrapper[4747]: I1205 22:14:03.810607 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-548c587db8-8qnvf"] Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.354970 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-849c4659df-skj5z" event={"ID":"a905f2c0-0f3d-41ed-b796-caff00a1a314","Type":"ContainerStarted","Data":"1587eddae7fffe50ad587a70bbcf162e252f9665baff8c40d2e6376028ca0067"} Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.355020 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-849c4659df-skj5z" event={"ID":"a905f2c0-0f3d-41ed-b796-caff00a1a314","Type":"ContainerStarted","Data":"da63e58acad104b18a4cadf14fb4199f6036fc4381d7a63dc1084330062a59b2"} Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.355034 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-849c4659df-skj5z" event={"ID":"a905f2c0-0f3d-41ed-b796-caff00a1a314","Type":"ContainerStarted","Data":"ae747f59c4b997b3df052438c21e6f02e2fc2f161a163b4620526b8fed154286"} Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.358158 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-548c587db8-8qnvf" event={"ID":"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9","Type":"ContainerStarted","Data":"aab2f19202a8ddf069010b41e0a85b3f339b295c5aa90862c8cdb5dde4caa6e2"} Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.358206 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-548c587db8-8qnvf" event={"ID":"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9","Type":"ContainerStarted","Data":"cc07ffae6776d676387318ac14f879962f03ddcbc52727166f5ada2d79685c67"} Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.358218 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-548c587db8-8qnvf" event={"ID":"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9","Type":"ContainerStarted","Data":"a06c24faa79ab67b3b785131595bf32bf8a4000f41a7316646d6c7485b1c5d59"} Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.358362 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.358465 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-548c587db8-8qnvf" Dec 
05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.364406 4747 generic.go:334] "Generic (PLEG): container finished" podID="cced490b-938d-45dd-a1fb-3e4f05c0bc83" containerID="72f01a8ac3607ccbfef0762575b1e5d91a50cf7714be2957c57483989947a0d0" exitCode=0 Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.364484 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" event={"ID":"cced490b-938d-45dd-a1fb-3e4f05c0bc83","Type":"ContainerDied","Data":"72f01a8ac3607ccbfef0762575b1e5d91a50cf7714be2957c57483989947a0d0"} Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.364516 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" event={"ID":"cced490b-938d-45dd-a1fb-3e4f05c0bc83","Type":"ContainerStarted","Data":"abc2289277d4a3ca5d61bb0a3a48f9fdee9cd4c7fabea08573270ed2f3bf6c0d"} Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.385439 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-849c4659df-skj5z" podStartSLOduration=2.385421961 podStartE2EDuration="2.385421961s" podCreationTimestamp="2025-12-05 22:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:14:04.378950469 +0000 UTC m=+5514.846257967" watchObservedRunningTime="2025-12-05 22:14:04.385421961 +0000 UTC m=+5514.852729449" Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.394602 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" event={"ID":"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87","Type":"ContainerStarted","Data":"880d9179ecc7bfc790e2269ff4e69a4467201c7d2da09babca7115310eba2a94"} Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.394642 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" event={"ID":"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87","Type":"ContainerStarted","Data":"f6746fcd4f898c17159f82a68bf582fab75117bb5a4432fe661b372db622500b"} Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.394652 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" event={"ID":"93fb10f2-a4d6-44c7-b14a-7fd3275f9f87","Type":"ContainerStarted","Data":"f298c072650da6e3e0c31ce103c114e9910e5b073df9dd86075f2647bcee9143"} Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.481472 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-548c587db8-8qnvf" podStartSLOduration=2.481453519 podStartE2EDuration="2.481453519s" podCreationTimestamp="2025-12-05 22:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:14:04.476375962 +0000 UTC m=+5514.943683470" watchObservedRunningTime="2025-12-05 22:14:04.481453519 +0000 UTC m=+5514.948761007" Dec 05 22:14:04 crc kubenswrapper[4747]: I1205 22:14:04.508727 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6b6cffb4d8-fhwxv" podStartSLOduration=2.5087040800000002 podStartE2EDuration="2.50870408s" podCreationTimestamp="2025-12-05 22:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:14:04.498995897 +0000 UTC m=+5514.966303405" 
watchObservedRunningTime="2025-12-05 22:14:04.50870408 +0000 UTC m=+5514.976011568" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.077757 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b8cc96f6-lbl9z"] Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.080076 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.082498 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.082775 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.097730 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b8cc96f6-lbl9z"] Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.149417 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-config-data\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.149569 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-combined-ca-bundle\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.149665 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-internal-tls-certs\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.149689 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-config-data-custom\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.149863 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb5xg\" (UniqueName: \"kubernetes.io/projected/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-kube-api-access-pb5xg\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.149904 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-public-tls-certs\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.149975 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-logs\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.251531 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb5xg\" (UniqueName: \"kubernetes.io/projected/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-kube-api-access-pb5xg\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.251834 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-public-tls-certs\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.251872 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-logs\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.251911 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-config-data\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.251945 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-combined-ca-bundle\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.251967 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-internal-tls-certs\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.251984 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-config-data-custom\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.252671 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-logs\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.257744 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-config-data-custom\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.258403 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-config-data\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.263173 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-internal-tls-certs\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.268822 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-public-tls-certs\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.270905 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb5xg\" (UniqueName: \"kubernetes.io/projected/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-kube-api-access-pb5xg\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.272018 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db7fceb8-9ede-4165-88c0-beaa3f74c0b2-combined-ca-bundle\") pod \"barbican-api-7b8cc96f6-lbl9z\" (UID: \"db7fceb8-9ede-4165-88c0-beaa3f74c0b2\") " pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.403658 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" event={"ID":"cced490b-938d-45dd-a1fb-3e4f05c0bc83","Type":"ContainerStarted","Data":"53828885502e82f68e133a6834df84c912a0c686e9fcffd89679f461f70198a6"} Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.404451 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.417626 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.427359 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" podStartSLOduration=3.427343685 podStartE2EDuration="3.427343685s" podCreationTimestamp="2025-12-05 22:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:14:05.423922379 +0000 UTC m=+5515.891229867" watchObservedRunningTime="2025-12-05 22:14:05.427343685 +0000 UTC m=+5515.894651173" Dec 05 22:14:05 crc kubenswrapper[4747]: I1205 22:14:05.903626 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b8cc96f6-lbl9z"] Dec 05 22:14:06 crc kubenswrapper[4747]: I1205 22:14:06.222187 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:14:06 crc kubenswrapper[4747]: I1205 22:14:06.222272 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:14:06 crc kubenswrapper[4747]: I1205 22:14:06.222324 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 22:14:06 crc kubenswrapper[4747]: I1205 22:14:06.223067 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8d02af375e89c74f19ccf61ae35fee2eef75c6582bfac65f00c356eb657c1f9d"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 22:14:06 crc kubenswrapper[4747]: I1205 22:14:06.223129 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://8d02af375e89c74f19ccf61ae35fee2eef75c6582bfac65f00c356eb657c1f9d" gracePeriod=600 Dec 05 22:14:06 crc kubenswrapper[4747]: I1205 22:14:06.420535 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b8cc96f6-lbl9z" event={"ID":"db7fceb8-9ede-4165-88c0-beaa3f74c0b2","Type":"ContainerStarted","Data":"4b2083613de470df9b7424682d3b86781ea0c9afa6ddac081a45254a8a16168f"} Dec 05 22:14:06 crc kubenswrapper[4747]: I1205 22:14:06.420945 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b8cc96f6-lbl9z" event={"ID":"db7fceb8-9ede-4165-88c0-beaa3f74c0b2","Type":"ContainerStarted","Data":"4a55ad360c33b6ed2dafaa631a024bf8681e432bf0a87d7d5fd0627d8e2fcbe8"} Dec 05 22:14:06 crc kubenswrapper[4747]: I1205 22:14:06.420963 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b8cc96f6-lbl9z" event={"ID":"db7fceb8-9ede-4165-88c0-beaa3f74c0b2","Type":"ContainerStarted","Data":"8734c2be9324e39c9335bb0d42d1e70f9625e7e3a41565cdb7f8f822d7839bce"} Dec 05 22:14:06 crc 
kubenswrapper[4747]: I1205 22:14:06.421021 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:06 crc kubenswrapper[4747]: I1205 22:14:06.421045 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:06 crc kubenswrapper[4747]: I1205 22:14:06.424505 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="8d02af375e89c74f19ccf61ae35fee2eef75c6582bfac65f00c356eb657c1f9d" exitCode=0 Dec 05 22:14:06 crc kubenswrapper[4747]: I1205 22:14:06.424655 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"8d02af375e89c74f19ccf61ae35fee2eef75c6582bfac65f00c356eb657c1f9d"} Dec 05 22:14:06 crc kubenswrapper[4747]: I1205 22:14:06.424712 4747 scope.go:117] "RemoveContainer" containerID="c260720386572090d72dd1fd3fc3052cc3ef45063af152d585d30a8c2c774804" Dec 05 22:14:06 crc kubenswrapper[4747]: I1205 22:14:06.440839 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b8cc96f6-lbl9z" podStartSLOduration=1.440819887 podStartE2EDuration="1.440819887s" podCreationTimestamp="2025-12-05 22:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:14:06.440096239 +0000 UTC m=+5516.907403727" watchObservedRunningTime="2025-12-05 22:14:06.440819887 +0000 UTC m=+5516.908127385" Dec 05 22:14:07 crc kubenswrapper[4747]: I1205 22:14:07.434674 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001"} Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.212772 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.299052 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-756477f68c-vr5gt"] Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.299284 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-756477f68c-vr5gt" podUID="87950a32-a688-49e7-ab86-7057f156e114" containerName="dnsmasq-dns" containerID="cri-o://0043b254c9707faa81363b30e91989db2c3b151d82a61880d5fec69870e71294" gracePeriod=10 Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.493439 4747 generic.go:334] "Generic (PLEG): container finished" podID="87950a32-a688-49e7-ab86-7057f156e114" containerID="0043b254c9707faa81363b30e91989db2c3b151d82a61880d5fec69870e71294" exitCode=0 Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.493779 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-756477f68c-vr5gt" event={"ID":"87950a32-a688-49e7-ab86-7057f156e114","Type":"ContainerDied","Data":"0043b254c9707faa81363b30e91989db2c3b151d82a61880d5fec69870e71294"} Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.772348 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.808560 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-ovsdbserver-sb\") pod \"87950a32-a688-49e7-ab86-7057f156e114\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.808894 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjtfv\" (UniqueName: \"kubernetes.io/projected/87950a32-a688-49e7-ab86-7057f156e114-kube-api-access-gjtfv\") pod \"87950a32-a688-49e7-ab86-7057f156e114\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.809039 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-dns-svc\") pod \"87950a32-a688-49e7-ab86-7057f156e114\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.809175 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-config\") pod \"87950a32-a688-49e7-ab86-7057f156e114\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.809321 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-ovsdbserver-nb\") pod \"87950a32-a688-49e7-ab86-7057f156e114\" (UID: \"87950a32-a688-49e7-ab86-7057f156e114\") " Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.842416 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87950a32-a688-49e7-ab86-7057f156e114-kube-api-access-gjtfv" (OuterVolumeSpecName: "kube-api-access-gjtfv") pod "87950a32-a688-49e7-ab86-7057f156e114" (UID: "87950a32-a688-49e7-ab86-7057f156e114"). InnerVolumeSpecName "kube-api-access-gjtfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.893922 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "87950a32-a688-49e7-ab86-7057f156e114" (UID: "87950a32-a688-49e7-ab86-7057f156e114"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.901861 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87950a32-a688-49e7-ab86-7057f156e114" (UID: "87950a32-a688-49e7-ab86-7057f156e114"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.911261 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.911294 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.911307 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjtfv\" (UniqueName: \"kubernetes.io/projected/87950a32-a688-49e7-ab86-7057f156e114-kube-api-access-gjtfv\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.913623 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-config" (OuterVolumeSpecName: "config") pod "87950a32-a688-49e7-ab86-7057f156e114" (UID: "87950a32-a688-49e7-ab86-7057f156e114"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:14:13 crc kubenswrapper[4747]: I1205 22:14:13.914115 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87950a32-a688-49e7-ab86-7057f156e114" (UID: "87950a32-a688-49e7-ab86-7057f156e114"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:14:14 crc kubenswrapper[4747]: I1205 22:14:14.012577 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:14 crc kubenswrapper[4747]: I1205 22:14:14.012629 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87950a32-a688-49e7-ab86-7057f156e114-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:14 crc kubenswrapper[4747]: I1205 22:14:14.505283 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-756477f68c-vr5gt" event={"ID":"87950a32-a688-49e7-ab86-7057f156e114","Type":"ContainerDied","Data":"034692be94372df2a43a288de1d4ecf3d89b345f9e90ebeaf07d42182210327e"} Dec 05 22:14:14 crc kubenswrapper[4747]: I1205 22:14:14.505630 4747 scope.go:117] "RemoveContainer" containerID="0043b254c9707faa81363b30e91989db2c3b151d82a61880d5fec69870e71294" Dec 05 22:14:14 crc kubenswrapper[4747]: I1205 22:14:14.505441 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-756477f68c-vr5gt" Dec 05 22:14:14 crc kubenswrapper[4747]: I1205 22:14:14.527454 4747 scope.go:117] "RemoveContainer" containerID="6f0585d4ef7cf485c90752abeceddf0155acef00ddba985c84b90719e504ffb6" Dec 05 22:14:14 crc kubenswrapper[4747]: I1205 22:14:14.558097 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-756477f68c-vr5gt"] Dec 05 22:14:14 crc kubenswrapper[4747]: I1205 22:14:14.568068 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-756477f68c-vr5gt"] Dec 05 22:14:14 crc kubenswrapper[4747]: I1205 22:14:14.646640 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:14 crc kubenswrapper[4747]: I1205 22:14:14.870497 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:15 crc kubenswrapper[4747]: I1205 22:14:15.851431 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87950a32-a688-49e7-ab86-7057f156e114" path="/var/lib/kubelet/pods/87950a32-a688-49e7-ab86-7057f156e114/volumes" Dec 05 22:14:16 crc kubenswrapper[4747]: I1205 22:14:16.812920 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:16 crc kubenswrapper[4747]: I1205 22:14:16.850021 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b8cc96f6-lbl9z" Dec 05 22:14:16 crc kubenswrapper[4747]: I1205 22:14:16.939084 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-548c587db8-8qnvf"] Dec 05 22:14:16 crc kubenswrapper[4747]: I1205 22:14:16.939359 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-548c587db8-8qnvf" podUID="2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" containerName="barbican-api-log" containerID="cri-o://cc07ffae6776d676387318ac14f879962f03ddcbc52727166f5ada2d79685c67" gracePeriod=30 Dec 05 22:14:16 crc kubenswrapper[4747]: I1205 22:14:16.939524 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-548c587db8-8qnvf" podUID="2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" containerName="barbican-api" containerID="cri-o://aab2f19202a8ddf069010b41e0a85b3f339b295c5aa90862c8cdb5dde4caa6e2" gracePeriod=30 Dec 05 22:14:17 crc kubenswrapper[4747]: I1205 22:14:17.541287 4747 generic.go:334] "Generic (PLEG): container finished" podID="2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" containerID="cc07ffae6776d676387318ac14f879962f03ddcbc52727166f5ada2d79685c67" exitCode=143 Dec 05 22:14:17 crc kubenswrapper[4747]: I1205 22:14:17.541367 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-548c587db8-8qnvf" event={"ID":"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9","Type":"ContainerDied","Data":"cc07ffae6776d676387318ac14f879962f03ddcbc52727166f5ada2d79685c67"} Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.094203 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-548c587db8-8qnvf" podUID="2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.30:9311/healthcheck\": read tcp 10.217.0.2:55660->10.217.1.30:9311: read: connection reset by peer" Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.095986 4747 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/barbican-api-548c587db8-8qnvf" podUID="2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.30:9311/healthcheck\": read tcp 10.217.0.2:55666->10.217.1.30:9311: read: connection reset by peer" Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.577072 4747 generic.go:334] "Generic (PLEG): container finished" podID="2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" containerID="aab2f19202a8ddf069010b41e0a85b3f339b295c5aa90862c8cdb5dde4caa6e2" exitCode=0 Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.577154 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-548c587db8-8qnvf" event={"ID":"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9","Type":"ContainerDied","Data":"aab2f19202a8ddf069010b41e0a85b3f339b295c5aa90862c8cdb5dde4caa6e2"} Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.577395 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-548c587db8-8qnvf" event={"ID":"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9","Type":"ContainerDied","Data":"a06c24faa79ab67b3b785131595bf32bf8a4000f41a7316646d6c7485b1c5d59"} Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.577411 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a06c24faa79ab67b3b785131595bf32bf8a4000f41a7316646d6c7485b1c5d59" Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.617878 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.773801 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-config-data\") pod \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.773892 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-combined-ca-bundle\") pod \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.773932 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-config-data-custom\") pod \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.774027 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29lzs\" (UniqueName: \"kubernetes.io/projected/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-kube-api-access-29lzs\") pod \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.774071 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-logs\") pod \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\" (UID: \"2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9\") " Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.775824 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-logs" 
(OuterVolumeSpecName: "logs") pod "2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" (UID: "2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.779961 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" (UID: "2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.784785 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-kube-api-access-29lzs" (OuterVolumeSpecName: "kube-api-access-29lzs") pod "2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" (UID: "2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9"). InnerVolumeSpecName "kube-api-access-29lzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.810552 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" (UID: "2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.830800 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-config-data" (OuterVolumeSpecName: "config-data") pod "2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" (UID: "2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.875658 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.875698 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.875714 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.875727 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29lzs\" (UniqueName: \"kubernetes.io/projected/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-kube-api-access-29lzs\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:20 crc kubenswrapper[4747]: I1205 22:14:20.875740 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:21 crc kubenswrapper[4747]: I1205 22:14:21.584356 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-548c587db8-8qnvf" Dec 05 22:14:21 crc kubenswrapper[4747]: I1205 22:14:21.619384 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-548c587db8-8qnvf"] Dec 05 22:14:21 crc kubenswrapper[4747]: I1205 22:14:21.628493 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-548c587db8-8qnvf"] Dec 05 22:14:21 crc kubenswrapper[4747]: I1205 22:14:21.898259 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" path="/var/lib/kubelet/pods/2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9/volumes" Dec 05 22:14:23 crc kubenswrapper[4747]: I1205 22:14:23.930232 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-d25xx"] Dec 05 22:14:23 crc kubenswrapper[4747]: E1205 22:14:23.930767 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87950a32-a688-49e7-ab86-7057f156e114" containerName="dnsmasq-dns" Dec 05 22:14:23 crc kubenswrapper[4747]: I1205 22:14:23.930790 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="87950a32-a688-49e7-ab86-7057f156e114" containerName="dnsmasq-dns" Dec 05 22:14:23 crc kubenswrapper[4747]: E1205 22:14:23.930802 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" containerName="barbican-api" Dec 05 22:14:23 crc kubenswrapper[4747]: I1205 22:14:23.930808 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" containerName="barbican-api" Dec 05 22:14:23 crc kubenswrapper[4747]: E1205 22:14:23.930837 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87950a32-a688-49e7-ab86-7057f156e114" containerName="init" Dec 05 22:14:23 crc kubenswrapper[4747]: I1205 22:14:23.930844 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="87950a32-a688-49e7-ab86-7057f156e114" containerName="init" Dec 05 22:14:23 crc kubenswrapper[4747]: E1205 22:14:23.930856 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" containerName="barbican-api-log" Dec 05 22:14:23 crc kubenswrapper[4747]: I1205 22:14:23.930862 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" containerName="barbican-api-log" Dec 05 22:14:23 crc kubenswrapper[4747]: I1205 22:14:23.930996 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" containerName="barbican-api-log" Dec 05 22:14:23 crc kubenswrapper[4747]: I1205 22:14:23.931006 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="87950a32-a688-49e7-ab86-7057f156e114" containerName="dnsmasq-dns" Dec 05 22:14:23 crc kubenswrapper[4747]: I1205 22:14:23.931017 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6c20d8-e3a3-4f1e-86b3-5a5b869f52c9" containerName="barbican-api" Dec 05 22:14:23 crc kubenswrapper[4747]: I1205 22:14:23.931533 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-d25xx" Dec 05 22:14:23 crc kubenswrapper[4747]: I1205 22:14:23.948239 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-d25xx"] Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.032833 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnd6g\" (UniqueName: \"kubernetes.io/projected/5cdf17de-526f-457d-a16f-ef36b315ba94-kube-api-access-jnd6g\") pod \"neutron-db-create-d25xx\" (UID: \"5cdf17de-526f-457d-a16f-ef36b315ba94\") " pod="openstack/neutron-db-create-d25xx" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.033249 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cdf17de-526f-457d-a16f-ef36b315ba94-operator-scripts\") pod \"neutron-db-create-d25xx\" (UID: \"5cdf17de-526f-457d-a16f-ef36b315ba94\") " pod="openstack/neutron-db-create-d25xx" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.037653 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-38fa-account-create-update-s2gtp"] Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.038621 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-38fa-account-create-update-s2gtp" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.040675 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.059335 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-38fa-account-create-update-s2gtp"] Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.134433 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnd6g\" (UniqueName: \"kubernetes.io/projected/5cdf17de-526f-457d-a16f-ef36b315ba94-kube-api-access-jnd6g\") pod \"neutron-db-create-d25xx\" (UID: \"5cdf17de-526f-457d-a16f-ef36b315ba94\") " pod="openstack/neutron-db-create-d25xx" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.135193 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cdf17de-526f-457d-a16f-ef36b315ba94-operator-scripts\") pod \"neutron-db-create-d25xx\" (UID: \"5cdf17de-526f-457d-a16f-ef36b315ba94\") " pod="openstack/neutron-db-create-d25xx" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.135863 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cdf17de-526f-457d-a16f-ef36b315ba94-operator-scripts\") pod \"neutron-db-create-d25xx\" (UID: \"5cdf17de-526f-457d-a16f-ef36b315ba94\") " pod="openstack/neutron-db-create-d25xx" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.153363 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnd6g\" (UniqueName: \"kubernetes.io/projected/5cdf17de-526f-457d-a16f-ef36b315ba94-kube-api-access-jnd6g\") pod \"neutron-db-create-d25xx\" (UID: \"5cdf17de-526f-457d-a16f-ef36b315ba94\") " pod="openstack/neutron-db-create-d25xx" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.236231 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/41e21ba2-529c-4cca-8819-a8b6e5cdbc05-operator-scripts\") pod \"neutron-38fa-account-create-update-s2gtp\" (UID: \"41e21ba2-529c-4cca-8819-a8b6e5cdbc05\") " pod="openstack/neutron-38fa-account-create-update-s2gtp" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.236336 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9nrh\" (UniqueName: \"kubernetes.io/projected/41e21ba2-529c-4cca-8819-a8b6e5cdbc05-kube-api-access-f9nrh\") pod \"neutron-38fa-account-create-update-s2gtp\" (UID: \"41e21ba2-529c-4cca-8819-a8b6e5cdbc05\") " pod="openstack/neutron-38fa-account-create-update-s2gtp" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.250764 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-d25xx" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.338632 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9nrh\" (UniqueName: \"kubernetes.io/projected/41e21ba2-529c-4cca-8819-a8b6e5cdbc05-kube-api-access-f9nrh\") pod \"neutron-38fa-account-create-update-s2gtp\" (UID: \"41e21ba2-529c-4cca-8819-a8b6e5cdbc05\") " pod="openstack/neutron-38fa-account-create-update-s2gtp" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.338957 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41e21ba2-529c-4cca-8819-a8b6e5cdbc05-operator-scripts\") pod \"neutron-38fa-account-create-update-s2gtp\" (UID: \"41e21ba2-529c-4cca-8819-a8b6e5cdbc05\") " pod="openstack/neutron-38fa-account-create-update-s2gtp" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.339739 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41e21ba2-529c-4cca-8819-a8b6e5cdbc05-operator-scripts\") pod \"neutron-38fa-account-create-update-s2gtp\" (UID: \"41e21ba2-529c-4cca-8819-a8b6e5cdbc05\") " pod="openstack/neutron-38fa-account-create-update-s2gtp" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.370167 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9nrh\" (UniqueName: \"kubernetes.io/projected/41e21ba2-529c-4cca-8819-a8b6e5cdbc05-kube-api-access-f9nrh\") pod \"neutron-38fa-account-create-update-s2gtp\" (UID: \"41e21ba2-529c-4cca-8819-a8b6e5cdbc05\") " pod="openstack/neutron-38fa-account-create-update-s2gtp" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.652732 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-38fa-account-create-update-s2gtp" Dec 05 22:14:24 crc kubenswrapper[4747]: I1205 22:14:24.698259 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-d25xx"] Dec 05 22:14:24 crc kubenswrapper[4747]: W1205 22:14:24.713925 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cdf17de_526f_457d_a16f_ef36b315ba94.slice/crio-f147dc843992f42d46a25fe53e193f8e1bff2a32a1989255186039d04e38f049 WatchSource:0}: Error finding container f147dc843992f42d46a25fe53e193f8e1bff2a32a1989255186039d04e38f049: Status 404 returned error can't find the container with id f147dc843992f42d46a25fe53e193f8e1bff2a32a1989255186039d04e38f049 Dec 05 22:14:25 crc kubenswrapper[4747]: I1205 22:14:25.235075 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-38fa-account-create-update-s2gtp"] Dec 05 22:14:25 crc kubenswrapper[4747]: I1205 22:14:25.617883 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-38fa-account-create-update-s2gtp" event={"ID":"41e21ba2-529c-4cca-8819-a8b6e5cdbc05","Type":"ContainerStarted","Data":"a6a8d606172a1a1a6c25703dab919a8371ff2cdfbb8828a23f921253b9f04bcc"} Dec 05 22:14:25 crc kubenswrapper[4747]: I1205 22:14:25.618211 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-38fa-account-create-update-s2gtp" event={"ID":"41e21ba2-529c-4cca-8819-a8b6e5cdbc05","Type":"ContainerStarted","Data":"bcb88be55fc0af2ce72368faffb47045cc133ea03b0ec6d838812f49bdc93919"} Dec 05 22:14:25 crc kubenswrapper[4747]: I1205 22:14:25.619618 4747 generic.go:334] "Generic (PLEG): container finished" podID="5cdf17de-526f-457d-a16f-ef36b315ba94" containerID="0f82f9613d3295dc737704166f084fa237a78f0d9d66a01cfbbc7798e159d966" exitCode=0 Dec 05 22:14:25 crc kubenswrapper[4747]: I1205 22:14:25.619653 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d25xx" event={"ID":"5cdf17de-526f-457d-a16f-ef36b315ba94","Type":"ContainerDied","Data":"0f82f9613d3295dc737704166f084fa237a78f0d9d66a01cfbbc7798e159d966"} Dec 05 22:14:25 crc kubenswrapper[4747]: I1205 22:14:25.619674 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d25xx" event={"ID":"5cdf17de-526f-457d-a16f-ef36b315ba94","Type":"ContainerStarted","Data":"f147dc843992f42d46a25fe53e193f8e1bff2a32a1989255186039d04e38f049"} Dec 05 22:14:26 crc kubenswrapper[4747]: I1205 22:14:26.637892 4747 generic.go:334] "Generic (PLEG): container finished" podID="41e21ba2-529c-4cca-8819-a8b6e5cdbc05" containerID="a6a8d606172a1a1a6c25703dab919a8371ff2cdfbb8828a23f921253b9f04bcc" exitCode=0 Dec 05 22:14:26 crc kubenswrapper[4747]: I1205 22:14:26.638018 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-38fa-account-create-update-s2gtp" event={"ID":"41e21ba2-529c-4cca-8819-a8b6e5cdbc05","Type":"ContainerDied","Data":"a6a8d606172a1a1a6c25703dab919a8371ff2cdfbb8828a23f921253b9f04bcc"} Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.063209 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-38fa-account-create-update-s2gtp" Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.068726 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-d25xx" Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.102446 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnd6g\" (UniqueName: \"kubernetes.io/projected/5cdf17de-526f-457d-a16f-ef36b315ba94-kube-api-access-jnd6g\") pod \"5cdf17de-526f-457d-a16f-ef36b315ba94\" (UID: \"5cdf17de-526f-457d-a16f-ef36b315ba94\") " Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.102544 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41e21ba2-529c-4cca-8819-a8b6e5cdbc05-operator-scripts\") pod \"41e21ba2-529c-4cca-8819-a8b6e5cdbc05\" (UID: \"41e21ba2-529c-4cca-8819-a8b6e5cdbc05\") " Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.102613 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cdf17de-526f-457d-a16f-ef36b315ba94-operator-scripts\") pod \"5cdf17de-526f-457d-a16f-ef36b315ba94\" (UID: \"5cdf17de-526f-457d-a16f-ef36b315ba94\") " Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.102650 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9nrh\" (UniqueName: \"kubernetes.io/projected/41e21ba2-529c-4cca-8819-a8b6e5cdbc05-kube-api-access-f9nrh\") pod \"41e21ba2-529c-4cca-8819-a8b6e5cdbc05\" (UID: \"41e21ba2-529c-4cca-8819-a8b6e5cdbc05\") " Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.103363 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cdf17de-526f-457d-a16f-ef36b315ba94-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5cdf17de-526f-457d-a16f-ef36b315ba94" (UID: "5cdf17de-526f-457d-a16f-ef36b315ba94"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.103389 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41e21ba2-529c-4cca-8819-a8b6e5cdbc05-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41e21ba2-529c-4cca-8819-a8b6e5cdbc05" (UID: "41e21ba2-529c-4cca-8819-a8b6e5cdbc05"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.108685 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cdf17de-526f-457d-a16f-ef36b315ba94-kube-api-access-jnd6g" (OuterVolumeSpecName: "kube-api-access-jnd6g") pod "5cdf17de-526f-457d-a16f-ef36b315ba94" (UID: "5cdf17de-526f-457d-a16f-ef36b315ba94"). InnerVolumeSpecName "kube-api-access-jnd6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.108839 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41e21ba2-529c-4cca-8819-a8b6e5cdbc05-kube-api-access-f9nrh" (OuterVolumeSpecName: "kube-api-access-f9nrh") pod "41e21ba2-529c-4cca-8819-a8b6e5cdbc05" (UID: "41e21ba2-529c-4cca-8819-a8b6e5cdbc05"). InnerVolumeSpecName "kube-api-access-f9nrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.204893 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cdf17de-526f-457d-a16f-ef36b315ba94-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.204945 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9nrh\" (UniqueName: \"kubernetes.io/projected/41e21ba2-529c-4cca-8819-a8b6e5cdbc05-kube-api-access-f9nrh\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.204966 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnd6g\" (UniqueName: \"kubernetes.io/projected/5cdf17de-526f-457d-a16f-ef36b315ba94-kube-api-access-jnd6g\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.204986 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41e21ba2-529c-4cca-8819-a8b6e5cdbc05-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.652376 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-38fa-account-create-update-s2gtp" event={"ID":"41e21ba2-529c-4cca-8819-a8b6e5cdbc05","Type":"ContainerDied","Data":"bcb88be55fc0af2ce72368faffb47045cc133ea03b0ec6d838812f49bdc93919"} Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.652451 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcb88be55fc0af2ce72368faffb47045cc133ea03b0ec6d838812f49bdc93919" Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.652407 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-38fa-account-create-update-s2gtp" Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.655166 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d25xx" event={"ID":"5cdf17de-526f-457d-a16f-ef36b315ba94","Type":"ContainerDied","Data":"f147dc843992f42d46a25fe53e193f8e1bff2a32a1989255186039d04e38f049"} Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.655234 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f147dc843992f42d46a25fe53e193f8e1bff2a32a1989255186039d04e38f049" Dec 05 22:14:27 crc kubenswrapper[4747]: I1205 22:14:27.655276 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-d25xx" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.330064 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-skw6n"] Dec 05 22:14:29 crc kubenswrapper[4747]: E1205 22:14:29.330791 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cdf17de-526f-457d-a16f-ef36b315ba94" containerName="mariadb-database-create" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.330808 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cdf17de-526f-457d-a16f-ef36b315ba94" containerName="mariadb-database-create" Dec 05 22:14:29 crc kubenswrapper[4747]: E1205 22:14:29.330821 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e21ba2-529c-4cca-8819-a8b6e5cdbc05" containerName="mariadb-account-create-update" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.330829 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e21ba2-529c-4cca-8819-a8b6e5cdbc05" containerName="mariadb-account-create-update" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.331041 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cdf17de-526f-457d-a16f-ef36b315ba94" containerName="mariadb-database-create" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.331072 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e21ba2-529c-4cca-8819-a8b6e5cdbc05" containerName="mariadb-account-create-update" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.331720 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-skw6n" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.334688 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.334861 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.335412 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8f5vq" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.340051 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73166647-5821-41a1-a727-2d22d24927c2-combined-ca-bundle\") pod \"neutron-db-sync-skw6n\" (UID: \"73166647-5821-41a1-a727-2d22d24927c2\") " pod="openstack/neutron-db-sync-skw6n" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.340165 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73166647-5821-41a1-a727-2d22d24927c2-config\") pod \"neutron-db-sync-skw6n\" (UID: \"73166647-5821-41a1-a727-2d22d24927c2\") " pod="openstack/neutron-db-sync-skw6n" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.340246 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkx9x\" (UniqueName: \"kubernetes.io/projected/73166647-5821-41a1-a727-2d22d24927c2-kube-api-access-nkx9x\") pod \"neutron-db-sync-skw6n\" (UID: \"73166647-5821-41a1-a727-2d22d24927c2\") " pod="openstack/neutron-db-sync-skw6n" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.343978 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-skw6n"] Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 
22:14:29.441684 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73166647-5821-41a1-a727-2d22d24927c2-config\") pod \"neutron-db-sync-skw6n\" (UID: \"73166647-5821-41a1-a727-2d22d24927c2\") " pod="openstack/neutron-db-sync-skw6n" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.441751 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkx9x\" (UniqueName: \"kubernetes.io/projected/73166647-5821-41a1-a727-2d22d24927c2-kube-api-access-nkx9x\") pod \"neutron-db-sync-skw6n\" (UID: \"73166647-5821-41a1-a727-2d22d24927c2\") " pod="openstack/neutron-db-sync-skw6n" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.441817 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73166647-5821-41a1-a727-2d22d24927c2-combined-ca-bundle\") pod \"neutron-db-sync-skw6n\" (UID: \"73166647-5821-41a1-a727-2d22d24927c2\") " pod="openstack/neutron-db-sync-skw6n" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.446783 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73166647-5821-41a1-a727-2d22d24927c2-combined-ca-bundle\") pod \"neutron-db-sync-skw6n\" (UID: \"73166647-5821-41a1-a727-2d22d24927c2\") " pod="openstack/neutron-db-sync-skw6n" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.459347 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/73166647-5821-41a1-a727-2d22d24927c2-config\") pod \"neutron-db-sync-skw6n\" (UID: \"73166647-5821-41a1-a727-2d22d24927c2\") " pod="openstack/neutron-db-sync-skw6n" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.469121 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkx9x\" (UniqueName: \"kubernetes.io/projected/73166647-5821-41a1-a727-2d22d24927c2-kube-api-access-nkx9x\") pod \"neutron-db-sync-skw6n\" (UID: \"73166647-5821-41a1-a727-2d22d24927c2\") " pod="openstack/neutron-db-sync-skw6n" Dec 05 22:14:29 crc kubenswrapper[4747]: I1205 22:14:29.651920 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-skw6n" Dec 05 22:14:30 crc kubenswrapper[4747]: I1205 22:14:30.101690 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-skw6n"] Dec 05 22:14:30 crc kubenswrapper[4747]: I1205 22:14:30.689690 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-skw6n" event={"ID":"73166647-5821-41a1-a727-2d22d24927c2","Type":"ContainerStarted","Data":"ab2315f8f216b1fb0c47a6a2abbb6e29c6bc0821bc4cf99ae67abcf4023ef9ab"} Dec 05 22:14:30 crc kubenswrapper[4747]: I1205 22:14:30.690049 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-skw6n" event={"ID":"73166647-5821-41a1-a727-2d22d24927c2","Type":"ContainerStarted","Data":"cdff5de4950a6362b11da2a4e6ae406fc04695b3d4dd9bba8d95f1dd343e2ee1"} Dec 05 22:14:30 crc kubenswrapper[4747]: I1205 22:14:30.710291 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-skw6n" podStartSLOduration=1.710249706 podStartE2EDuration="1.710249706s" podCreationTimestamp="2025-12-05 22:14:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:14:30.706679987 +0000 UTC m=+5541.173987515" watchObservedRunningTime="2025-12-05 22:14:30.710249706 +0000 UTC m=+5541.177557234" Dec 05 22:14:34 crc kubenswrapper[4747]: I1205 22:14:34.733852 4747 generic.go:334] "Generic (PLEG): container finished" podID="73166647-5821-41a1-a727-2d22d24927c2" containerID="ab2315f8f216b1fb0c47a6a2abbb6e29c6bc0821bc4cf99ae67abcf4023ef9ab" exitCode=0 Dec 05 22:14:34 crc kubenswrapper[4747]: I1205 22:14:34.733954 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-skw6n" event={"ID":"73166647-5821-41a1-a727-2d22d24927c2","Type":"ContainerDied","Data":"ab2315f8f216b1fb0c47a6a2abbb6e29c6bc0821bc4cf99ae67abcf4023ef9ab"} Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.121872 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-skw6n" Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.268854 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73166647-5821-41a1-a727-2d22d24927c2-combined-ca-bundle\") pod \"73166647-5821-41a1-a727-2d22d24927c2\" (UID: \"73166647-5821-41a1-a727-2d22d24927c2\") " Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.269149 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73166647-5821-41a1-a727-2d22d24927c2-config\") pod \"73166647-5821-41a1-a727-2d22d24927c2\" (UID: \"73166647-5821-41a1-a727-2d22d24927c2\") " Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.269185 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkx9x\" (UniqueName: \"kubernetes.io/projected/73166647-5821-41a1-a727-2d22d24927c2-kube-api-access-nkx9x\") pod \"73166647-5821-41a1-a727-2d22d24927c2\" (UID: \"73166647-5821-41a1-a727-2d22d24927c2\") " Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.274441 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73166647-5821-41a1-a727-2d22d24927c2-kube-api-access-nkx9x" (OuterVolumeSpecName: "kube-api-access-nkx9x") pod "73166647-5821-41a1-a727-2d22d24927c2" (UID: "73166647-5821-41a1-a727-2d22d24927c2"). InnerVolumeSpecName "kube-api-access-nkx9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.308995 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73166647-5821-41a1-a727-2d22d24927c2-config" (OuterVolumeSpecName: "config") pod "73166647-5821-41a1-a727-2d22d24927c2" (UID: "73166647-5821-41a1-a727-2d22d24927c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.318102 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73166647-5821-41a1-a727-2d22d24927c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73166647-5821-41a1-a727-2d22d24927c2" (UID: "73166647-5821-41a1-a727-2d22d24927c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.372970 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73166647-5821-41a1-a727-2d22d24927c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.373051 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/73166647-5821-41a1-a727-2d22d24927c2-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.373073 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkx9x\" (UniqueName: \"kubernetes.io/projected/73166647-5821-41a1-a727-2d22d24927c2-kube-api-access-nkx9x\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.770474 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-skw6n" event={"ID":"73166647-5821-41a1-a727-2d22d24927c2","Type":"ContainerDied","Data":"cdff5de4950a6362b11da2a4e6ae406fc04695b3d4dd9bba8d95f1dd343e2ee1"} Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.770539 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdff5de4950a6362b11da2a4e6ae406fc04695b3d4dd9bba8d95f1dd343e2ee1" Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.770689 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-skw6n" Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.928197 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b95ffff5-lb4gm"] Dec 05 22:14:36 crc kubenswrapper[4747]: E1205 22:14:36.928573 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73166647-5821-41a1-a727-2d22d24927c2" containerName="neutron-db-sync" Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.928612 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="73166647-5821-41a1-a727-2d22d24927c2" containerName="neutron-db-sync" Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.928800 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="73166647-5821-41a1-a727-2d22d24927c2" containerName="neutron-db-sync" Dec 05 22:14:36 crc kubenswrapper[4747]: I1205 22:14:36.929864 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.032760 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b95ffff5-lb4gm"] Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.041536 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c6b4c965b-fzknc"] Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.043160 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.049093 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.049857 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8f5vq" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.050009 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.050143 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.053385 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c6b4c965b-fzknc"] Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.089859 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-ovsdbserver-nb\") pod \"dnsmasq-dns-84b95ffff5-lb4gm\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.089958 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-ovsdbserver-sb\") pod \"dnsmasq-dns-84b95ffff5-lb4gm\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.090009 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwcft\" (UniqueName: \"kubernetes.io/projected/b15d3c51-cae8-4777-b7b9-5bc60f05464c-kube-api-access-xwcft\") pod \"dnsmasq-dns-84b95ffff5-lb4gm\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.090029 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-config\") pod \"dnsmasq-dns-84b95ffff5-lb4gm\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.090074 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-dns-svc\") pod \"dnsmasq-dns-84b95ffff5-lb4gm\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.192065 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-httpd-config\") pod \"neutron-6c6b4c965b-fzknc\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.192114 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-config\") pod \"neutron-6c6b4c965b-fzknc\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.192166 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-ovsdbserver-sb\") pod \"dnsmasq-dns-84b95ffff5-lb4gm\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.192190 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-ovndb-tls-certs\") pod \"neutron-6c6b4c965b-fzknc\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.192327 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwcft\" (UniqueName: \"kubernetes.io/projected/b15d3c51-cae8-4777-b7b9-5bc60f05464c-kube-api-access-xwcft\") pod \"dnsmasq-dns-84b95ffff5-lb4gm\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.192350 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-config\") pod \"dnsmasq-dns-84b95ffff5-lb4gm\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.192397 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-combined-ca-bundle\") pod \"neutron-6c6b4c965b-fzknc\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.192415 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7cm9\" (UniqueName: \"kubernetes.io/projected/6254d135-7fdc-4e6c-be4a-709233aa9e71-kube-api-access-d7cm9\") pod \"neutron-6c6b4c965b-fzknc\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.192449 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-dns-svc\") pod \"dnsmasq-dns-84b95ffff5-lb4gm\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.192535 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-ovsdbserver-nb\") pod \"dnsmasq-dns-84b95ffff5-lb4gm\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.193212 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-ovsdbserver-sb\") pod \"dnsmasq-dns-84b95ffff5-lb4gm\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.193273 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-config\") pod \"dnsmasq-dns-84b95ffff5-lb4gm\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.193374 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-dns-svc\") pod \"dnsmasq-dns-84b95ffff5-lb4gm\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.193617 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-ovsdbserver-nb\") pod \"dnsmasq-dns-84b95ffff5-lb4gm\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.210204 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwcft\" (UniqueName: \"kubernetes.io/projected/b15d3c51-cae8-4777-b7b9-5bc60f05464c-kube-api-access-xwcft\") pod \"dnsmasq-dns-84b95ffff5-lb4gm\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.281984 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.294714 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-ovndb-tls-certs\") pod \"neutron-6c6b4c965b-fzknc\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.294881 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-combined-ca-bundle\") pod \"neutron-6c6b4c965b-fzknc\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.294922 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7cm9\" (UniqueName: \"kubernetes.io/projected/6254d135-7fdc-4e6c-be4a-709233aa9e71-kube-api-access-d7cm9\") pod \"neutron-6c6b4c965b-fzknc\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.295032 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-httpd-config\") pod \"neutron-6c6b4c965b-fzknc\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.295103 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-config\") pod \"neutron-6c6b4c965b-fzknc\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.300737 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-config\") pod \"neutron-6c6b4c965b-fzknc\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.300901 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-combined-ca-bundle\") pod \"neutron-6c6b4c965b-fzknc\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.301928 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-ovndb-tls-certs\") pod \"neutron-6c6b4c965b-fzknc\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.304880 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-httpd-config\") pod \"neutron-6c6b4c965b-fzknc\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.321917 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-d7cm9\" (UniqueName: \"kubernetes.io/projected/6254d135-7fdc-4e6c-be4a-709233aa9e71-kube-api-access-d7cm9\") pod \"neutron-6c6b4c965b-fzknc\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.370442 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.767132 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b95ffff5-lb4gm"] Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.792655 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" event={"ID":"b15d3c51-cae8-4777-b7b9-5bc60f05464c","Type":"ContainerStarted","Data":"4247cba22a8d54a21f5f9e05d706fceae1b13a5a5a2f5baeb48bb309f555bacb"} Dec 05 22:14:37 crc kubenswrapper[4747]: I1205 22:14:37.946146 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c6b4c965b-fzknc"] Dec 05 22:14:38 crc kubenswrapper[4747]: I1205 22:14:38.800999 4747 generic.go:334] "Generic (PLEG): container finished" podID="b15d3c51-cae8-4777-b7b9-5bc60f05464c" containerID="7b9f54ed1200f635e86f9401470c6725b9b9f92e2ca4bcb7fa63b08e31476cd2" exitCode=0 Dec 05 22:14:38 crc kubenswrapper[4747]: I1205 22:14:38.801053 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" event={"ID":"b15d3c51-cae8-4777-b7b9-5bc60f05464c","Type":"ContainerDied","Data":"7b9f54ed1200f635e86f9401470c6725b9b9f92e2ca4bcb7fa63b08e31476cd2"} Dec 05 22:14:38 crc kubenswrapper[4747]: I1205 22:14:38.804811 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c6b4c965b-fzknc" event={"ID":"6254d135-7fdc-4e6c-be4a-709233aa9e71","Type":"ContainerStarted","Data":"142394c6f7f7d1b5b33282b01b7dd1445fad16232231e984cbdd62cff7d17d59"} Dec 05 22:14:38 crc kubenswrapper[4747]: I1205 22:14:38.804851 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c6b4c965b-fzknc" event={"ID":"6254d135-7fdc-4e6c-be4a-709233aa9e71","Type":"ContainerStarted","Data":"7a567a9b976a72bf3981af9a001139d2679d264e22b8a13437476ff725aa3e3b"} Dec 05 22:14:38 crc kubenswrapper[4747]: I1205 22:14:38.804866 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c6b4c965b-fzknc" event={"ID":"6254d135-7fdc-4e6c-be4a-709233aa9e71","Type":"ContainerStarted","Data":"74bb99f62262e939cdd4d57f0b254d7d5b7739d77733370140c3b98ac55875ea"} Dec 05 22:14:38 crc kubenswrapper[4747]: I1205 22:14:38.805779 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:14:38 crc kubenswrapper[4747]: I1205 22:14:38.867035 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c6b4c965b-fzknc" podStartSLOduration=1.867013854 podStartE2EDuration="1.867013854s" podCreationTimestamp="2025-12-05 22:14:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:14:38.859436775 +0000 UTC m=+5549.326744273" watchObservedRunningTime="2025-12-05 22:14:38.867013854 +0000 UTC m=+5549.334321352" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.388268 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7db666fd57-25ch7"] Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.390825 
4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.393542 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.393596 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.412494 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7db666fd57-25ch7"] Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.531939 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-combined-ca-bundle\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.531991 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-internal-tls-certs\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.532017 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbgkm\" (UniqueName: \"kubernetes.io/projected/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-kube-api-access-bbgkm\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.532088 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-ovndb-tls-certs\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.532113 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-public-tls-certs\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.532142 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-config\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.532191 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-httpd-config\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.633528 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-bbgkm\" (UniqueName: \"kubernetes.io/projected/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-kube-api-access-bbgkm\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.633669 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-ovndb-tls-certs\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.633718 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-public-tls-certs\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.633748 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-config\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.633797 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-httpd-config\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.633840 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-combined-ca-bundle\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.633873 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-internal-tls-certs\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.641156 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-internal-tls-certs\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.642121 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-httpd-config\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.643196 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-ovndb-tls-certs\") pod \"neutron-7db666fd57-25ch7\" 
(UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.643414 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-combined-ca-bundle\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.643434 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-config\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.649264 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-public-tls-certs\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.652543 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbgkm\" (UniqueName: \"kubernetes.io/projected/5f76856c-0d34-4880-9baa-0fd3b0dc3a36-kube-api-access-bbgkm\") pod \"neutron-7db666fd57-25ch7\" (UID: \"5f76856c-0d34-4880-9baa-0fd3b0dc3a36\") " pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.726109 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.817006 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" event={"ID":"b15d3c51-cae8-4777-b7b9-5bc60f05464c","Type":"ContainerStarted","Data":"6abb455833bd4eea7b3d65dcec40a2c0f876caacbc21d99dbe49f17ffb2e2921"} Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.817623 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:39 crc kubenswrapper[4747]: I1205 22:14:39.848208 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" podStartSLOduration=3.848182401 podStartE2EDuration="3.848182401s" podCreationTimestamp="2025-12-05 22:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:14:39.837529444 +0000 UTC m=+5550.304836932" watchObservedRunningTime="2025-12-05 22:14:39.848182401 +0000 UTC m=+5550.315489889" Dec 05 22:14:40 crc kubenswrapper[4747]: I1205 22:14:40.256922 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7db666fd57-25ch7"] Dec 05 22:14:40 crc kubenswrapper[4747]: I1205 22:14:40.828174 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7db666fd57-25ch7" event={"ID":"5f76856c-0d34-4880-9baa-0fd3b0dc3a36","Type":"ContainerStarted","Data":"b6c2e58fe4b3d51a68333ef8da21da87db552ae9a4999876147b096520ac02bb"} Dec 05 22:14:40 crc kubenswrapper[4747]: I1205 22:14:40.828216 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7db666fd57-25ch7" 
event={"ID":"5f76856c-0d34-4880-9baa-0fd3b0dc3a36","Type":"ContainerStarted","Data":"249776d79d12a4ace49726347ee9a8124941d8bf4cb8bd717ea8dddee100c51f"} Dec 05 22:14:40 crc kubenswrapper[4747]: I1205 22:14:40.828229 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7db666fd57-25ch7" event={"ID":"5f76856c-0d34-4880-9baa-0fd3b0dc3a36","Type":"ContainerStarted","Data":"8f13e65c81d205199541e21af5e0c53d428560fb7a2101e8e2481b80207a6907"} Dec 05 22:14:40 crc kubenswrapper[4747]: I1205 22:14:40.828767 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:14:40 crc kubenswrapper[4747]: I1205 22:14:40.853421 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7db666fd57-25ch7" podStartSLOduration=1.853402888 podStartE2EDuration="1.853402888s" podCreationTimestamp="2025-12-05 22:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:14:40.848028064 +0000 UTC m=+5551.315335562" watchObservedRunningTime="2025-12-05 22:14:40.853402888 +0000 UTC m=+5551.320710376" Dec 05 22:14:47 crc kubenswrapper[4747]: I1205 22:14:47.283809 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:14:47 crc kubenswrapper[4747]: I1205 22:14:47.371187 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj"] Dec 05 22:14:47 crc kubenswrapper[4747]: I1205 22:14:47.371486 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" podUID="cced490b-938d-45dd-a1fb-3e4f05c0bc83" containerName="dnsmasq-dns" containerID="cri-o://53828885502e82f68e133a6834df84c912a0c686e9fcffd89679f461f70198a6" gracePeriod=10 Dec 05 22:14:47 crc kubenswrapper[4747]: I1205 22:14:47.893185 4747 generic.go:334] "Generic (PLEG): container finished" podID="cced490b-938d-45dd-a1fb-3e4f05c0bc83" containerID="53828885502e82f68e133a6834df84c912a0c686e9fcffd89679f461f70198a6" exitCode=0 Dec 05 22:14:47 crc kubenswrapper[4747]: I1205 22:14:47.893299 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" event={"ID":"cced490b-938d-45dd-a1fb-3e4f05c0bc83","Type":"ContainerDied","Data":"53828885502e82f68e133a6834df84c912a0c686e9fcffd89679f461f70198a6"} Dec 05 22:14:47 crc kubenswrapper[4747]: I1205 22:14:47.893515 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" event={"ID":"cced490b-938d-45dd-a1fb-3e4f05c0bc83","Type":"ContainerDied","Data":"abc2289277d4a3ca5d61bb0a3a48f9fdee9cd4c7fabea08573270ed2f3bf6c0d"} Dec 05 22:14:47 crc kubenswrapper[4747]: I1205 22:14:47.893530 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abc2289277d4a3ca5d61bb0a3a48f9fdee9cd4c7fabea08573270ed2f3bf6c0d" Dec 05 22:14:47 crc kubenswrapper[4747]: I1205 22:14:47.930909 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.100092 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-ovsdbserver-nb\") pod \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.100481 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-dns-svc\") pod \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.100630 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-ovsdbserver-sb\") pod \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.100688 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp2rp\" (UniqueName: \"kubernetes.io/projected/cced490b-938d-45dd-a1fb-3e4f05c0bc83-kube-api-access-qp2rp\") pod \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.101191 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-config\") pod \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\" (UID: \"cced490b-938d-45dd-a1fb-3e4f05c0bc83\") " Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.135802 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cced490b-938d-45dd-a1fb-3e4f05c0bc83-kube-api-access-qp2rp" (OuterVolumeSpecName: "kube-api-access-qp2rp") pod "cced490b-938d-45dd-a1fb-3e4f05c0bc83" (UID: "cced490b-938d-45dd-a1fb-3e4f05c0bc83"). InnerVolumeSpecName "kube-api-access-qp2rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.163481 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cced490b-938d-45dd-a1fb-3e4f05c0bc83" (UID: "cced490b-938d-45dd-a1fb-3e4f05c0bc83"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.168796 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cced490b-938d-45dd-a1fb-3e4f05c0bc83" (UID: "cced490b-938d-45dd-a1fb-3e4f05c0bc83"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.181205 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cced490b-938d-45dd-a1fb-3e4f05c0bc83" (UID: "cced490b-938d-45dd-a1fb-3e4f05c0bc83"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.182987 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-config" (OuterVolumeSpecName: "config") pod "cced490b-938d-45dd-a1fb-3e4f05c0bc83" (UID: "cced490b-938d-45dd-a1fb-3e4f05c0bc83"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.202917 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.202947 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp2rp\" (UniqueName: \"kubernetes.io/projected/cced490b-938d-45dd-a1fb-3e4f05c0bc83-kube-api-access-qp2rp\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.202959 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.202969 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.202978 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cced490b-938d-45dd-a1fb-3e4f05c0bc83-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.902167 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj" Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.938262 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj"] Dec 05 22:14:48 crc kubenswrapper[4747]: I1205 22:14:48.947518 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f5bdf6dc5-fmqpj"] Dec 05 22:14:49 crc kubenswrapper[4747]: I1205 22:14:49.854143 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cced490b-938d-45dd-a1fb-3e4f05c0bc83" path="/var/lib/kubelet/pods/cced490b-938d-45dd-a1fb-3e4f05c0bc83/volumes" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.183734 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg"] Dec 05 22:15:00 crc kubenswrapper[4747]: E1205 22:15:00.184404 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cced490b-938d-45dd-a1fb-3e4f05c0bc83" containerName="dnsmasq-dns" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.184416 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cced490b-938d-45dd-a1fb-3e4f05c0bc83" containerName="dnsmasq-dns" Dec 05 22:15:00 crc kubenswrapper[4747]: E1205 22:15:00.184436 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cced490b-938d-45dd-a1fb-3e4f05c0bc83" containerName="init" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.184444 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cced490b-938d-45dd-a1fb-3e4f05c0bc83" containerName="init" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.184639 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="cced490b-938d-45dd-a1fb-3e4f05c0bc83" containerName="dnsmasq-dns" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.185211 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.189546 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.189961 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.201424 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg"] Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.267846 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2lkm\" (UniqueName: \"kubernetes.io/projected/e852ab93-76f8-4d63-83fd-527104dd25df-kube-api-access-c2lkm\") pod \"collect-profiles-29416215-sc9tg\" (UID: \"e852ab93-76f8-4d63-83fd-527104dd25df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.268212 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e852ab93-76f8-4d63-83fd-527104dd25df-secret-volume\") pod \"collect-profiles-29416215-sc9tg\" (UID: \"e852ab93-76f8-4d63-83fd-527104dd25df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.268497 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e852ab93-76f8-4d63-83fd-527104dd25df-config-volume\") pod \"collect-profiles-29416215-sc9tg\" (UID: \"e852ab93-76f8-4d63-83fd-527104dd25df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.370231 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e852ab93-76f8-4d63-83fd-527104dd25df-config-volume\") pod \"collect-profiles-29416215-sc9tg\" (UID: \"e852ab93-76f8-4d63-83fd-527104dd25df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.370363 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2lkm\" (UniqueName: \"kubernetes.io/projected/e852ab93-76f8-4d63-83fd-527104dd25df-kube-api-access-c2lkm\") pod \"collect-profiles-29416215-sc9tg\" (UID: \"e852ab93-76f8-4d63-83fd-527104dd25df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.370462 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e852ab93-76f8-4d63-83fd-527104dd25df-secret-volume\") pod \"collect-profiles-29416215-sc9tg\" (UID: \"e852ab93-76f8-4d63-83fd-527104dd25df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.372855 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e852ab93-76f8-4d63-83fd-527104dd25df-config-volume\") pod 
\"collect-profiles-29416215-sc9tg\" (UID: \"e852ab93-76f8-4d63-83fd-527104dd25df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.377100 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e852ab93-76f8-4d63-83fd-527104dd25df-secret-volume\") pod \"collect-profiles-29416215-sc9tg\" (UID: \"e852ab93-76f8-4d63-83fd-527104dd25df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.393428 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2lkm\" (UniqueName: \"kubernetes.io/projected/e852ab93-76f8-4d63-83fd-527104dd25df-kube-api-access-c2lkm\") pod \"collect-profiles-29416215-sc9tg\" (UID: \"e852ab93-76f8-4d63-83fd-527104dd25df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" Dec 05 22:15:00 crc kubenswrapper[4747]: I1205 22:15:00.519016 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" Dec 05 22:15:01 crc kubenswrapper[4747]: I1205 22:15:01.040406 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg"] Dec 05 22:15:02 crc kubenswrapper[4747]: I1205 22:15:02.017299 4747 generic.go:334] "Generic (PLEG): container finished" podID="e852ab93-76f8-4d63-83fd-527104dd25df" containerID="c72c42b9237c2a4676afef7888bc32df25cf4f82b9c184b879f95370e2bfec6c" exitCode=0 Dec 05 22:15:02 crc kubenswrapper[4747]: I1205 22:15:02.017385 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" event={"ID":"e852ab93-76f8-4d63-83fd-527104dd25df","Type":"ContainerDied","Data":"c72c42b9237c2a4676afef7888bc32df25cf4f82b9c184b879f95370e2bfec6c"} Dec 05 22:15:02 crc kubenswrapper[4747]: I1205 22:15:02.017870 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" event={"ID":"e852ab93-76f8-4d63-83fd-527104dd25df","Type":"ContainerStarted","Data":"39852870d0071c5641b2f3c02be40a1a34773a938afbbc5636c55f1016da66b2"} Dec 05 22:15:03 crc kubenswrapper[4747]: I1205 22:15:03.379026 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" Dec 05 22:15:03 crc kubenswrapper[4747]: I1205 22:15:03.531634 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e852ab93-76f8-4d63-83fd-527104dd25df-secret-volume\") pod \"e852ab93-76f8-4d63-83fd-527104dd25df\" (UID: \"e852ab93-76f8-4d63-83fd-527104dd25df\") " Dec 05 22:15:03 crc kubenswrapper[4747]: I1205 22:15:03.531749 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2lkm\" (UniqueName: \"kubernetes.io/projected/e852ab93-76f8-4d63-83fd-527104dd25df-kube-api-access-c2lkm\") pod \"e852ab93-76f8-4d63-83fd-527104dd25df\" (UID: \"e852ab93-76f8-4d63-83fd-527104dd25df\") " Dec 05 22:15:03 crc kubenswrapper[4747]: I1205 22:15:03.531822 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e852ab93-76f8-4d63-83fd-527104dd25df-config-volume\") pod \"e852ab93-76f8-4d63-83fd-527104dd25df\" (UID: \"e852ab93-76f8-4d63-83fd-527104dd25df\") " Dec 05 22:15:03 crc kubenswrapper[4747]: I1205 22:15:03.533409 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e852ab93-76f8-4d63-83fd-527104dd25df-config-volume" (OuterVolumeSpecName: "config-volume") pod "e852ab93-76f8-4d63-83fd-527104dd25df" (UID: "e852ab93-76f8-4d63-83fd-527104dd25df"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:15:03 crc kubenswrapper[4747]: I1205 22:15:03.537324 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e852ab93-76f8-4d63-83fd-527104dd25df-kube-api-access-c2lkm" (OuterVolumeSpecName: "kube-api-access-c2lkm") pod "e852ab93-76f8-4d63-83fd-527104dd25df" (UID: "e852ab93-76f8-4d63-83fd-527104dd25df"). InnerVolumeSpecName "kube-api-access-c2lkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:15:03 crc kubenswrapper[4747]: I1205 22:15:03.537481 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e852ab93-76f8-4d63-83fd-527104dd25df-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e852ab93-76f8-4d63-83fd-527104dd25df" (UID: "e852ab93-76f8-4d63-83fd-527104dd25df"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:15:03 crc kubenswrapper[4747]: I1205 22:15:03.634285 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e852ab93-76f8-4d63-83fd-527104dd25df-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:03 crc kubenswrapper[4747]: I1205 22:15:03.634336 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2lkm\" (UniqueName: \"kubernetes.io/projected/e852ab93-76f8-4d63-83fd-527104dd25df-kube-api-access-c2lkm\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:03 crc kubenswrapper[4747]: I1205 22:15:03.634356 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e852ab93-76f8-4d63-83fd-527104dd25df-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:04 crc kubenswrapper[4747]: I1205 22:15:04.036514 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" event={"ID":"e852ab93-76f8-4d63-83fd-527104dd25df","Type":"ContainerDied","Data":"39852870d0071c5641b2f3c02be40a1a34773a938afbbc5636c55f1016da66b2"} Dec 05 22:15:04 crc kubenswrapper[4747]: I1205 22:15:04.036623 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39852870d0071c5641b2f3c02be40a1a34773a938afbbc5636c55f1016da66b2" Dec 05 22:15:04 crc kubenswrapper[4747]: I1205 22:15:04.036651 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg" Dec 05 22:15:04 crc kubenswrapper[4747]: I1205 22:15:04.469362 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"] Dec 05 22:15:04 crc kubenswrapper[4747]: I1205 22:15:04.479251 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416170-wm2cx"] Dec 05 22:15:05 crc kubenswrapper[4747]: I1205 22:15:05.858382 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79224c99-bda2-4fad-9546-96a275fd4329" path="/var/lib/kubelet/pods/79224c99-bda2-4fad-9546-96a275fd4329/volumes" Dec 05 22:15:07 crc kubenswrapper[4747]: I1205 22:15:07.381773 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:15:09 crc kubenswrapper[4747]: I1205 22:15:09.757548 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7db666fd57-25ch7" Dec 05 22:15:09 crc kubenswrapper[4747]: I1205 22:15:09.829776 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c6b4c965b-fzknc"] Dec 05 22:15:09 crc kubenswrapper[4747]: I1205 22:15:09.830067 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c6b4c965b-fzknc" podUID="6254d135-7fdc-4e6c-be4a-709233aa9e71" containerName="neutron-api" containerID="cri-o://7a567a9b976a72bf3981af9a001139d2679d264e22b8a13437476ff725aa3e3b" gracePeriod=30 Dec 05 22:15:09 crc kubenswrapper[4747]: I1205 22:15:09.830446 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c6b4c965b-fzknc" podUID="6254d135-7fdc-4e6c-be4a-709233aa9e71" containerName="neutron-httpd" containerID="cri-o://142394c6f7f7d1b5b33282b01b7dd1445fad16232231e984cbdd62cff7d17d59" gracePeriod=30 Dec 05 22:15:10 crc kubenswrapper[4747]: 
I1205 22:15:10.100511 4747 generic.go:334] "Generic (PLEG): container finished" podID="6254d135-7fdc-4e6c-be4a-709233aa9e71" containerID="142394c6f7f7d1b5b33282b01b7dd1445fad16232231e984cbdd62cff7d17d59" exitCode=0 Dec 05 22:15:10 crc kubenswrapper[4747]: I1205 22:15:10.100569 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c6b4c965b-fzknc" event={"ID":"6254d135-7fdc-4e6c-be4a-709233aa9e71","Type":"ContainerDied","Data":"142394c6f7f7d1b5b33282b01b7dd1445fad16232231e984cbdd62cff7d17d59"} Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.665432 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.817455 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-config\") pod \"6254d135-7fdc-4e6c-be4a-709233aa9e71\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.817530 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-combined-ca-bundle\") pod \"6254d135-7fdc-4e6c-be4a-709233aa9e71\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.817554 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7cm9\" (UniqueName: \"kubernetes.io/projected/6254d135-7fdc-4e6c-be4a-709233aa9e71-kube-api-access-d7cm9\") pod \"6254d135-7fdc-4e6c-be4a-709233aa9e71\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.817727 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-ovndb-tls-certs\") pod \"6254d135-7fdc-4e6c-be4a-709233aa9e71\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.817851 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-httpd-config\") pod \"6254d135-7fdc-4e6c-be4a-709233aa9e71\" (UID: \"6254d135-7fdc-4e6c-be4a-709233aa9e71\") " Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.823483 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6254d135-7fdc-4e6c-be4a-709233aa9e71-kube-api-access-d7cm9" (OuterVolumeSpecName: "kube-api-access-d7cm9") pod "6254d135-7fdc-4e6c-be4a-709233aa9e71" (UID: "6254d135-7fdc-4e6c-be4a-709233aa9e71"). InnerVolumeSpecName "kube-api-access-d7cm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.824396 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6254d135-7fdc-4e6c-be4a-709233aa9e71" (UID: "6254d135-7fdc-4e6c-be4a-709233aa9e71"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.857933 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-config" (OuterVolumeSpecName: "config") pod "6254d135-7fdc-4e6c-be4a-709233aa9e71" (UID: "6254d135-7fdc-4e6c-be4a-709233aa9e71"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.858700 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6254d135-7fdc-4e6c-be4a-709233aa9e71" (UID: "6254d135-7fdc-4e6c-be4a-709233aa9e71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.879807 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6254d135-7fdc-4e6c-be4a-709233aa9e71" (UID: "6254d135-7fdc-4e6c-be4a-709233aa9e71"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.919571 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.919658 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.919668 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.919679 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7cm9\" (UniqueName: \"kubernetes.io/projected/6254d135-7fdc-4e6c-be4a-709233aa9e71-kube-api-access-d7cm9\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:12 crc kubenswrapper[4747]: I1205 22:15:12.919687 4747 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6254d135-7fdc-4e6c-be4a-709233aa9e71-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:13 crc kubenswrapper[4747]: I1205 22:15:13.131113 4747 generic.go:334] "Generic (PLEG): container finished" podID="6254d135-7fdc-4e6c-be4a-709233aa9e71" containerID="7a567a9b976a72bf3981af9a001139d2679d264e22b8a13437476ff725aa3e3b" exitCode=0 Dec 05 22:15:13 crc kubenswrapper[4747]: I1205 22:15:13.131190 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c6b4c965b-fzknc" Dec 05 22:15:13 crc kubenswrapper[4747]: I1205 22:15:13.131172 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c6b4c965b-fzknc" event={"ID":"6254d135-7fdc-4e6c-be4a-709233aa9e71","Type":"ContainerDied","Data":"7a567a9b976a72bf3981af9a001139d2679d264e22b8a13437476ff725aa3e3b"} Dec 05 22:15:13 crc kubenswrapper[4747]: I1205 22:15:13.131643 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c6b4c965b-fzknc" event={"ID":"6254d135-7fdc-4e6c-be4a-709233aa9e71","Type":"ContainerDied","Data":"74bb99f62262e939cdd4d57f0b254d7d5b7739d77733370140c3b98ac55875ea"} Dec 05 22:15:13 crc kubenswrapper[4747]: I1205 22:15:13.131775 4747 scope.go:117] "RemoveContainer" containerID="142394c6f7f7d1b5b33282b01b7dd1445fad16232231e984cbdd62cff7d17d59" Dec 05 22:15:13 crc kubenswrapper[4747]: I1205 22:15:13.160709 4747 scope.go:117] "RemoveContainer" containerID="7a567a9b976a72bf3981af9a001139d2679d264e22b8a13437476ff725aa3e3b" Dec 05 22:15:13 crc kubenswrapper[4747]: I1205 22:15:13.175507 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c6b4c965b-fzknc"] Dec 05 22:15:13 crc kubenswrapper[4747]: I1205 22:15:13.185778 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6c6b4c965b-fzknc"] Dec 05 22:15:13 crc kubenswrapper[4747]: I1205 22:15:13.188282 4747 scope.go:117] "RemoveContainer" containerID="142394c6f7f7d1b5b33282b01b7dd1445fad16232231e984cbdd62cff7d17d59" Dec 05 22:15:13 crc kubenswrapper[4747]: E1205 22:15:13.188677 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"142394c6f7f7d1b5b33282b01b7dd1445fad16232231e984cbdd62cff7d17d59\": container with ID starting with 142394c6f7f7d1b5b33282b01b7dd1445fad16232231e984cbdd62cff7d17d59 not found: ID does not exist" containerID="142394c6f7f7d1b5b33282b01b7dd1445fad16232231e984cbdd62cff7d17d59" Dec 05 22:15:13 crc kubenswrapper[4747]: I1205 22:15:13.188725 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142394c6f7f7d1b5b33282b01b7dd1445fad16232231e984cbdd62cff7d17d59"} err="failed to get container status \"142394c6f7f7d1b5b33282b01b7dd1445fad16232231e984cbdd62cff7d17d59\": rpc error: code = NotFound desc = could not find container \"142394c6f7f7d1b5b33282b01b7dd1445fad16232231e984cbdd62cff7d17d59\": container with ID starting with 142394c6f7f7d1b5b33282b01b7dd1445fad16232231e984cbdd62cff7d17d59 not found: ID does not exist" Dec 05 22:15:13 crc kubenswrapper[4747]: I1205 22:15:13.188752 4747 scope.go:117] "RemoveContainer" containerID="7a567a9b976a72bf3981af9a001139d2679d264e22b8a13437476ff725aa3e3b" Dec 05 22:15:13 crc kubenswrapper[4747]: E1205 22:15:13.189119 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a567a9b976a72bf3981af9a001139d2679d264e22b8a13437476ff725aa3e3b\": container with ID starting with 7a567a9b976a72bf3981af9a001139d2679d264e22b8a13437476ff725aa3e3b not found: ID does not exist" containerID="7a567a9b976a72bf3981af9a001139d2679d264e22b8a13437476ff725aa3e3b" Dec 05 22:15:13 crc kubenswrapper[4747]: I1205 22:15:13.189141 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a567a9b976a72bf3981af9a001139d2679d264e22b8a13437476ff725aa3e3b"} err="failed to get container status 
\"7a567a9b976a72bf3981af9a001139d2679d264e22b8a13437476ff725aa3e3b\": rpc error: code = NotFound desc = could not find container \"7a567a9b976a72bf3981af9a001139d2679d264e22b8a13437476ff725aa3e3b\": container with ID starting with 7a567a9b976a72bf3981af9a001139d2679d264e22b8a13437476ff725aa3e3b not found: ID does not exist" Dec 05 22:15:13 crc kubenswrapper[4747]: I1205 22:15:13.862060 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6254d135-7fdc-4e6c-be4a-709233aa9e71" path="/var/lib/kubelet/pods/6254d135-7fdc-4e6c-be4a-709233aa9e71/volumes" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.461915 4747 patch_prober.go:28] interesting pod/router-default-5444994796-2p8p6 container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.462619 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-2p8p6" podUID="facd7fa0-d8a3-4fd4-a8c9-64578d2779aa" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.507017 4747 patch_prober.go:28] interesting pod/router-default-5444994796-2p8p6 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.507067 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-2p8p6" podUID="facd7fa0-d8a3-4fd4-a8c9-64578d2779aa" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.812411 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="8b0a2255-16ff-4be0-a16e-161076f32fc4" containerName="galera" probeResult="failure" output="command timed out" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.889195 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-ctndg"] Dec 05 22:15:25 crc kubenswrapper[4747]: E1205 22:15:25.889635 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6254d135-7fdc-4e6c-be4a-709233aa9e71" containerName="neutron-httpd" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.889692 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6254d135-7fdc-4e6c-be4a-709233aa9e71" containerName="neutron-httpd" Dec 05 22:15:25 crc kubenswrapper[4747]: E1205 22:15:25.889712 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e852ab93-76f8-4d63-83fd-527104dd25df" containerName="collect-profiles" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.889723 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e852ab93-76f8-4d63-83fd-527104dd25df" containerName="collect-profiles" Dec 05 22:15:25 crc kubenswrapper[4747]: E1205 22:15:25.889764 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6254d135-7fdc-4e6c-be4a-709233aa9e71" containerName="neutron-api" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.889772 
4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6254d135-7fdc-4e6c-be4a-709233aa9e71" containerName="neutron-api" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.889955 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6254d135-7fdc-4e6c-be4a-709233aa9e71" containerName="neutron-httpd" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.889979 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e852ab93-76f8-4d63-83fd-527104dd25df" containerName="collect-profiles" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.889993 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6254d135-7fdc-4e6c-be4a-709233aa9e71" containerName="neutron-api" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.890756 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.898398 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-hh79w" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.910284 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.910518 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.910633 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.910689 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.923687 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-ctndg"] Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.940524 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9q84\" (UniqueName: \"kubernetes.io/projected/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-kube-api-access-s9q84\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.940580 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-swiftconf\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.940630 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-combined-ca-bundle\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.940660 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-etc-swift\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:25 crc 
kubenswrapper[4747]: I1205 22:15:25.940688 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-scripts\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.940747 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-ring-data-devices\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.940777 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-dispersionconf\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.972378 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc65b979-wq4pz"] Dec 05 22:15:25 crc kubenswrapper[4747]: I1205 22:15:25.977561 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.002168 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc65b979-wq4pz"] Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.043568 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-combined-ca-bundle\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.043678 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-config\") pod \"dnsmasq-dns-5ccc65b979-wq4pz\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.043703 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-etc-swift\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.043823 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r46nm\" (UniqueName: \"kubernetes.io/projected/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-kube-api-access-r46nm\") pod \"dnsmasq-dns-5ccc65b979-wq4pz\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.043856 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-scripts\") pod \"swift-ring-rebalance-ctndg\" (UID: 
\"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.044084 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-dns-svc\") pod \"dnsmasq-dns-5ccc65b979-wq4pz\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.044107 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc65b979-wq4pz\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.044338 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc65b979-wq4pz\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.044377 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-ring-data-devices\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.044414 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-dispersionconf\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.044547 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9q84\" (UniqueName: \"kubernetes.io/projected/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-kube-api-access-s9q84\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.045354 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-etc-swift\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.046674 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-scripts\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.047791 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-ring-data-devices\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " 
pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.044583 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-swiftconf\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.053098 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-swiftconf\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.070934 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-combined-ca-bundle\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.106308 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9q84\" (UniqueName: \"kubernetes.io/projected/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-kube-api-access-s9q84\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.116865 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-dispersionconf\") pod \"swift-ring-rebalance-ctndg\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.153133 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-dns-svc\") pod \"dnsmasq-dns-5ccc65b979-wq4pz\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.153197 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc65b979-wq4pz\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.153218 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc65b979-wq4pz\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.153321 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-config\") pod \"dnsmasq-dns-5ccc65b979-wq4pz\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.153375 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r46nm\" (UniqueName: \"kubernetes.io/projected/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-kube-api-access-r46nm\") pod \"dnsmasq-dns-5ccc65b979-wq4pz\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.154558 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-dns-svc\") pod \"dnsmasq-dns-5ccc65b979-wq4pz\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.155126 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc65b979-wq4pz\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.155189 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-config\") pod \"dnsmasq-dns-5ccc65b979-wq4pz\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.155691 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc65b979-wq4pz\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.175927 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r46nm\" (UniqueName: \"kubernetes.io/projected/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-kube-api-access-r46nm\") pod \"dnsmasq-dns-5ccc65b979-wq4pz\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.277455 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.306120 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.783443 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-ctndg"] Dec 05 22:15:26 crc kubenswrapper[4747]: I1205 22:15:26.893432 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc65b979-wq4pz"] Dec 05 22:15:26 crc kubenswrapper[4747]: W1205 22:15:26.899271 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8c9a19f_48b0_49eb_8bc4_2b662a021ed8.slice/crio-429dc3179cd288aeea9ecf648fd03410d7f0abe5ae9ac1244d1c5e0d988365d9 WatchSource:0}: Error finding container 429dc3179cd288aeea9ecf648fd03410d7f0abe5ae9ac1244d1c5e0d988365d9: Status 404 returned error can't find the container with id 429dc3179cd288aeea9ecf648fd03410d7f0abe5ae9ac1244d1c5e0d988365d9 Dec 05 22:15:27 crc kubenswrapper[4747]: I1205 22:15:27.468365 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ctndg" event={"ID":"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b","Type":"ContainerStarted","Data":"494eb922188241701c0584b17f78411c30bfcb9103ff6bc5d0349f50ae365b4c"} Dec 05 22:15:27 crc kubenswrapper[4747]: I1205 22:15:27.468813 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ctndg" event={"ID":"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b","Type":"ContainerStarted","Data":"4d6c0c65398c7de1e0911a2ac6f0e98af359a78a19466ce94037a36d63a61f6e"} Dec 05 22:15:27 crc kubenswrapper[4747]: I1205 22:15:27.470728 4747 generic.go:334] "Generic (PLEG): container finished" podID="c8c9a19f-48b0-49eb-8bc4-2b662a021ed8" containerID="8709f53a8e55070c4068fe5d584c1e9d0f161a9a20198f7c2bbd7662dd2cb7ef" exitCode=0 Dec 05 22:15:27 crc kubenswrapper[4747]: I1205 22:15:27.470869 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" event={"ID":"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8","Type":"ContainerDied","Data":"8709f53a8e55070c4068fe5d584c1e9d0f161a9a20198f7c2bbd7662dd2cb7ef"} Dec 05 22:15:27 crc kubenswrapper[4747]: I1205 22:15:27.470955 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" event={"ID":"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8","Type":"ContainerStarted","Data":"429dc3179cd288aeea9ecf648fd03410d7f0abe5ae9ac1244d1c5e0d988365d9"} Dec 05 22:15:27 crc kubenswrapper[4747]: I1205 22:15:27.520512 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-ctndg" podStartSLOduration=2.520488896 podStartE2EDuration="2.520488896s" podCreationTimestamp="2025-12-05 22:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:15:27.489098728 +0000 UTC m=+5597.956406226" watchObservedRunningTime="2025-12-05 22:15:27.520488896 +0000 UTC m=+5597.987796384" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.105486 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-ff9f848f4-27v79"] Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.107197 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.109297 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.114997 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-ff9f848f4-27v79"] Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.187171 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a82fc9a-86ce-49b7-880d-a8418fb38073-combined-ca-bundle\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.187284 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a82fc9a-86ce-49b7-880d-a8418fb38073-config-data\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.187334 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7a82fc9a-86ce-49b7-880d-a8418fb38073-etc-swift\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.187471 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a82fc9a-86ce-49b7-880d-a8418fb38073-run-httpd\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.187495 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a82fc9a-86ce-49b7-880d-a8418fb38073-log-httpd\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.187556 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncc9c\" (UniqueName: \"kubernetes.io/projected/7a82fc9a-86ce-49b7-880d-a8418fb38073-kube-api-access-ncc9c\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.318335 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a82fc9a-86ce-49b7-880d-a8418fb38073-run-httpd\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.318382 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a82fc9a-86ce-49b7-880d-a8418fb38073-log-httpd\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " 
pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.318427 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncc9c\" (UniqueName: \"kubernetes.io/projected/7a82fc9a-86ce-49b7-880d-a8418fb38073-kube-api-access-ncc9c\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.318461 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a82fc9a-86ce-49b7-880d-a8418fb38073-combined-ca-bundle\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.318528 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a82fc9a-86ce-49b7-880d-a8418fb38073-config-data\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.318581 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7a82fc9a-86ce-49b7-880d-a8418fb38073-etc-swift\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.318856 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a82fc9a-86ce-49b7-880d-a8418fb38073-run-httpd\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.318886 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a82fc9a-86ce-49b7-880d-a8418fb38073-log-httpd\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.325616 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7a82fc9a-86ce-49b7-880d-a8418fb38073-etc-swift\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.325747 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a82fc9a-86ce-49b7-880d-a8418fb38073-config-data\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.326115 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a82fc9a-86ce-49b7-880d-a8418fb38073-combined-ca-bundle\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.341780 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ncc9c\" (UniqueName: \"kubernetes.io/projected/7a82fc9a-86ce-49b7-880d-a8418fb38073-kube-api-access-ncc9c\") pod \"swift-proxy-ff9f848f4-27v79\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.426841 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.495557 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" event={"ID":"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8","Type":"ContainerStarted","Data":"2d567a961a38931566853334d0a484befbc1e29a6f9c21fd786881c4a2661fa7"} Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.495796 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:28 crc kubenswrapper[4747]: I1205 22:15:28.524124 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" podStartSLOduration=3.524106809 podStartE2EDuration="3.524106809s" podCreationTimestamp="2025-12-05 22:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:15:28.522614722 +0000 UTC m=+5598.989922210" watchObservedRunningTime="2025-12-05 22:15:28.524106809 +0000 UTC m=+5598.991414287" Dec 05 22:15:29 crc kubenswrapper[4747]: I1205 22:15:29.079809 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-ff9f848f4-27v79"] Dec 05 22:15:29 crc kubenswrapper[4747]: W1205 22:15:29.080058 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a82fc9a_86ce_49b7_880d_a8418fb38073.slice/crio-66178be02bb978833db6e28db0a965463c50803bce3936ffadaf8d4f55d8b83d WatchSource:0}: Error finding container 66178be02bb978833db6e28db0a965463c50803bce3936ffadaf8d4f55d8b83d: Status 404 returned error can't find the container with id 66178be02bb978833db6e28db0a965463c50803bce3936ffadaf8d4f55d8b83d Dec 05 22:15:29 crc kubenswrapper[4747]: I1205 22:15:29.509303 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-ff9f848f4-27v79" event={"ID":"7a82fc9a-86ce-49b7-880d-a8418fb38073","Type":"ContainerStarted","Data":"6f75a61b7a8533896d18a6b224f925dfb5664c74feb9cba8742d4b509af00ca6"} Dec 05 22:15:29 crc kubenswrapper[4747]: I1205 22:15:29.510020 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-ff9f848f4-27v79" event={"ID":"7a82fc9a-86ce-49b7-880d-a8418fb38073","Type":"ContainerStarted","Data":"66178be02bb978833db6e28db0a965463c50803bce3936ffadaf8d4f55d8b83d"} Dec 05 22:15:29 crc kubenswrapper[4747]: I1205 22:15:29.583538 4747 scope.go:117] "RemoveContainer" containerID="bcf788960eccd1b080ae22e4eddcc2cd0d694c3437dd04006cecefc63acf53e6" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.465838 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6f55995b6-9n2p4"] Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.467922 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.472562 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.473205 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.482038 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6f55995b6-9n2p4"] Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.522647 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-ff9f848f4-27v79" event={"ID":"7a82fc9a-86ce-49b7-880d-a8418fb38073","Type":"ContainerStarted","Data":"de2b5c31e1ee0cfa87a3b084b5f8d0031c75d8efb9c25d9cd7269a593240b0fe"} Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.524105 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.524153 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.550264 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-ff9f848f4-27v79" podStartSLOduration=2.550220955 podStartE2EDuration="2.550220955s" podCreationTimestamp="2025-12-05 22:15:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:15:30.547622431 +0000 UTC m=+5601.014929939" watchObservedRunningTime="2025-12-05 22:15:30.550220955 +0000 UTC m=+5601.017528443" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.663061 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-public-tls-certs\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.663157 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-internal-tls-certs\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.663180 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-combined-ca-bundle\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.663215 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr6rk\" (UniqueName: \"kubernetes.io/projected/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-kube-api-access-rr6rk\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.663260 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-log-httpd\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.663289 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-config-data\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.663338 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-etc-swift\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.663362 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-run-httpd\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.765101 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-etc-swift\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.765160 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-run-httpd\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.765220 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-public-tls-certs\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.765274 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-internal-tls-certs\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.765290 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-combined-ca-bundle\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.765314 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-rr6rk\" (UniqueName: \"kubernetes.io/projected/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-kube-api-access-rr6rk\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.765339 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-log-httpd\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.765356 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-config-data\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.765600 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-run-httpd\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.766021 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-log-httpd\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.771678 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-etc-swift\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.772016 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-combined-ca-bundle\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.772174 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-public-tls-certs\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.772756 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-config-data\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.773140 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-internal-tls-certs\") pod \"swift-proxy-6f55995b6-9n2p4\" 
(UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.785244 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr6rk\" (UniqueName: \"kubernetes.io/projected/ca9730ab-ce9c-4f56-a81e-14c78ac858ab-kube-api-access-rr6rk\") pod \"swift-proxy-6f55995b6-9n2p4\" (UID: \"ca9730ab-ce9c-4f56-a81e-14c78ac858ab\") " pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:30 crc kubenswrapper[4747]: I1205 22:15:30.844063 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:31 crc kubenswrapper[4747]: I1205 22:15:31.442381 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6f55995b6-9n2p4"] Dec 05 22:15:31 crc kubenswrapper[4747]: W1205 22:15:31.453449 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca9730ab_ce9c_4f56_a81e_14c78ac858ab.slice/crio-53628d0e0cd080f139f034d5d3822770eda63832473fc36e88d6d5018faf55a0 WatchSource:0}: Error finding container 53628d0e0cd080f139f034d5d3822770eda63832473fc36e88d6d5018faf55a0: Status 404 returned error can't find the container with id 53628d0e0cd080f139f034d5d3822770eda63832473fc36e88d6d5018faf55a0 Dec 05 22:15:31 crc kubenswrapper[4747]: I1205 22:15:31.543748 4747 generic.go:334] "Generic (PLEG): container finished" podID="d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b" containerID="494eb922188241701c0584b17f78411c30bfcb9103ff6bc5d0349f50ae365b4c" exitCode=0 Dec 05 22:15:31 crc kubenswrapper[4747]: I1205 22:15:31.543849 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ctndg" event={"ID":"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b","Type":"ContainerDied","Data":"494eb922188241701c0584b17f78411c30bfcb9103ff6bc5d0349f50ae365b4c"} Dec 05 22:15:31 crc kubenswrapper[4747]: I1205 22:15:31.546321 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f55995b6-9n2p4" event={"ID":"ca9730ab-ce9c-4f56-a81e-14c78ac858ab","Type":"ContainerStarted","Data":"53628d0e0cd080f139f034d5d3822770eda63832473fc36e88d6d5018faf55a0"} Dec 05 22:15:32 crc kubenswrapper[4747]: I1205 22:15:32.558652 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f55995b6-9n2p4" event={"ID":"ca9730ab-ce9c-4f56-a81e-14c78ac858ab","Type":"ContainerStarted","Data":"18111c435f4610ca69c5624f7d19e0bf97ec1add5256ee273ff7ea657c93d86a"} Dec 05 22:15:32 crc kubenswrapper[4747]: I1205 22:15:32.558994 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f55995b6-9n2p4" event={"ID":"ca9730ab-ce9c-4f56-a81e-14c78ac858ab","Type":"ContainerStarted","Data":"cd04d9c7f2dd47e5b743856ddd5deb9b0a0a6150f8b8e55323eb7fe4ad7f752d"} Dec 05 22:15:32 crc kubenswrapper[4747]: I1205 22:15:32.559424 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:32 crc kubenswrapper[4747]: I1205 22:15:32.559443 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:32 crc kubenswrapper[4747]: I1205 22:15:32.597440 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6f55995b6-9n2p4" podStartSLOduration=2.597407923 podStartE2EDuration="2.597407923s" podCreationTimestamp="2025-12-05 22:15:30 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:15:32.590506422 +0000 UTC m=+5603.057813920" watchObservedRunningTime="2025-12-05 22:15:32.597407923 +0000 UTC m=+5603.064715421" Dec 05 22:15:32 crc kubenswrapper[4747]: I1205 22:15:32.917689 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.005436 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-swiftconf\") pod \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.005546 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-scripts\") pod \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.005673 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-combined-ca-bundle\") pod \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.005702 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-etc-swift\") pod \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.005751 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-dispersionconf\") pod \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.005772 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-ring-data-devices\") pod \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.005797 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9q84\" (UniqueName: \"kubernetes.io/projected/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-kube-api-access-s9q84\") pod \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\" (UID: \"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b\") " Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.010548 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b" (UID: "d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.011549 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b" (UID: "d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.033570 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-kube-api-access-s9q84" (OuterVolumeSpecName: "kube-api-access-s9q84") pod "d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b" (UID: "d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b"). InnerVolumeSpecName "kube-api-access-s9q84". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.054872 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b" (UID: "d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.094272 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-scripts" (OuterVolumeSpecName: "scripts") pod "d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b" (UID: "d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.110720 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.110751 4747 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.110760 4747 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.110770 4747 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.110779 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9q84\" (UniqueName: \"kubernetes.io/projected/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-kube-api-access-s9q84\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.122713 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b" (UID: "d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.131412 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b" (UID: "d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.212212 4747 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.212246 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.565467 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ctndg" event={"ID":"d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b","Type":"ContainerDied","Data":"4d6c0c65398c7de1e0911a2ac6f0e98af359a78a19466ce94037a36d63a61f6e"} Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.565493 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ctndg" Dec 05 22:15:33 crc kubenswrapper[4747]: I1205 22:15:33.565507 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d6c0c65398c7de1e0911a2ac6f0e98af359a78a19466ce94037a36d63a61f6e" Dec 05 22:15:36 crc kubenswrapper[4747]: I1205 22:15:36.308712 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:15:36 crc kubenswrapper[4747]: I1205 22:15:36.394930 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b95ffff5-lb4gm"] Dec 05 22:15:36 crc kubenswrapper[4747]: I1205 22:15:36.395282 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" podUID="b15d3c51-cae8-4777-b7b9-5bc60f05464c" containerName="dnsmasq-dns" containerID="cri-o://6abb455833bd4eea7b3d65dcec40a2c0f876caacbc21d99dbe49f17ffb2e2921" gracePeriod=10 Dec 05 22:15:36 crc kubenswrapper[4747]: I1205 22:15:36.601787 4747 generic.go:334] "Generic (PLEG): container finished" podID="b15d3c51-cae8-4777-b7b9-5bc60f05464c" containerID="6abb455833bd4eea7b3d65dcec40a2c0f876caacbc21d99dbe49f17ffb2e2921" exitCode=0 Dec 05 22:15:36 crc kubenswrapper[4747]: I1205 22:15:36.601871 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" event={"ID":"b15d3c51-cae8-4777-b7b9-5bc60f05464c","Type":"ContainerDied","Data":"6abb455833bd4eea7b3d65dcec40a2c0f876caacbc21d99dbe49f17ffb2e2921"} Dec 05 22:15:36 crc kubenswrapper[4747]: I1205 22:15:36.860078 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:15:36 crc kubenswrapper[4747]: I1205 22:15:36.988550 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-ovsdbserver-nb\") pod \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " Dec 05 22:15:36 crc kubenswrapper[4747]: I1205 22:15:36.988767 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-ovsdbserver-sb\") pod \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " Dec 05 22:15:36 crc kubenswrapper[4747]: I1205 22:15:36.988806 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwcft\" (UniqueName: \"kubernetes.io/projected/b15d3c51-cae8-4777-b7b9-5bc60f05464c-kube-api-access-xwcft\") pod \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " Dec 05 22:15:36 crc kubenswrapper[4747]: I1205 22:15:36.988836 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-dns-svc\") pod \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " Dec 05 22:15:36 crc kubenswrapper[4747]: I1205 22:15:36.988899 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-config\") pod \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\" (UID: \"b15d3c51-cae8-4777-b7b9-5bc60f05464c\") " Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.014294 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b15d3c51-cae8-4777-b7b9-5bc60f05464c-kube-api-access-xwcft" (OuterVolumeSpecName: "kube-api-access-xwcft") pod "b15d3c51-cae8-4777-b7b9-5bc60f05464c" (UID: "b15d3c51-cae8-4777-b7b9-5bc60f05464c"). InnerVolumeSpecName "kube-api-access-xwcft". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.035796 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b15d3c51-cae8-4777-b7b9-5bc60f05464c" (UID: "b15d3c51-cae8-4777-b7b9-5bc60f05464c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.039390 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-config" (OuterVolumeSpecName: "config") pod "b15d3c51-cae8-4777-b7b9-5bc60f05464c" (UID: "b15d3c51-cae8-4777-b7b9-5bc60f05464c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.048139 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b15d3c51-cae8-4777-b7b9-5bc60f05464c" (UID: "b15d3c51-cae8-4777-b7b9-5bc60f05464c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.061778 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b15d3c51-cae8-4777-b7b9-5bc60f05464c" (UID: "b15d3c51-cae8-4777-b7b9-5bc60f05464c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.091371 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.091404 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwcft\" (UniqueName: \"kubernetes.io/projected/b15d3c51-cae8-4777-b7b9-5bc60f05464c-kube-api-access-xwcft\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.091416 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.091424 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.091432 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b15d3c51-cae8-4777-b7b9-5bc60f05464c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.616068 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" event={"ID":"b15d3c51-cae8-4777-b7b9-5bc60f05464c","Type":"ContainerDied","Data":"4247cba22a8d54a21f5f9e05d706fceae1b13a5a5a2f5baeb48bb309f555bacb"} Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.616158 4747 scope.go:117] "RemoveContainer" containerID="6abb455833bd4eea7b3d65dcec40a2c0f876caacbc21d99dbe49f17ffb2e2921" Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.616170 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b95ffff5-lb4gm" Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.661805 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b95ffff5-lb4gm"] Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.672651 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b95ffff5-lb4gm"] Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.674704 4747 scope.go:117] "RemoveContainer" containerID="7b9f54ed1200f635e86f9401470c6725b9b9f92e2ca4bcb7fa63b08e31476cd2" Dec 05 22:15:37 crc kubenswrapper[4747]: I1205 22:15:37.853128 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b15d3c51-cae8-4777-b7b9-5bc60f05464c" path="/var/lib/kubelet/pods/b15d3c51-cae8-4777-b7b9-5bc60f05464c/volumes" Dec 05 22:15:38 crc kubenswrapper[4747]: I1205 22:15:38.431737 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:38 crc kubenswrapper[4747]: I1205 22:15:38.432658 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:40 crc kubenswrapper[4747]: I1205 22:15:40.852503 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:40 crc kubenswrapper[4747]: I1205 22:15:40.855318 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6f55995b6-9n2p4" Dec 05 22:15:41 crc kubenswrapper[4747]: I1205 22:15:41.028750 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-ff9f848f4-27v79"] Dec 05 22:15:41 crc kubenswrapper[4747]: I1205 22:15:41.029315 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-ff9f848f4-27v79" podUID="7a82fc9a-86ce-49b7-880d-a8418fb38073" containerName="proxy-httpd" containerID="cri-o://6f75a61b7a8533896d18a6b224f925dfb5664c74feb9cba8742d4b509af00ca6" gracePeriod=30 Dec 05 22:15:41 crc kubenswrapper[4747]: I1205 22:15:41.029868 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-ff9f848f4-27v79" podUID="7a82fc9a-86ce-49b7-880d-a8418fb38073" containerName="proxy-server" containerID="cri-o://de2b5c31e1ee0cfa87a3b084b5f8d0031c75d8efb9c25d9cd7269a593240b0fe" gracePeriod=30 Dec 05 22:15:41 crc kubenswrapper[4747]: I1205 22:15:41.684719 4747 generic.go:334] "Generic (PLEG): container finished" podID="7a82fc9a-86ce-49b7-880d-a8418fb38073" containerID="de2b5c31e1ee0cfa87a3b084b5f8d0031c75d8efb9c25d9cd7269a593240b0fe" exitCode=0 Dec 05 22:15:41 crc kubenswrapper[4747]: I1205 22:15:41.685105 4747 generic.go:334] "Generic (PLEG): container finished" podID="7a82fc9a-86ce-49b7-880d-a8418fb38073" containerID="6f75a61b7a8533896d18a6b224f925dfb5664c74feb9cba8742d4b509af00ca6" exitCode=0 Dec 05 22:15:41 crc kubenswrapper[4747]: I1205 22:15:41.684780 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-ff9f848f4-27v79" event={"ID":"7a82fc9a-86ce-49b7-880d-a8418fb38073","Type":"ContainerDied","Data":"de2b5c31e1ee0cfa87a3b084b5f8d0031c75d8efb9c25d9cd7269a593240b0fe"} Dec 05 22:15:41 crc kubenswrapper[4747]: I1205 22:15:41.685308 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-ff9f848f4-27v79" 
event={"ID":"7a82fc9a-86ce-49b7-880d-a8418fb38073","Type":"ContainerDied","Data":"6f75a61b7a8533896d18a6b224f925dfb5664c74feb9cba8742d4b509af00ca6"} Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.059623 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.196973 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncc9c\" (UniqueName: \"kubernetes.io/projected/7a82fc9a-86ce-49b7-880d-a8418fb38073-kube-api-access-ncc9c\") pod \"7a82fc9a-86ce-49b7-880d-a8418fb38073\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.197154 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a82fc9a-86ce-49b7-880d-a8418fb38073-config-data\") pod \"7a82fc9a-86ce-49b7-880d-a8418fb38073\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.197187 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a82fc9a-86ce-49b7-880d-a8418fb38073-log-httpd\") pod \"7a82fc9a-86ce-49b7-880d-a8418fb38073\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.197264 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a82fc9a-86ce-49b7-880d-a8418fb38073-run-httpd\") pod \"7a82fc9a-86ce-49b7-880d-a8418fb38073\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.197538 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a82fc9a-86ce-49b7-880d-a8418fb38073-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7a82fc9a-86ce-49b7-880d-a8418fb38073" (UID: "7a82fc9a-86ce-49b7-880d-a8418fb38073"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.197611 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a82fc9a-86ce-49b7-880d-a8418fb38073-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7a82fc9a-86ce-49b7-880d-a8418fb38073" (UID: "7a82fc9a-86ce-49b7-880d-a8418fb38073"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.197696 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7a82fc9a-86ce-49b7-880d-a8418fb38073-etc-swift\") pod \"7a82fc9a-86ce-49b7-880d-a8418fb38073\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.198071 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a82fc9a-86ce-49b7-880d-a8418fb38073-combined-ca-bundle\") pod \"7a82fc9a-86ce-49b7-880d-a8418fb38073\" (UID: \"7a82fc9a-86ce-49b7-880d-a8418fb38073\") " Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.198534 4747 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a82fc9a-86ce-49b7-880d-a8418fb38073-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.198556 4747 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a82fc9a-86ce-49b7-880d-a8418fb38073-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.203740 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a82fc9a-86ce-49b7-880d-a8418fb38073-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7a82fc9a-86ce-49b7-880d-a8418fb38073" (UID: "7a82fc9a-86ce-49b7-880d-a8418fb38073"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.212930 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a82fc9a-86ce-49b7-880d-a8418fb38073-kube-api-access-ncc9c" (OuterVolumeSpecName: "kube-api-access-ncc9c") pod "7a82fc9a-86ce-49b7-880d-a8418fb38073" (UID: "7a82fc9a-86ce-49b7-880d-a8418fb38073"). InnerVolumeSpecName "kube-api-access-ncc9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.244474 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a82fc9a-86ce-49b7-880d-a8418fb38073-config-data" (OuterVolumeSpecName: "config-data") pod "7a82fc9a-86ce-49b7-880d-a8418fb38073" (UID: "7a82fc9a-86ce-49b7-880d-a8418fb38073"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.257805 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a82fc9a-86ce-49b7-880d-a8418fb38073-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a82fc9a-86ce-49b7-880d-a8418fb38073" (UID: "7a82fc9a-86ce-49b7-880d-a8418fb38073"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.300435 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a82fc9a-86ce-49b7-880d-a8418fb38073-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.300852 4747 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7a82fc9a-86ce-49b7-880d-a8418fb38073-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.300868 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a82fc9a-86ce-49b7-880d-a8418fb38073-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.300883 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncc9c\" (UniqueName: \"kubernetes.io/projected/7a82fc9a-86ce-49b7-880d-a8418fb38073-kube-api-access-ncc9c\") on node \"crc\" DevicePath \"\"" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.696241 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-ff9f848f4-27v79" event={"ID":"7a82fc9a-86ce-49b7-880d-a8418fb38073","Type":"ContainerDied","Data":"66178be02bb978833db6e28db0a965463c50803bce3936ffadaf8d4f55d8b83d"} Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.696287 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-ff9f848f4-27v79" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.696312 4747 scope.go:117] "RemoveContainer" containerID="de2b5c31e1ee0cfa87a3b084b5f8d0031c75d8efb9c25d9cd7269a593240b0fe" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.721979 4747 scope.go:117] "RemoveContainer" containerID="6f75a61b7a8533896d18a6b224f925dfb5664c74feb9cba8742d4b509af00ca6" Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.739040 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-ff9f848f4-27v79"] Dec 05 22:15:42 crc kubenswrapper[4747]: I1205 22:15:42.750082 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-ff9f848f4-27v79"] Dec 05 22:15:43 crc kubenswrapper[4747]: I1205 22:15:43.854133 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a82fc9a-86ce-49b7-880d-a8418fb38073" path="/var/lib/kubelet/pods/7a82fc9a-86ce-49b7-880d-a8418fb38073/volumes" Dec 05 22:16:06 crc kubenswrapper[4747]: I1205 22:16:06.222160 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:16:06 crc kubenswrapper[4747]: I1205 22:16:06.223067 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.093105 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hz47b"] Dec 05 22:16:13 crc kubenswrapper[4747]: E1205 22:16:13.094086 4747 
Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.093105 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hz47b"]
Dec 05 22:16:13 crc kubenswrapper[4747]: E1205 22:16:13.094086 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15d3c51-cae8-4777-b7b9-5bc60f05464c" containerName="init"
Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.094102 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15d3c51-cae8-4777-b7b9-5bc60f05464c" containerName="init"
Dec 05 22:16:13 crc kubenswrapper[4747]: E1205 22:16:13.094118 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b" containerName="swift-ring-rebalance"
Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.094128 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b" containerName="swift-ring-rebalance"
Dec 05 22:16:13 crc kubenswrapper[4747]: E1205 22:16:13.094145 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15d3c51-cae8-4777-b7b9-5bc60f05464c" containerName="dnsmasq-dns"
Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.094154 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15d3c51-cae8-4777-b7b9-5bc60f05464c" containerName="dnsmasq-dns"
Dec 05 22:16:13 crc kubenswrapper[4747]: E1205 22:16:13.094174 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a82fc9a-86ce-49b7-880d-a8418fb38073" containerName="proxy-server"
Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.094207 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a82fc9a-86ce-49b7-880d-a8418fb38073" containerName="proxy-server"
Dec 05 22:16:13 crc kubenswrapper[4747]: E1205 22:16:13.094224 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a82fc9a-86ce-49b7-880d-a8418fb38073" containerName="proxy-httpd"
Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.094232 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a82fc9a-86ce-49b7-880d-a8418fb38073" containerName="proxy-httpd"
Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.095645 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b" containerName="swift-ring-rebalance"
Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.095693 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a82fc9a-86ce-49b7-880d-a8418fb38073" containerName="proxy-httpd"
Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.095706 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15d3c51-cae8-4777-b7b9-5bc60f05464c" containerName="dnsmasq-dns"
Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.095723 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a82fc9a-86ce-49b7-880d-a8418fb38073" containerName="proxy-server"
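
Despite the E (error) severity on the cpu_manager lines, this block reads as routine cleanup: the CPU and memory managers keep per-container resource assignments keyed by pod UID and container name, and here they purge entries left behind by the pods torn down earlier (the dnsmasq-dns, swift-ring-rebalance, and swift-proxy pods). A minimal sketch of that purge, with illustrative names and shortened UIDs (not kubelet code):

    package main

    import "fmt"

    // containerKey mirrors how the log identifies stale state: pod UID
    // plus container name.
    type containerKey struct {
        PodUID, Container string
    }

    // removeStaleState drops assignments whose owning pod is no longer
    // active, mirroring the RemoveStaleState entries above.
    func removeStaleState(assignments map[containerKey][]int, activePods map[string]bool) {
        for key := range assignments {
            if !activePods[key.PodUID] {
                fmt.Printf("RemoveStaleState: removing container %s/%s\n", key.PodUID, key.Container)
                delete(assignments, key)
            }
        }
    }

    func main() {
        assignments := map[containerKey][]int{
            {PodUID: "b15d3c51", Container: "dnsmasq-dns"}:  {0, 1}, // CPUs once assigned
            {PodUID: "7a82fc9a", Container: "proxy-server"}: {2},
        }
        removeStaleState(assignments, map[string]bool{}) // both pods are gone
    }
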
Need to start a new one" pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.129065 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hz47b"] Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.240833 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drvsf\" (UniqueName: \"kubernetes.io/projected/0959e0ce-cd05-437a-a4b0-41606a7ab425-kube-api-access-drvsf\") pod \"certified-operators-hz47b\" (UID: \"0959e0ce-cd05-437a-a4b0-41606a7ab425\") " pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.240995 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0959e0ce-cd05-437a-a4b0-41606a7ab425-utilities\") pod \"certified-operators-hz47b\" (UID: \"0959e0ce-cd05-437a-a4b0-41606a7ab425\") " pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.241046 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0959e0ce-cd05-437a-a4b0-41606a7ab425-catalog-content\") pod \"certified-operators-hz47b\" (UID: \"0959e0ce-cd05-437a-a4b0-41606a7ab425\") " pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.342987 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0959e0ce-cd05-437a-a4b0-41606a7ab425-utilities\") pod \"certified-operators-hz47b\" (UID: \"0959e0ce-cd05-437a-a4b0-41606a7ab425\") " pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.343332 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0959e0ce-cd05-437a-a4b0-41606a7ab425-catalog-content\") pod \"certified-operators-hz47b\" (UID: \"0959e0ce-cd05-437a-a4b0-41606a7ab425\") " pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.343507 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0959e0ce-cd05-437a-a4b0-41606a7ab425-utilities\") pod \"certified-operators-hz47b\" (UID: \"0959e0ce-cd05-437a-a4b0-41606a7ab425\") " pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.343674 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drvsf\" (UniqueName: \"kubernetes.io/projected/0959e0ce-cd05-437a-a4b0-41606a7ab425-kube-api-access-drvsf\") pod \"certified-operators-hz47b\" (UID: \"0959e0ce-cd05-437a-a4b0-41606a7ab425\") " pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.343770 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0959e0ce-cd05-437a-a4b0-41606a7ab425-catalog-content\") pod \"certified-operators-hz47b\" (UID: \"0959e0ce-cd05-437a-a4b0-41606a7ab425\") " pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.381943 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-drvsf\" (UniqueName: \"kubernetes.io/projected/0959e0ce-cd05-437a-a4b0-41606a7ab425-kube-api-access-drvsf\") pod \"certified-operators-hz47b\" (UID: \"0959e0ce-cd05-437a-a4b0-41606a7ab425\") " pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.438975 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.757819 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-25clt"] Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.769017 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-25clt" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.821361 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-25clt"] Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.863665 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ea45-account-create-update-bbll6"] Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.864962 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ea45-account-create-update-bbll6"] Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.864959 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/614a026c-be03-4120-9d35-b2f8e097a0ae-operator-scripts\") pod \"cinder-db-create-25clt\" (UID: \"614a026c-be03-4120-9d35-b2f8e097a0ae\") " pod="openstack/cinder-db-create-25clt" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.865054 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ea45-account-create-update-bbll6" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.865285 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbbbf\" (UniqueName: \"kubernetes.io/projected/614a026c-be03-4120-9d35-b2f8e097a0ae-kube-api-access-pbbbf\") pod \"cinder-db-create-25clt\" (UID: \"614a026c-be03-4120-9d35-b2f8e097a0ae\") " pod="openstack/cinder-db-create-25clt" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.885261 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.936408 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hz47b"] Dec 05 22:16:13 crc kubenswrapper[4747]: W1205 22:16:13.940026 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0959e0ce_cd05_437a_a4b0_41606a7ab425.slice/crio-18631978c77a9562aec4f7f76e49b4e7a9bc9a79cff913cfc167a3014ab47008 WatchSource:0}: Error finding container 18631978c77a9562aec4f7f76e49b4e7a9bc9a79cff913cfc167a3014ab47008: Status 404 returned error can't find the container with id 18631978c77a9562aec4f7f76e49b4e7a9bc9a79cff913cfc167a3014ab47008 Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.969119 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nthfx\" (UniqueName: \"kubernetes.io/projected/b6e06227-0e9f-46ce-92dc-2b780fcb561f-kube-api-access-nthfx\") pod \"cinder-ea45-account-create-update-bbll6\" (UID: \"b6e06227-0e9f-46ce-92dc-2b780fcb561f\") " pod="openstack/cinder-ea45-account-create-update-bbll6" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.969183 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e06227-0e9f-46ce-92dc-2b780fcb561f-operator-scripts\") pod \"cinder-ea45-account-create-update-bbll6\" (UID: \"b6e06227-0e9f-46ce-92dc-2b780fcb561f\") " pod="openstack/cinder-ea45-account-create-update-bbll6" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.969260 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/614a026c-be03-4120-9d35-b2f8e097a0ae-operator-scripts\") pod \"cinder-db-create-25clt\" (UID: \"614a026c-be03-4120-9d35-b2f8e097a0ae\") " pod="openstack/cinder-db-create-25clt" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.969352 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbbbf\" (UniqueName: \"kubernetes.io/projected/614a026c-be03-4120-9d35-b2f8e097a0ae-kube-api-access-pbbbf\") pod \"cinder-db-create-25clt\" (UID: \"614a026c-be03-4120-9d35-b2f8e097a0ae\") " pod="openstack/cinder-db-create-25clt" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.971079 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/614a026c-be03-4120-9d35-b2f8e097a0ae-operator-scripts\") pod \"cinder-db-create-25clt\" (UID: \"614a026c-be03-4120-9d35-b2f8e097a0ae\") " pod="openstack/cinder-db-create-25clt" Dec 05 22:16:13 crc kubenswrapper[4747]: I1205 22:16:13.991037 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbbbf\" 
(UniqueName: \"kubernetes.io/projected/614a026c-be03-4120-9d35-b2f8e097a0ae-kube-api-access-pbbbf\") pod \"cinder-db-create-25clt\" (UID: \"614a026c-be03-4120-9d35-b2f8e097a0ae\") " pod="openstack/cinder-db-create-25clt" Dec 05 22:16:14 crc kubenswrapper[4747]: I1205 22:16:14.057240 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hz47b" event={"ID":"0959e0ce-cd05-437a-a4b0-41606a7ab425","Type":"ContainerStarted","Data":"18631978c77a9562aec4f7f76e49b4e7a9bc9a79cff913cfc167a3014ab47008"} Dec 05 22:16:14 crc kubenswrapper[4747]: I1205 22:16:14.071517 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nthfx\" (UniqueName: \"kubernetes.io/projected/b6e06227-0e9f-46ce-92dc-2b780fcb561f-kube-api-access-nthfx\") pod \"cinder-ea45-account-create-update-bbll6\" (UID: \"b6e06227-0e9f-46ce-92dc-2b780fcb561f\") " pod="openstack/cinder-ea45-account-create-update-bbll6" Dec 05 22:16:14 crc kubenswrapper[4747]: I1205 22:16:14.071554 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e06227-0e9f-46ce-92dc-2b780fcb561f-operator-scripts\") pod \"cinder-ea45-account-create-update-bbll6\" (UID: \"b6e06227-0e9f-46ce-92dc-2b780fcb561f\") " pod="openstack/cinder-ea45-account-create-update-bbll6" Dec 05 22:16:14 crc kubenswrapper[4747]: I1205 22:16:14.072243 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e06227-0e9f-46ce-92dc-2b780fcb561f-operator-scripts\") pod \"cinder-ea45-account-create-update-bbll6\" (UID: \"b6e06227-0e9f-46ce-92dc-2b780fcb561f\") " pod="openstack/cinder-ea45-account-create-update-bbll6" Dec 05 22:16:14 crc kubenswrapper[4747]: I1205 22:16:14.094022 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nthfx\" (UniqueName: \"kubernetes.io/projected/b6e06227-0e9f-46ce-92dc-2b780fcb561f-kube-api-access-nthfx\") pod \"cinder-ea45-account-create-update-bbll6\" (UID: \"b6e06227-0e9f-46ce-92dc-2b780fcb561f\") " pod="openstack/cinder-ea45-account-create-update-bbll6" Dec 05 22:16:14 crc kubenswrapper[4747]: I1205 22:16:14.102380 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-25clt" Dec 05 22:16:14 crc kubenswrapper[4747]: I1205 22:16:14.207487 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ea45-account-create-update-bbll6" Dec 05 22:16:14 crc kubenswrapper[4747]: I1205 22:16:14.604744 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-25clt"] Dec 05 22:16:14 crc kubenswrapper[4747]: W1205 22:16:14.613120 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod614a026c_be03_4120_9d35_b2f8e097a0ae.slice/crio-71f6235894ecbec38f63a1259d8223add49084af970f5830ee97b24ba34f6510 WatchSource:0}: Error finding container 71f6235894ecbec38f63a1259d8223add49084af970f5830ee97b24ba34f6510: Status 404 returned error can't find the container with id 71f6235894ecbec38f63a1259d8223add49084af970f5830ee97b24ba34f6510 Dec 05 22:16:14 crc kubenswrapper[4747]: I1205 22:16:14.718291 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ea45-account-create-update-bbll6"] Dec 05 22:16:15 crc kubenswrapper[4747]: I1205 22:16:15.067911 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-25clt" event={"ID":"614a026c-be03-4120-9d35-b2f8e097a0ae","Type":"ContainerStarted","Data":"2568597de6c1c8f1d8cd23730cfa1cb1ecc2f87dd71da4df836ede7f8cd035ff"} Dec 05 22:16:15 crc kubenswrapper[4747]: I1205 22:16:15.067959 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-25clt" event={"ID":"614a026c-be03-4120-9d35-b2f8e097a0ae","Type":"ContainerStarted","Data":"71f6235894ecbec38f63a1259d8223add49084af970f5830ee97b24ba34f6510"} Dec 05 22:16:15 crc kubenswrapper[4747]: I1205 22:16:15.069930 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ea45-account-create-update-bbll6" event={"ID":"b6e06227-0e9f-46ce-92dc-2b780fcb561f","Type":"ContainerStarted","Data":"f2be96d521d4474b73bc2ef9d96baa4b8248491bf39497017408c7a4715fd610"} Dec 05 22:16:15 crc kubenswrapper[4747]: I1205 22:16:15.069973 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ea45-account-create-update-bbll6" event={"ID":"b6e06227-0e9f-46ce-92dc-2b780fcb561f","Type":"ContainerStarted","Data":"ab423a3fc53002040c92f8e22b194846e9cf43a6032bbb9e952ad6766f6dca8d"} Dec 05 22:16:15 crc kubenswrapper[4747]: I1205 22:16:15.072122 4747 generic.go:334] "Generic (PLEG): container finished" podID="0959e0ce-cd05-437a-a4b0-41606a7ab425" containerID="6c6f2fcdf430972f73ffee50b0a993ec932b8237b5752f22ae6e9310297253f2" exitCode=0 Dec 05 22:16:15 crc kubenswrapper[4747]: I1205 22:16:15.072172 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hz47b" event={"ID":"0959e0ce-cd05-437a-a4b0-41606a7ab425","Type":"ContainerDied","Data":"6c6f2fcdf430972f73ffee50b0a993ec932b8237b5752f22ae6e9310297253f2"} Dec 05 22:16:15 crc kubenswrapper[4747]: I1205 22:16:15.074680 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 22:16:15 crc kubenswrapper[4747]: I1205 22:16:15.091001 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-25clt" podStartSLOduration=2.090983919 podStartE2EDuration="2.090983919s" podCreationTimestamp="2025-12-05 22:16:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:16:15.090571799 +0000 UTC m=+5645.557879317" watchObservedRunningTime="2025-12-05 22:16:15.090983919 +0000 UTC m=+5645.558291407" Dec 05 22:16:15 crc 
Dec 05 22:16:15 crc kubenswrapper[4747]: I1205 22:16:15.122446 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-ea45-account-create-update-bbll6" podStartSLOduration=2.122417048 podStartE2EDuration="2.122417048s" podCreationTimestamp="2025-12-05 22:16:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:16:15.107467507 +0000 UTC m=+5645.574775035" watchObservedRunningTime="2025-12-05 22:16:15.122417048 +0000 UTC m=+5645.589724576"
Dec 05 22:16:16 crc kubenswrapper[4747]: I1205 22:16:16.085455 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hz47b" event={"ID":"0959e0ce-cd05-437a-a4b0-41606a7ab425","Type":"ContainerStarted","Data":"0c67b33c91891419e9e37d2d0954e128a80a762f352db4a3a1083ffbf5d884cc"}
Dec 05 22:16:16 crc kubenswrapper[4747]: I1205 22:16:16.087226 4747 generic.go:334] "Generic (PLEG): container finished" podID="614a026c-be03-4120-9d35-b2f8e097a0ae" containerID="2568597de6c1c8f1d8cd23730cfa1cb1ecc2f87dd71da4df836ede7f8cd035ff" exitCode=0
Dec 05 22:16:16 crc kubenswrapper[4747]: I1205 22:16:16.087353 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-25clt" event={"ID":"614a026c-be03-4120-9d35-b2f8e097a0ae","Type":"ContainerDied","Data":"2568597de6c1c8f1d8cd23730cfa1cb1ecc2f87dd71da4df836ede7f8cd035ff"}
Dec 05 22:16:16 crc kubenswrapper[4747]: I1205 22:16:16.090019 4747 generic.go:334] "Generic (PLEG): container finished" podID="b6e06227-0e9f-46ce-92dc-2b780fcb561f" containerID="f2be96d521d4474b73bc2ef9d96baa4b8248491bf39497017408c7a4715fd610" exitCode=0
Dec 05 22:16:16 crc kubenswrapper[4747]: I1205 22:16:16.090071 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ea45-account-create-update-bbll6" event={"ID":"b6e06227-0e9f-46ce-92dc-2b780fcb561f","Type":"ContainerDied","Data":"f2be96d521d4474b73bc2ef9d96baa4b8248491bf39497017408c7a4715fd610"}
Dec 05 22:16:16 crc kubenswrapper[4747]: I1205 22:16:16.859676 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x6dtk"]
Dec 05 22:16:16 crc kubenswrapper[4747]: I1205 22:16:16.862518 4747 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:16 crc kubenswrapper[4747]: I1205 22:16:16.916496 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6dtk"] Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.041456 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-catalog-content\") pod \"redhat-operators-x6dtk\" (UID: \"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed\") " pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.042105 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkndr\" (UniqueName: \"kubernetes.io/projected/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-kube-api-access-nkndr\") pod \"redhat-operators-x6dtk\" (UID: \"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed\") " pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.042239 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-utilities\") pod \"redhat-operators-x6dtk\" (UID: \"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed\") " pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.100757 4747 generic.go:334] "Generic (PLEG): container finished" podID="0959e0ce-cd05-437a-a4b0-41606a7ab425" containerID="0c67b33c91891419e9e37d2d0954e128a80a762f352db4a3a1083ffbf5d884cc" exitCode=0 Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.100889 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hz47b" event={"ID":"0959e0ce-cd05-437a-a4b0-41606a7ab425","Type":"ContainerDied","Data":"0c67b33c91891419e9e37d2d0954e128a80a762f352db4a3a1083ffbf5d884cc"} Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.144161 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-catalog-content\") pod \"redhat-operators-x6dtk\" (UID: \"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed\") " pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.144229 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkndr\" (UniqueName: \"kubernetes.io/projected/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-kube-api-access-nkndr\") pod \"redhat-operators-x6dtk\" (UID: \"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed\") " pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.144268 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-utilities\") pod \"redhat-operators-x6dtk\" (UID: \"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed\") " pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.144895 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-utilities\") pod \"redhat-operators-x6dtk\" (UID: \"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed\") " 
pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.145354 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-catalog-content\") pod \"redhat-operators-x6dtk\" (UID: \"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed\") " pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.171502 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkndr\" (UniqueName: \"kubernetes.io/projected/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-kube-api-access-nkndr\") pod \"redhat-operators-x6dtk\" (UID: \"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed\") " pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.204698 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.519249 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ea45-account-create-update-bbll6" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.525405 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-25clt" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.657432 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/614a026c-be03-4120-9d35-b2f8e097a0ae-operator-scripts\") pod \"614a026c-be03-4120-9d35-b2f8e097a0ae\" (UID: \"614a026c-be03-4120-9d35-b2f8e097a0ae\") " Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.657467 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e06227-0e9f-46ce-92dc-2b780fcb561f-operator-scripts\") pod \"b6e06227-0e9f-46ce-92dc-2b780fcb561f\" (UID: \"b6e06227-0e9f-46ce-92dc-2b780fcb561f\") " Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.657501 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbbbf\" (UniqueName: \"kubernetes.io/projected/614a026c-be03-4120-9d35-b2f8e097a0ae-kube-api-access-pbbbf\") pod \"614a026c-be03-4120-9d35-b2f8e097a0ae\" (UID: \"614a026c-be03-4120-9d35-b2f8e097a0ae\") " Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.657552 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nthfx\" (UniqueName: \"kubernetes.io/projected/b6e06227-0e9f-46ce-92dc-2b780fcb561f-kube-api-access-nthfx\") pod \"b6e06227-0e9f-46ce-92dc-2b780fcb561f\" (UID: \"b6e06227-0e9f-46ce-92dc-2b780fcb561f\") " Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.658901 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/614a026c-be03-4120-9d35-b2f8e097a0ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "614a026c-be03-4120-9d35-b2f8e097a0ae" (UID: "614a026c-be03-4120-9d35-b2f8e097a0ae"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.659074 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6e06227-0e9f-46ce-92dc-2b780fcb561f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6e06227-0e9f-46ce-92dc-2b780fcb561f" (UID: "b6e06227-0e9f-46ce-92dc-2b780fcb561f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.662476 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/614a026c-be03-4120-9d35-b2f8e097a0ae-kube-api-access-pbbbf" (OuterVolumeSpecName: "kube-api-access-pbbbf") pod "614a026c-be03-4120-9d35-b2f8e097a0ae" (UID: "614a026c-be03-4120-9d35-b2f8e097a0ae"). InnerVolumeSpecName "kube-api-access-pbbbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.663717 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e06227-0e9f-46ce-92dc-2b780fcb561f-kube-api-access-nthfx" (OuterVolumeSpecName: "kube-api-access-nthfx") pod "b6e06227-0e9f-46ce-92dc-2b780fcb561f" (UID: "b6e06227-0e9f-46ce-92dc-2b780fcb561f"). InnerVolumeSpecName "kube-api-access-nthfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.759789 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbbbf\" (UniqueName: \"kubernetes.io/projected/614a026c-be03-4120-9d35-b2f8e097a0ae-kube-api-access-pbbbf\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.759825 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nthfx\" (UniqueName: \"kubernetes.io/projected/b6e06227-0e9f-46ce-92dc-2b780fcb561f-kube-api-access-nthfx\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.759838 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/614a026c-be03-4120-9d35-b2f8e097a0ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.759850 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e06227-0e9f-46ce-92dc-2b780fcb561f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:17 crc kubenswrapper[4747]: I1205 22:16:17.800873 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6dtk"] Dec 05 22:16:17 crc kubenswrapper[4747]: W1205 22:16:17.808812 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f182d6d_87a9_4787_8af1_b4e5a6efc2ed.slice/crio-347153088efcc97972b1b1fe071a401265e53b6241a56c829961b0c4cc7794ea WatchSource:0}: Error finding container 347153088efcc97972b1b1fe071a401265e53b6241a56c829961b0c4cc7794ea: Status 404 returned error can't find the container with id 347153088efcc97972b1b1fe071a401265e53b6241a56c829961b0c4cc7794ea Dec 05 22:16:18 crc kubenswrapper[4747]: I1205 22:16:18.109241 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ea45-account-create-update-bbll6" 
event={"ID":"b6e06227-0e9f-46ce-92dc-2b780fcb561f","Type":"ContainerDied","Data":"ab423a3fc53002040c92f8e22b194846e9cf43a6032bbb9e952ad6766f6dca8d"} Dec 05 22:16:18 crc kubenswrapper[4747]: I1205 22:16:18.109280 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab423a3fc53002040c92f8e22b194846e9cf43a6032bbb9e952ad6766f6dca8d" Dec 05 22:16:18 crc kubenswrapper[4747]: I1205 22:16:18.109292 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ea45-account-create-update-bbll6" Dec 05 22:16:18 crc kubenswrapper[4747]: I1205 22:16:18.111310 4747 generic.go:334] "Generic (PLEG): container finished" podID="4f182d6d-87a9-4787-8af1-b4e5a6efc2ed" containerID="6a9426e2b2ec833c8db09ea5070dae47be381a9bd1d97590d8cbc8cb563a6be9" exitCode=0 Dec 05 22:16:18 crc kubenswrapper[4747]: I1205 22:16:18.111390 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6dtk" event={"ID":"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed","Type":"ContainerDied","Data":"6a9426e2b2ec833c8db09ea5070dae47be381a9bd1d97590d8cbc8cb563a6be9"} Dec 05 22:16:18 crc kubenswrapper[4747]: I1205 22:16:18.111421 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6dtk" event={"ID":"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed","Type":"ContainerStarted","Data":"347153088efcc97972b1b1fe071a401265e53b6241a56c829961b0c4cc7794ea"} Dec 05 22:16:18 crc kubenswrapper[4747]: I1205 22:16:18.114784 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hz47b" event={"ID":"0959e0ce-cd05-437a-a4b0-41606a7ab425","Type":"ContainerStarted","Data":"a9fd54f026362731b4c467a2d80329295b3b057058310d8a03a10df125ae5bfb"} Dec 05 22:16:18 crc kubenswrapper[4747]: I1205 22:16:18.116391 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-25clt" event={"ID":"614a026c-be03-4120-9d35-b2f8e097a0ae","Type":"ContainerDied","Data":"71f6235894ecbec38f63a1259d8223add49084af970f5830ee97b24ba34f6510"} Dec 05 22:16:18 crc kubenswrapper[4747]: I1205 22:16:18.116420 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71f6235894ecbec38f63a1259d8223add49084af970f5830ee97b24ba34f6510" Dec 05 22:16:18 crc kubenswrapper[4747]: I1205 22:16:18.116472 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-25clt" Dec 05 22:16:18 crc kubenswrapper[4747]: I1205 22:16:18.172233 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hz47b" podStartSLOduration=2.458498288 podStartE2EDuration="5.172216326s" podCreationTimestamp="2025-12-05 22:16:13 +0000 UTC" firstStartedPulling="2025-12-05 22:16:15.074181592 +0000 UTC m=+5645.541489120" lastFinishedPulling="2025-12-05 22:16:17.78789966 +0000 UTC m=+5648.255207158" observedRunningTime="2025-12-05 22:16:18.165207422 +0000 UTC m=+5648.632514910" watchObservedRunningTime="2025-12-05 22:16:18.172216326 +0000 UTC m=+5648.639523814" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.128239 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6dtk" event={"ID":"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed","Type":"ContainerStarted","Data":"144055ea0f6c405eb572615e7b9ac0f59b0f64aad4c026dce63181d295979bbb"} Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.159686 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-tnc6l"] Dec 05 22:16:19 crc kubenswrapper[4747]: E1205 22:16:19.160156 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e06227-0e9f-46ce-92dc-2b780fcb561f" containerName="mariadb-account-create-update" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.160181 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e06227-0e9f-46ce-92dc-2b780fcb561f" containerName="mariadb-account-create-update" Dec 05 22:16:19 crc kubenswrapper[4747]: E1205 22:16:19.160196 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="614a026c-be03-4120-9d35-b2f8e097a0ae" containerName="mariadb-database-create" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.160205 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="614a026c-be03-4120-9d35-b2f8e097a0ae" containerName="mariadb-database-create" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.160398 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="614a026c-be03-4120-9d35-b2f8e097a0ae" containerName="mariadb-database-create" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.160427 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e06227-0e9f-46ce-92dc-2b780fcb561f" containerName="mariadb-account-create-update" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.161111 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.163919 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.164801 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.165036 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wz7bv" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.190354 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tnc6l"] Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.285462 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-scripts\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.285569 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18e1af63-bbc5-4585-a776-d82e96aeeeb3-etc-machine-id\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.285856 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-combined-ca-bundle\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.286176 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-db-sync-config-data\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.286234 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59vs8\" (UniqueName: \"kubernetes.io/projected/18e1af63-bbc5-4585-a776-d82e96aeeeb3-kube-api-access-59vs8\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.286298 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-config-data\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.388501 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-scripts\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.388622 4747 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18e1af63-bbc5-4585-a776-d82e96aeeeb3-etc-machine-id\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.388673 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-combined-ca-bundle\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.388751 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-db-sync-config-data\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.388773 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59vs8\" (UniqueName: \"kubernetes.io/projected/18e1af63-bbc5-4585-a776-d82e96aeeeb3-kube-api-access-59vs8\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.388801 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-config-data\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.390479 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18e1af63-bbc5-4585-a776-d82e96aeeeb3-etc-machine-id\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.396617 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-combined-ca-bundle\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.403742 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-config-data\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.405403 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-db-sync-config-data\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.408688 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-scripts\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " 
pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.415059 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59vs8\" (UniqueName: \"kubernetes.io/projected/18e1af63-bbc5-4585-a776-d82e96aeeeb3-kube-api-access-59vs8\") pod \"cinder-db-sync-tnc6l\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.481006 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:19 crc kubenswrapper[4747]: I1205 22:16:19.972659 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tnc6l"] Dec 05 22:16:20 crc kubenswrapper[4747]: I1205 22:16:20.139453 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tnc6l" event={"ID":"18e1af63-bbc5-4585-a776-d82e96aeeeb3","Type":"ContainerStarted","Data":"abf7e14cf24aa8e73e945b42ba622714a74a3088126c4308a54b5ff85a8caec3"} Dec 05 22:16:20 crc kubenswrapper[4747]: I1205 22:16:20.141886 4747 generic.go:334] "Generic (PLEG): container finished" podID="4f182d6d-87a9-4787-8af1-b4e5a6efc2ed" containerID="144055ea0f6c405eb572615e7b9ac0f59b0f64aad4c026dce63181d295979bbb" exitCode=0 Dec 05 22:16:20 crc kubenswrapper[4747]: I1205 22:16:20.141951 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6dtk" event={"ID":"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed","Type":"ContainerDied","Data":"144055ea0f6c405eb572615e7b9ac0f59b0f64aad4c026dce63181d295979bbb"} Dec 05 22:16:21 crc kubenswrapper[4747]: I1205 22:16:21.150898 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6dtk" event={"ID":"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed","Type":"ContainerStarted","Data":"8a1ecbb35c301b65826c8f788dcb08c96a26c7d35ff01049183ddb491abc5dca"} Dec 05 22:16:21 crc kubenswrapper[4747]: I1205 22:16:21.155076 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tnc6l" event={"ID":"18e1af63-bbc5-4585-a776-d82e96aeeeb3","Type":"ContainerStarted","Data":"14ac20e366a5fb83c294d10c95bfbc68cd3d0dd1164f5f8755ababbf12c25114"} Dec 05 22:16:21 crc kubenswrapper[4747]: I1205 22:16:21.173900 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x6dtk" podStartSLOduration=2.726511893 podStartE2EDuration="5.173871799s" podCreationTimestamp="2025-12-05 22:16:16 +0000 UTC" firstStartedPulling="2025-12-05 22:16:18.113462759 +0000 UTC m=+5648.580770247" lastFinishedPulling="2025-12-05 22:16:20.560822665 +0000 UTC m=+5651.028130153" observedRunningTime="2025-12-05 22:16:21.172329911 +0000 UTC m=+5651.639637449" watchObservedRunningTime="2025-12-05 22:16:21.173871799 +0000 UTC m=+5651.641179327" Dec 05 22:16:21 crc kubenswrapper[4747]: I1205 22:16:21.205305 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-tnc6l" podStartSLOduration=2.205283157 podStartE2EDuration="2.205283157s" podCreationTimestamp="2025-12-05 22:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:16:21.187772713 +0000 UTC m=+5651.655080211" watchObservedRunningTime="2025-12-05 22:16:21.205283157 +0000 UTC m=+5651.672590655" Dec 05 22:16:23 crc kubenswrapper[4747]: I1205 22:16:23.439697 4747 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:23 crc kubenswrapper[4747]: I1205 22:16:23.440123 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:23 crc kubenswrapper[4747]: I1205 22:16:23.510564 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:24 crc kubenswrapper[4747]: I1205 22:16:24.197548 4747 generic.go:334] "Generic (PLEG): container finished" podID="18e1af63-bbc5-4585-a776-d82e96aeeeb3" containerID="14ac20e366a5fb83c294d10c95bfbc68cd3d0dd1164f5f8755ababbf12c25114" exitCode=0 Dec 05 22:16:24 crc kubenswrapper[4747]: I1205 22:16:24.197692 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tnc6l" event={"ID":"18e1af63-bbc5-4585-a776-d82e96aeeeb3","Type":"ContainerDied","Data":"14ac20e366a5fb83c294d10c95bfbc68cd3d0dd1164f5f8755ababbf12c25114"} Dec 05 22:16:24 crc kubenswrapper[4747]: I1205 22:16:24.259933 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:24 crc kubenswrapper[4747]: I1205 22:16:24.639789 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hz47b"] Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.524992 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.639231 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-scripts\") pod \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.639285 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59vs8\" (UniqueName: \"kubernetes.io/projected/18e1af63-bbc5-4585-a776-d82e96aeeeb3-kube-api-access-59vs8\") pod \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.639309 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-db-sync-config-data\") pod \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.639428 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18e1af63-bbc5-4585-a776-d82e96aeeeb3-etc-machine-id\") pod \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.639496 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-config-data\") pod \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.639514 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-combined-ca-bundle\") pod \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\" (UID: \"18e1af63-bbc5-4585-a776-d82e96aeeeb3\") " Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.639829 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18e1af63-bbc5-4585-a776-d82e96aeeeb3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "18e1af63-bbc5-4585-a776-d82e96aeeeb3" (UID: "18e1af63-bbc5-4585-a776-d82e96aeeeb3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.644710 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-scripts" (OuterVolumeSpecName: "scripts") pod "18e1af63-bbc5-4585-a776-d82e96aeeeb3" (UID: "18e1af63-bbc5-4585-a776-d82e96aeeeb3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.645003 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "18e1af63-bbc5-4585-a776-d82e96aeeeb3" (UID: "18e1af63-bbc5-4585-a776-d82e96aeeeb3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.646406 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e1af63-bbc5-4585-a776-d82e96aeeeb3-kube-api-access-59vs8" (OuterVolumeSpecName: "kube-api-access-59vs8") pod "18e1af63-bbc5-4585-a776-d82e96aeeeb3" (UID: "18e1af63-bbc5-4585-a776-d82e96aeeeb3"). InnerVolumeSpecName "kube-api-access-59vs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.688170 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18e1af63-bbc5-4585-a776-d82e96aeeeb3" (UID: "18e1af63-bbc5-4585-a776-d82e96aeeeb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.723359 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-config-data" (OuterVolumeSpecName: "config-data") pod "18e1af63-bbc5-4585-a776-d82e96aeeeb3" (UID: "18e1af63-bbc5-4585-a776-d82e96aeeeb3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.743099 4747 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18e1af63-bbc5-4585-a776-d82e96aeeeb3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.743149 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.743168 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.743183 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.743200 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59vs8\" (UniqueName: \"kubernetes.io/projected/18e1af63-bbc5-4585-a776-d82e96aeeeb3-kube-api-access-59vs8\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:25 crc kubenswrapper[4747]: I1205 22:16:25.743216 4747 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/18e1af63-bbc5-4585-a776-d82e96aeeeb3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.219807 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tnc6l" event={"ID":"18e1af63-bbc5-4585-a776-d82e96aeeeb3","Type":"ContainerDied","Data":"abf7e14cf24aa8e73e945b42ba622714a74a3088126c4308a54b5ff85a8caec3"} Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.219874 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abf7e14cf24aa8e73e945b42ba622714a74a3088126c4308a54b5ff85a8caec3" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.219834 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-tnc6l" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.219930 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hz47b" podUID="0959e0ce-cd05-437a-a4b0-41606a7ab425" containerName="registry-server" containerID="cri-o://a9fd54f026362731b4c467a2d80329295b3b057058310d8a03a10df125ae5bfb" gracePeriod=2 Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.615012 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6755cfdfb7-vn6cj"] Dec 05 22:16:26 crc kubenswrapper[4747]: E1205 22:16:26.615342 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e1af63-bbc5-4585-a776-d82e96aeeeb3" containerName="cinder-db-sync" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.615353 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e1af63-bbc5-4585-a776-d82e96aeeeb3" containerName="cinder-db-sync" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.621500 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e1af63-bbc5-4585-a776-d82e96aeeeb3" containerName="cinder-db-sync" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.622484 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.625414 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6755cfdfb7-vn6cj"] Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.727883 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.729633 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.732832 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.732947 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.733164 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.736078 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wz7bv" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.747524 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.770306 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-ovsdbserver-sb\") pod \"dnsmasq-dns-6755cfdfb7-vn6cj\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.770373 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-dns-svc\") pod \"dnsmasq-dns-6755cfdfb7-vn6cj\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.770428 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-config\") pod \"dnsmasq-dns-6755cfdfb7-vn6cj\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.770521 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-ovsdbserver-nb\") pod \"dnsmasq-dns-6755cfdfb7-vn6cj\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.770615 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ccqd\" (UniqueName: \"kubernetes.io/projected/84801e85-4aa3-4695-9cae-c28fcd1211fe-kube-api-access-5ccqd\") pod \"dnsmasq-dns-6755cfdfb7-vn6cj\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.873229 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.873299 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ccqd\" (UniqueName: \"kubernetes.io/projected/84801e85-4aa3-4695-9cae-c28fcd1211fe-kube-api-access-5ccqd\") pod 
\"dnsmasq-dns-6755cfdfb7-vn6cj\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.873324 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1334da29-30a0-4133-a033-f500b2f0f1e3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.873368 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-ovsdbserver-sb\") pod \"dnsmasq-dns-6755cfdfb7-vn6cj\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.873410 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-config-data-custom\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.873443 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-dns-svc\") pod \"dnsmasq-dns-6755cfdfb7-vn6cj\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.873466 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-config\") pod \"dnsmasq-dns-6755cfdfb7-vn6cj\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.873489 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1334da29-30a0-4133-a033-f500b2f0f1e3-logs\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.873529 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-scripts\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.873566 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dd4z\" (UniqueName: \"kubernetes.io/projected/1334da29-30a0-4133-a033-f500b2f0f1e3-kube-api-access-7dd4z\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.873607 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-ovsdbserver-nb\") pod \"dnsmasq-dns-6755cfdfb7-vn6cj\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc 
kubenswrapper[4747]: I1205 22:16:26.873648 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-config-data\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.875283 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-dns-svc\") pod \"dnsmasq-dns-6755cfdfb7-vn6cj\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.875317 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-config\") pod \"dnsmasq-dns-6755cfdfb7-vn6cj\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.875956 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-ovsdbserver-nb\") pod \"dnsmasq-dns-6755cfdfb7-vn6cj\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.876102 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-ovsdbserver-sb\") pod \"dnsmasq-dns-6755cfdfb7-vn6cj\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.898760 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ccqd\" (UniqueName: \"kubernetes.io/projected/84801e85-4aa3-4695-9cae-c28fcd1211fe-kube-api-access-5ccqd\") pod \"dnsmasq-dns-6755cfdfb7-vn6cj\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.937698 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.975861 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.975965 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1334da29-30a0-4133-a033-f500b2f0f1e3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.976050 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-config-data-custom\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.976097 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1334da29-30a0-4133-a033-f500b2f0f1e3-logs\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.976162 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-scripts\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.976216 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dd4z\" (UniqueName: \"kubernetes.io/projected/1334da29-30a0-4133-a033-f500b2f0f1e3-kube-api-access-7dd4z\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.976283 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-config-data\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.977836 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1334da29-30a0-4133-a033-f500b2f0f1e3-logs\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.979157 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1334da29-30a0-4133-a033-f500b2f0f1e3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.983147 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " 
pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.984617 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-config-data\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.985083 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-config-data-custom\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:26 crc kubenswrapper[4747]: I1205 22:16:26.991082 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-scripts\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.004608 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dd4z\" (UniqueName: \"kubernetes.io/projected/1334da29-30a0-4133-a033-f500b2f0f1e3-kube-api-access-7dd4z\") pod \"cinder-api-0\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " pod="openstack/cinder-api-0" Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.055643 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.205863 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.206129 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.243981 4747 generic.go:334] "Generic (PLEG): container finished" podID="0959e0ce-cd05-437a-a4b0-41606a7ab425" containerID="a9fd54f026362731b4c467a2d80329295b3b057058310d8a03a10df125ae5bfb" exitCode=0 Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.244022 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hz47b" event={"ID":"0959e0ce-cd05-437a-a4b0-41606a7ab425","Type":"ContainerDied","Data":"a9fd54f026362731b4c467a2d80329295b3b057058310d8a03a10df125ae5bfb"} Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.266491 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.332265 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.520667 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6755cfdfb7-vn6cj"] Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.598011 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.843898 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.901170 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0959e0ce-cd05-437a-a4b0-41606a7ab425-utilities\") pod \"0959e0ce-cd05-437a-a4b0-41606a7ab425\" (UID: \"0959e0ce-cd05-437a-a4b0-41606a7ab425\") " Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.901240 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drvsf\" (UniqueName: \"kubernetes.io/projected/0959e0ce-cd05-437a-a4b0-41606a7ab425-kube-api-access-drvsf\") pod \"0959e0ce-cd05-437a-a4b0-41606a7ab425\" (UID: \"0959e0ce-cd05-437a-a4b0-41606a7ab425\") " Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.901302 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0959e0ce-cd05-437a-a4b0-41606a7ab425-catalog-content\") pod \"0959e0ce-cd05-437a-a4b0-41606a7ab425\" (UID: \"0959e0ce-cd05-437a-a4b0-41606a7ab425\") " Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.903160 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0959e0ce-cd05-437a-a4b0-41606a7ab425-utilities" (OuterVolumeSpecName: "utilities") pod "0959e0ce-cd05-437a-a4b0-41606a7ab425" (UID: "0959e0ce-cd05-437a-a4b0-41606a7ab425"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.909774 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0959e0ce-cd05-437a-a4b0-41606a7ab425-kube-api-access-drvsf" (OuterVolumeSpecName: "kube-api-access-drvsf") pod "0959e0ce-cd05-437a-a4b0-41606a7ab425" (UID: "0959e0ce-cd05-437a-a4b0-41606a7ab425"). InnerVolumeSpecName "kube-api-access-drvsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:16:27 crc kubenswrapper[4747]: I1205 22:16:27.958827 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0959e0ce-cd05-437a-a4b0-41606a7ab425-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0959e0ce-cd05-437a-a4b0-41606a7ab425" (UID: "0959e0ce-cd05-437a-a4b0-41606a7ab425"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.004831 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0959e0ce-cd05-437a-a4b0-41606a7ab425-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.004874 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drvsf\" (UniqueName: \"kubernetes.io/projected/0959e0ce-cd05-437a-a4b0-41606a7ab425-kube-api-access-drvsf\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.004891 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0959e0ce-cd05-437a-a4b0-41606a7ab425-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.255892 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hz47b" Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.255899 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hz47b" event={"ID":"0959e0ce-cd05-437a-a4b0-41606a7ab425","Type":"ContainerDied","Data":"18631978c77a9562aec4f7f76e49b4e7a9bc9a79cff913cfc167a3014ab47008"} Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.256492 4747 scope.go:117] "RemoveContainer" containerID="a9fd54f026362731b4c467a2d80329295b3b057058310d8a03a10df125ae5bfb" Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.257068 4747 generic.go:334] "Generic (PLEG): container finished" podID="84801e85-4aa3-4695-9cae-c28fcd1211fe" containerID="118e61ce0a05d5548b10e6cf0f35230dc06e30c61496e00a546f3cd0f5078614" exitCode=0 Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.257124 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" event={"ID":"84801e85-4aa3-4695-9cae-c28fcd1211fe","Type":"ContainerDied","Data":"118e61ce0a05d5548b10e6cf0f35230dc06e30c61496e00a546f3cd0f5078614"} Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.257193 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" event={"ID":"84801e85-4aa3-4695-9cae-c28fcd1211fe","Type":"ContainerStarted","Data":"ba0870f350818fae72bc459b4d58f801c78161a278b6bb82b0851e9b4e1f7f57"} Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.259359 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1334da29-30a0-4133-a033-f500b2f0f1e3","Type":"ContainerStarted","Data":"625ce199d1b820a84d47970e7e8f675f75db08bb7d99bc2fbb0f7d9dbfb52f4a"} Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.259393 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1334da29-30a0-4133-a033-f500b2f0f1e3","Type":"ContainerStarted","Data":"cf086d60f8ac4393979d4c3cd36b4e0842feb4b2c532acad420105f0c19769b3"} Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.370805 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hz47b"] Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.378721 4747 scope.go:117] "RemoveContainer" containerID="0c67b33c91891419e9e37d2d0954e128a80a762f352db4a3a1083ffbf5d884cc" Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.382776 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hz47b"] Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.417376 4747 scope.go:117] "RemoveContainer" containerID="6c6f2fcdf430972f73ffee50b0a993ec932b8237b5752f22ae6e9310297253f2" Dec 05 22:16:28 crc kubenswrapper[4747]: I1205 22:16:28.445752 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x6dtk"] Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.040860 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.270974 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" event={"ID":"84801e85-4aa3-4695-9cae-c28fcd1211fe","Type":"ContainerStarted","Data":"25e64646a8fe3ced8eae324d583f33ec41233ec1baa6ebedae388b8200d49183"} Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.271119 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.274182 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1334da29-30a0-4133-a033-f500b2f0f1e3","Type":"ContainerStarted","Data":"d0cac9ff053f4f03667688683f1c1f1d1b1f69351ed830c29655ea7c530c377a"} Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.274344 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x6dtk" podUID="4f182d6d-87a9-4787-8af1-b4e5a6efc2ed" containerName="registry-server" containerID="cri-o://8a1ecbb35c301b65826c8f788dcb08c96a26c7d35ff01049183ddb491abc5dca" gracePeriod=2 Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.274355 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.308080 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" podStartSLOduration=3.308058477 podStartE2EDuration="3.308058477s" podCreationTimestamp="2025-12-05 22:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:16:29.301044403 +0000 UTC m=+5659.768351911" watchObservedRunningTime="2025-12-05 22:16:29.308058477 +0000 UTC m=+5659.775365975" Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.333542 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.333524658 podStartE2EDuration="3.333524658s" podCreationTimestamp="2025-12-05 22:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:16:29.321400268 +0000 UTC m=+5659.788707766" watchObservedRunningTime="2025-12-05 22:16:29.333524658 +0000 UTC m=+5659.800832136" Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.646539 4747 scope.go:117] "RemoveContainer" containerID="d8acb77392364a05b83238efcea7f01c5f41fd9ab659abc531a47440e7e20951" Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.666726 4747 scope.go:117] "RemoveContainer" containerID="eacb542e8f2034bdcddaec2929d18cd2cab1939319d1010cc83108837e6d726f" Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.801403 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.846158 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-utilities\") pod \"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed\" (UID: \"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed\") " Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.846257 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkndr\" (UniqueName: \"kubernetes.io/projected/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-kube-api-access-nkndr\") pod \"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed\" (UID: \"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed\") " Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.846356 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-catalog-content\") pod \"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed\" (UID: \"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed\") " Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.847380 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-utilities" (OuterVolumeSpecName: "utilities") pod "4f182d6d-87a9-4787-8af1-b4e5a6efc2ed" (UID: "4f182d6d-87a9-4787-8af1-b4e5a6efc2ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.853508 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-kube-api-access-nkndr" (OuterVolumeSpecName: "kube-api-access-nkndr") pod "4f182d6d-87a9-4787-8af1-b4e5a6efc2ed" (UID: "4f182d6d-87a9-4787-8af1-b4e5a6efc2ed"). InnerVolumeSpecName "kube-api-access-nkndr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.855866 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0959e0ce-cd05-437a-a4b0-41606a7ab425" path="/var/lib/kubelet/pods/0959e0ce-cd05-437a-a4b0-41606a7ab425/volumes" Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.949044 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:29 crc kubenswrapper[4747]: I1205 22:16:29.949082 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkndr\" (UniqueName: \"kubernetes.io/projected/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-kube-api-access-nkndr\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:30 crc kubenswrapper[4747]: I1205 22:16:30.289062 4747 generic.go:334] "Generic (PLEG): container finished" podID="4f182d6d-87a9-4787-8af1-b4e5a6efc2ed" containerID="8a1ecbb35c301b65826c8f788dcb08c96a26c7d35ff01049183ddb491abc5dca" exitCode=0 Dec 05 22:16:30 crc kubenswrapper[4747]: I1205 22:16:30.289159 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6dtk" Dec 05 22:16:30 crc kubenswrapper[4747]: I1205 22:16:30.289136 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6dtk" event={"ID":"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed","Type":"ContainerDied","Data":"8a1ecbb35c301b65826c8f788dcb08c96a26c7d35ff01049183ddb491abc5dca"} Dec 05 22:16:30 crc kubenswrapper[4747]: I1205 22:16:30.289281 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1334da29-30a0-4133-a033-f500b2f0f1e3" containerName="cinder-api-log" containerID="cri-o://625ce199d1b820a84d47970e7e8f675f75db08bb7d99bc2fbb0f7d9dbfb52f4a" gracePeriod=30 Dec 05 22:16:30 crc kubenswrapper[4747]: I1205 22:16:30.289370 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1334da29-30a0-4133-a033-f500b2f0f1e3" containerName="cinder-api" containerID="cri-o://d0cac9ff053f4f03667688683f1c1f1d1b1f69351ed830c29655ea7c530c377a" gracePeriod=30 Dec 05 22:16:30 crc kubenswrapper[4747]: I1205 22:16:30.289565 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6dtk" event={"ID":"4f182d6d-87a9-4787-8af1-b4e5a6efc2ed","Type":"ContainerDied","Data":"347153088efcc97972b1b1fe071a401265e53b6241a56c829961b0c4cc7794ea"} Dec 05 22:16:30 crc kubenswrapper[4747]: I1205 22:16:30.289609 4747 scope.go:117] "RemoveContainer" containerID="8a1ecbb35c301b65826c8f788dcb08c96a26c7d35ff01049183ddb491abc5dca" Dec 05 22:16:30 crc kubenswrapper[4747]: I1205 22:16:30.330201 4747 scope.go:117] "RemoveContainer" containerID="144055ea0f6c405eb572615e7b9ac0f59b0f64aad4c026dce63181d295979bbb" Dec 05 22:16:30 crc kubenswrapper[4747]: I1205 22:16:30.366892 4747 scope.go:117] "RemoveContainer" containerID="6a9426e2b2ec833c8db09ea5070dae47be381a9bd1d97590d8cbc8cb563a6be9" Dec 05 22:16:30 crc kubenswrapper[4747]: I1205 22:16:30.406293 4747 scope.go:117] "RemoveContainer" containerID="8a1ecbb35c301b65826c8f788dcb08c96a26c7d35ff01049183ddb491abc5dca" Dec 05 22:16:30 crc kubenswrapper[4747]: E1205 22:16:30.406745 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a1ecbb35c301b65826c8f788dcb08c96a26c7d35ff01049183ddb491abc5dca\": container with ID starting with 8a1ecbb35c301b65826c8f788dcb08c96a26c7d35ff01049183ddb491abc5dca not found: ID does not exist" containerID="8a1ecbb35c301b65826c8f788dcb08c96a26c7d35ff01049183ddb491abc5dca" Dec 05 22:16:30 crc kubenswrapper[4747]: I1205 22:16:30.406788 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a1ecbb35c301b65826c8f788dcb08c96a26c7d35ff01049183ddb491abc5dca"} err="failed to get container status \"8a1ecbb35c301b65826c8f788dcb08c96a26c7d35ff01049183ddb491abc5dca\": rpc error: code = NotFound desc = could not find container \"8a1ecbb35c301b65826c8f788dcb08c96a26c7d35ff01049183ddb491abc5dca\": container with ID starting with 8a1ecbb35c301b65826c8f788dcb08c96a26c7d35ff01049183ddb491abc5dca not found: ID does not exist" Dec 05 22:16:30 crc kubenswrapper[4747]: I1205 22:16:30.406814 4747 scope.go:117] "RemoveContainer" containerID="144055ea0f6c405eb572615e7b9ac0f59b0f64aad4c026dce63181d295979bbb" Dec 05 22:16:30 crc kubenswrapper[4747]: E1205 22:16:30.407301 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"144055ea0f6c405eb572615e7b9ac0f59b0f64aad4c026dce63181d295979bbb\": container with ID starting with 144055ea0f6c405eb572615e7b9ac0f59b0f64aad4c026dce63181d295979bbb not found: ID does not exist" containerID="144055ea0f6c405eb572615e7b9ac0f59b0f64aad4c026dce63181d295979bbb" Dec 05 22:16:30 crc kubenswrapper[4747]: I1205 22:16:30.407337 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"144055ea0f6c405eb572615e7b9ac0f59b0f64aad4c026dce63181d295979bbb"} err="failed to get container status \"144055ea0f6c405eb572615e7b9ac0f59b0f64aad4c026dce63181d295979bbb\": rpc error: code = NotFound desc = could not find container \"144055ea0f6c405eb572615e7b9ac0f59b0f64aad4c026dce63181d295979bbb\": container with ID starting with 144055ea0f6c405eb572615e7b9ac0f59b0f64aad4c026dce63181d295979bbb not found: ID does not exist" Dec 05 22:16:30 crc kubenswrapper[4747]: I1205 22:16:30.407362 4747 scope.go:117] "RemoveContainer" containerID="6a9426e2b2ec833c8db09ea5070dae47be381a9bd1d97590d8cbc8cb563a6be9" Dec 05 22:16:30 crc kubenswrapper[4747]: E1205 22:16:30.407734 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a9426e2b2ec833c8db09ea5070dae47be381a9bd1d97590d8cbc8cb563a6be9\": container with ID starting with 6a9426e2b2ec833c8db09ea5070dae47be381a9bd1d97590d8cbc8cb563a6be9 not found: ID does not exist" containerID="6a9426e2b2ec833c8db09ea5070dae47be381a9bd1d97590d8cbc8cb563a6be9" Dec 05 22:16:30 crc kubenswrapper[4747]: I1205 22:16:30.407775 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a9426e2b2ec833c8db09ea5070dae47be381a9bd1d97590d8cbc8cb563a6be9"} err="failed to get container status \"6a9426e2b2ec833c8db09ea5070dae47be381a9bd1d97590d8cbc8cb563a6be9\": rpc error: code = NotFound desc = could not find container \"6a9426e2b2ec833c8db09ea5070dae47be381a9bd1d97590d8cbc8cb563a6be9\": container with ID starting with 6a9426e2b2ec833c8db09ea5070dae47be381a9bd1d97590d8cbc8cb563a6be9 not found: ID does not exist" Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.131896 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f182d6d-87a9-4787-8af1-b4e5a6efc2ed" (UID: "4f182d6d-87a9-4787-8af1-b4e5a6efc2ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.172439 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.234718 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x6dtk"] Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.243066 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x6dtk"] Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.304652 4747 generic.go:334] "Generic (PLEG): container finished" podID="1334da29-30a0-4133-a033-f500b2f0f1e3" containerID="d0cac9ff053f4f03667688683f1c1f1d1b1f69351ed830c29655ea7c530c377a" exitCode=0 Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.304700 4747 generic.go:334] "Generic (PLEG): container finished" podID="1334da29-30a0-4133-a033-f500b2f0f1e3" containerID="625ce199d1b820a84d47970e7e8f675f75db08bb7d99bc2fbb0f7d9dbfb52f4a" exitCode=143 Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.304735 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1334da29-30a0-4133-a033-f500b2f0f1e3","Type":"ContainerDied","Data":"d0cac9ff053f4f03667688683f1c1f1d1b1f69351ed830c29655ea7c530c377a"} Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.304774 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1334da29-30a0-4133-a033-f500b2f0f1e3","Type":"ContainerDied","Data":"625ce199d1b820a84d47970e7e8f675f75db08bb7d99bc2fbb0f7d9dbfb52f4a"} Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.849433 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f182d6d-87a9-4787-8af1-b4e5a6efc2ed" path="/var/lib/kubelet/pods/4f182d6d-87a9-4787-8af1-b4e5a6efc2ed/volumes" Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.922854 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.985675 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-combined-ca-bundle\") pod \"1334da29-30a0-4133-a033-f500b2f0f1e3\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.985996 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dd4z\" (UniqueName: \"kubernetes.io/projected/1334da29-30a0-4133-a033-f500b2f0f1e3-kube-api-access-7dd4z\") pod \"1334da29-30a0-4133-a033-f500b2f0f1e3\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.986075 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-scripts\") pod \"1334da29-30a0-4133-a033-f500b2f0f1e3\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.986251 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-config-data\") pod \"1334da29-30a0-4133-a033-f500b2f0f1e3\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.986701 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1334da29-30a0-4133-a033-f500b2f0f1e3-logs\") pod \"1334da29-30a0-4133-a033-f500b2f0f1e3\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.986886 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1334da29-30a0-4133-a033-f500b2f0f1e3-etc-machine-id\") pod \"1334da29-30a0-4133-a033-f500b2f0f1e3\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.986967 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1334da29-30a0-4133-a033-f500b2f0f1e3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1334da29-30a0-4133-a033-f500b2f0f1e3" (UID: "1334da29-30a0-4133-a033-f500b2f0f1e3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.987075 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1334da29-30a0-4133-a033-f500b2f0f1e3-logs" (OuterVolumeSpecName: "logs") pod "1334da29-30a0-4133-a033-f500b2f0f1e3" (UID: "1334da29-30a0-4133-a033-f500b2f0f1e3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.987178 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-config-data-custom\") pod \"1334da29-30a0-4133-a033-f500b2f0f1e3\" (UID: \"1334da29-30a0-4133-a033-f500b2f0f1e3\") " Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.987748 4747 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1334da29-30a0-4133-a033-f500b2f0f1e3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.987852 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1334da29-30a0-4133-a033-f500b2f0f1e3-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.992761 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1334da29-30a0-4133-a033-f500b2f0f1e3" (UID: "1334da29-30a0-4133-a033-f500b2f0f1e3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:16:31 crc kubenswrapper[4747]: I1205 22:16:31.993058 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1334da29-30a0-4133-a033-f500b2f0f1e3-kube-api-access-7dd4z" (OuterVolumeSpecName: "kube-api-access-7dd4z") pod "1334da29-30a0-4133-a033-f500b2f0f1e3" (UID: "1334da29-30a0-4133-a033-f500b2f0f1e3"). InnerVolumeSpecName "kube-api-access-7dd4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.007754 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-scripts" (OuterVolumeSpecName: "scripts") pod "1334da29-30a0-4133-a033-f500b2f0f1e3" (UID: "1334da29-30a0-4133-a033-f500b2f0f1e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.016829 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1334da29-30a0-4133-a033-f500b2f0f1e3" (UID: "1334da29-30a0-4133-a033-f500b2f0f1e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.043637 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-config-data" (OuterVolumeSpecName: "config-data") pod "1334da29-30a0-4133-a033-f500b2f0f1e3" (UID: "1334da29-30a0-4133-a033-f500b2f0f1e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.089947 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.090002 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.090023 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.090043 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1334da29-30a0-4133-a033-f500b2f0f1e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.090060 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dd4z\" (UniqueName: \"kubernetes.io/projected/1334da29-30a0-4133-a033-f500b2f0f1e3-kube-api-access-7dd4z\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.318314 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1334da29-30a0-4133-a033-f500b2f0f1e3","Type":"ContainerDied","Data":"cf086d60f8ac4393979d4c3cd36b4e0842feb4b2c532acad420105f0c19769b3"} Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.318374 4747 scope.go:117] "RemoveContainer" containerID="d0cac9ff053f4f03667688683f1c1f1d1b1f69351ed830c29655ea7c530c377a" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.318417 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.342954 4747 scope.go:117] "RemoveContainer" containerID="625ce199d1b820a84d47970e7e8f675f75db08bb7d99bc2fbb0f7d9dbfb52f4a" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.369330 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.377107 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.395964 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 22:16:32 crc kubenswrapper[4747]: E1205 22:16:32.396397 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f182d6d-87a9-4787-8af1-b4e5a6efc2ed" containerName="extract-content" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.396418 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f182d6d-87a9-4787-8af1-b4e5a6efc2ed" containerName="extract-content" Dec 05 22:16:32 crc kubenswrapper[4747]: E1205 22:16:32.396437 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0959e0ce-cd05-437a-a4b0-41606a7ab425" containerName="registry-server" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.396446 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0959e0ce-cd05-437a-a4b0-41606a7ab425" containerName="registry-server" Dec 05 22:16:32 crc kubenswrapper[4747]: E1205 22:16:32.396458 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1334da29-30a0-4133-a033-f500b2f0f1e3" containerName="cinder-api" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.396466 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1334da29-30a0-4133-a033-f500b2f0f1e3" containerName="cinder-api" Dec 05 22:16:32 crc kubenswrapper[4747]: E1205 22:16:32.396476 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0959e0ce-cd05-437a-a4b0-41606a7ab425" containerName="extract-content" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.396483 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0959e0ce-cd05-437a-a4b0-41606a7ab425" containerName="extract-content" Dec 05 22:16:32 crc kubenswrapper[4747]: E1205 22:16:32.396496 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f182d6d-87a9-4787-8af1-b4e5a6efc2ed" containerName="extract-utilities" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.396504 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f182d6d-87a9-4787-8af1-b4e5a6efc2ed" containerName="extract-utilities" Dec 05 22:16:32 crc kubenswrapper[4747]: E1205 22:16:32.396521 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f182d6d-87a9-4787-8af1-b4e5a6efc2ed" containerName="registry-server" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.396529 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f182d6d-87a9-4787-8af1-b4e5a6efc2ed" containerName="registry-server" Dec 05 22:16:32 crc kubenswrapper[4747]: E1205 22:16:32.396554 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0959e0ce-cd05-437a-a4b0-41606a7ab425" containerName="extract-utilities" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.396561 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0959e0ce-cd05-437a-a4b0-41606a7ab425" containerName="extract-utilities" Dec 05 22:16:32 crc kubenswrapper[4747]: E1205 22:16:32.396574 4747 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1334da29-30a0-4133-a033-f500b2f0f1e3" containerName="cinder-api-log" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.396598 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1334da29-30a0-4133-a033-f500b2f0f1e3" containerName="cinder-api-log" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.396791 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1334da29-30a0-4133-a033-f500b2f0f1e3" containerName="cinder-api" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.396812 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1334da29-30a0-4133-a033-f500b2f0f1e3" containerName="cinder-api-log" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.396825 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0959e0ce-cd05-437a-a4b0-41606a7ab425" containerName="registry-server" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.396847 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f182d6d-87a9-4787-8af1-b4e5a6efc2ed" containerName="registry-server" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.401689 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.405332 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.405413 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.405474 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.405505 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.405613 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.406329 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wz7bv" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.410883 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.496328 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-config-data\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.496391 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1984500d-bdf2-4311-a68d-ecbb5db267ff-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.496416 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 
22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.496507 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.496542 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1984500d-bdf2-4311-a68d-ecbb5db267ff-logs\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.496566 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-config-data-custom\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.496625 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-scripts\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.496713 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.496802 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x5jw\" (UniqueName: \"kubernetes.io/projected/1984500d-bdf2-4311-a68d-ecbb5db267ff-kube-api-access-4x5jw\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.597946 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.598019 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x5jw\" (UniqueName: \"kubernetes.io/projected/1984500d-bdf2-4311-a68d-ecbb5db267ff-kube-api-access-4x5jw\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.598077 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-config-data\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.598098 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/1984500d-bdf2-4311-a68d-ecbb5db267ff-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.598116 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.598137 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.598158 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1984500d-bdf2-4311-a68d-ecbb5db267ff-logs\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.598195 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-config-data-custom\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.598222 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-scripts\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.598225 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1984500d-bdf2-4311-a68d-ecbb5db267ff-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.598615 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1984500d-bdf2-4311-a68d-ecbb5db267ff-logs\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.602277 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-config-data\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.602681 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-config-data-custom\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.607907 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-scripts\") pod 
\"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.608234 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.608313 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.610784 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.613965 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x5jw\" (UniqueName: \"kubernetes.io/projected/1984500d-bdf2-4311-a68d-ecbb5db267ff-kube-api-access-4x5jw\") pod \"cinder-api-0\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " pod="openstack/cinder-api-0" Dec 05 22:16:32 crc kubenswrapper[4747]: I1205 22:16:32.719384 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 22:16:33 crc kubenswrapper[4747]: W1205 22:16:33.178961 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1984500d_bdf2_4311_a68d_ecbb5db267ff.slice/crio-41e857ec1d1ac60f9b59f81276cc607e449690d71d51a70c2da173fdf3075e0d WatchSource:0}: Error finding container 41e857ec1d1ac60f9b59f81276cc607e449690d71d51a70c2da173fdf3075e0d: Status 404 returned error can't find the container with id 41e857ec1d1ac60f9b59f81276cc607e449690d71d51a70c2da173fdf3075e0d Dec 05 22:16:33 crc kubenswrapper[4747]: I1205 22:16:33.180880 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 22:16:33 crc kubenswrapper[4747]: I1205 22:16:33.339004 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1984500d-bdf2-4311-a68d-ecbb5db267ff","Type":"ContainerStarted","Data":"41e857ec1d1ac60f9b59f81276cc607e449690d71d51a70c2da173fdf3075e0d"} Dec 05 22:16:33 crc kubenswrapper[4747]: I1205 22:16:33.854674 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1334da29-30a0-4133-a033-f500b2f0f1e3" path="/var/lib/kubelet/pods/1334da29-30a0-4133-a033-f500b2f0f1e3/volumes" Dec 05 22:16:34 crc kubenswrapper[4747]: I1205 22:16:34.353625 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1984500d-bdf2-4311-a68d-ecbb5db267ff","Type":"ContainerStarted","Data":"8408c923339660e810361bbc1aee6d78bebf1e67ac2e4a25afd29593b377f261"} Dec 05 22:16:35 crc kubenswrapper[4747]: I1205 22:16:35.378807 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1984500d-bdf2-4311-a68d-ecbb5db267ff","Type":"ContainerStarted","Data":"1208317768f303c3af437093d0a92dbfb0ff3a40dc447abda50b526db6f0d1d1"} Dec 05 22:16:35 crc 
kubenswrapper[4747]: I1205 22:16:35.379120 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 22:16:35 crc kubenswrapper[4747]: I1205 22:16:35.429049 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.42901139 podStartE2EDuration="3.42901139s" podCreationTimestamp="2025-12-05 22:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:16:35.406205975 +0000 UTC m=+5665.873513473" watchObservedRunningTime="2025-12-05 22:16:35.42901139 +0000 UTC m=+5665.896318948" Dec 05 22:16:36 crc kubenswrapper[4747]: I1205 22:16:36.222233 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:16:36 crc kubenswrapper[4747]: I1205 22:16:36.222390 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:16:36 crc kubenswrapper[4747]: I1205 22:16:36.939852 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.054276 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc65b979-wq4pz"] Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.054750 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" podUID="c8c9a19f-48b0-49eb-8bc4-2b662a021ed8" containerName="dnsmasq-dns" containerID="cri-o://2d567a961a38931566853334d0a484befbc1e29a6f9c21fd786881c4a2661fa7" gracePeriod=10 Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.401816 4747 generic.go:334] "Generic (PLEG): container finished" podID="c8c9a19f-48b0-49eb-8bc4-2b662a021ed8" containerID="2d567a961a38931566853334d0a484befbc1e29a6f9c21fd786881c4a2661fa7" exitCode=0 Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.401899 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" event={"ID":"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8","Type":"ContainerDied","Data":"2d567a961a38931566853334d0a484befbc1e29a6f9c21fd786881c4a2661fa7"} Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.728516 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.796288 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r46nm\" (UniqueName: \"kubernetes.io/projected/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-kube-api-access-r46nm\") pod \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.796360 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-dns-svc\") pod \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.796387 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-ovsdbserver-nb\") pod \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.796452 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-config\") pod \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.796532 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-ovsdbserver-sb\") pod \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\" (UID: \"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8\") " Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.802940 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-kube-api-access-r46nm" (OuterVolumeSpecName: "kube-api-access-r46nm") pod "c8c9a19f-48b0-49eb-8bc4-2b662a021ed8" (UID: "c8c9a19f-48b0-49eb-8bc4-2b662a021ed8"). InnerVolumeSpecName "kube-api-access-r46nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.847441 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c8c9a19f-48b0-49eb-8bc4-2b662a021ed8" (UID: "c8c9a19f-48b0-49eb-8bc4-2b662a021ed8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.864846 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-config" (OuterVolumeSpecName: "config") pod "c8c9a19f-48b0-49eb-8bc4-2b662a021ed8" (UID: "c8c9a19f-48b0-49eb-8bc4-2b662a021ed8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.866190 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c8c9a19f-48b0-49eb-8bc4-2b662a021ed8" (UID: "c8c9a19f-48b0-49eb-8bc4-2b662a021ed8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.882139 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c8c9a19f-48b0-49eb-8bc4-2b662a021ed8" (UID: "c8c9a19f-48b0-49eb-8bc4-2b662a021ed8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.898793 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.898835 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.898851 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r46nm\" (UniqueName: \"kubernetes.io/projected/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-kube-api-access-r46nm\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.898863 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:37 crc kubenswrapper[4747]: I1205 22:16:37.898873 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 22:16:38 crc kubenswrapper[4747]: I1205 22:16:38.419653 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" event={"ID":"c8c9a19f-48b0-49eb-8bc4-2b662a021ed8","Type":"ContainerDied","Data":"429dc3179cd288aeea9ecf648fd03410d7f0abe5ae9ac1244d1c5e0d988365d9"} Dec 05 22:16:38 crc kubenswrapper[4747]: I1205 22:16:38.419711 4747 scope.go:117] "RemoveContainer" containerID="2d567a961a38931566853334d0a484befbc1e29a6f9c21fd786881c4a2661fa7" Dec 05 22:16:38 crc kubenswrapper[4747]: I1205 22:16:38.419877 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc65b979-wq4pz" Dec 05 22:16:38 crc kubenswrapper[4747]: I1205 22:16:38.487261 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc65b979-wq4pz"] Dec 05 22:16:38 crc kubenswrapper[4747]: I1205 22:16:38.492264 4747 scope.go:117] "RemoveContainer" containerID="8709f53a8e55070c4068fe5d584c1e9d0f161a9a20198f7c2bbd7662dd2cb7ef" Dec 05 22:16:38 crc kubenswrapper[4747]: I1205 22:16:38.497068 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc65b979-wq4pz"] Dec 05 22:16:39 crc kubenswrapper[4747]: I1205 22:16:39.862144 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8c9a19f-48b0-49eb-8bc4-2b662a021ed8" path="/var/lib/kubelet/pods/c8c9a19f-48b0-49eb-8bc4-2b662a021ed8/volumes" Dec 05 22:16:44 crc kubenswrapper[4747]: I1205 22:16:44.596901 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 05 22:16:55 crc kubenswrapper[4747]: I1205 22:16:55.946951 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-42zcn"] Dec 05 22:16:55 crc kubenswrapper[4747]: E1205 22:16:55.948230 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c9a19f-48b0-49eb-8bc4-2b662a021ed8" containerName="init" Dec 05 22:16:55 crc kubenswrapper[4747]: I1205 22:16:55.948252 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c9a19f-48b0-49eb-8bc4-2b662a021ed8" containerName="init" Dec 05 22:16:55 crc kubenswrapper[4747]: E1205 22:16:55.948310 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c9a19f-48b0-49eb-8bc4-2b662a021ed8" containerName="dnsmasq-dns" Dec 05 22:16:55 crc kubenswrapper[4747]: I1205 22:16:55.948321 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c9a19f-48b0-49eb-8bc4-2b662a021ed8" containerName="dnsmasq-dns" Dec 05 22:16:55 crc kubenswrapper[4747]: I1205 22:16:55.948617 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8c9a19f-48b0-49eb-8bc4-2b662a021ed8" containerName="dnsmasq-dns" Dec 05 22:16:55 crc kubenswrapper[4747]: I1205 22:16:55.950568 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:16:55 crc kubenswrapper[4747]: I1205 22:16:55.958813 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-42zcn"] Dec 05 22:16:56 crc kubenswrapper[4747]: I1205 22:16:56.085778 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-utilities\") pod \"redhat-marketplace-42zcn\" (UID: \"6004ff1a-33d1-4e88-ad69-6de1a34ca34d\") " pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:16:56 crc kubenswrapper[4747]: I1205 22:16:56.086146 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-catalog-content\") pod \"redhat-marketplace-42zcn\" (UID: \"6004ff1a-33d1-4e88-ad69-6de1a34ca34d\") " pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:16:56 crc kubenswrapper[4747]: I1205 22:16:56.086259 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z24rz\" (UniqueName: \"kubernetes.io/projected/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-kube-api-access-z24rz\") pod \"redhat-marketplace-42zcn\" (UID: \"6004ff1a-33d1-4e88-ad69-6de1a34ca34d\") " pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:16:56 crc kubenswrapper[4747]: I1205 22:16:56.187245 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-utilities\") pod \"redhat-marketplace-42zcn\" (UID: \"6004ff1a-33d1-4e88-ad69-6de1a34ca34d\") " pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:16:56 crc kubenswrapper[4747]: I1205 22:16:56.187341 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-catalog-content\") pod \"redhat-marketplace-42zcn\" (UID: \"6004ff1a-33d1-4e88-ad69-6de1a34ca34d\") " pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:16:56 crc kubenswrapper[4747]: I1205 22:16:56.187410 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z24rz\" (UniqueName: \"kubernetes.io/projected/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-kube-api-access-z24rz\") pod \"redhat-marketplace-42zcn\" (UID: \"6004ff1a-33d1-4e88-ad69-6de1a34ca34d\") " pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:16:56 crc kubenswrapper[4747]: I1205 22:16:56.187994 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-catalog-content\") pod \"redhat-marketplace-42zcn\" (UID: \"6004ff1a-33d1-4e88-ad69-6de1a34ca34d\") " pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:16:56 crc kubenswrapper[4747]: I1205 22:16:56.188046 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-utilities\") pod \"redhat-marketplace-42zcn\" (UID: \"6004ff1a-33d1-4e88-ad69-6de1a34ca34d\") " pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:16:56 crc kubenswrapper[4747]: I1205 22:16:56.216341 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-z24rz\" (UniqueName: \"kubernetes.io/projected/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-kube-api-access-z24rz\") pod \"redhat-marketplace-42zcn\" (UID: \"6004ff1a-33d1-4e88-ad69-6de1a34ca34d\") " pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:16:56 crc kubenswrapper[4747]: I1205 22:16:56.301931 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:16:56 crc kubenswrapper[4747]: I1205 22:16:56.737173 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-42zcn"] Dec 05 22:16:57 crc kubenswrapper[4747]: I1205 22:16:57.608069 4747 generic.go:334] "Generic (PLEG): container finished" podID="6004ff1a-33d1-4e88-ad69-6de1a34ca34d" containerID="a9aa54ada150db88e8ab3a993ed4b1c1e3284d7efa9b66aa959f6e2b6dd9dc9a" exitCode=0 Dec 05 22:16:57 crc kubenswrapper[4747]: I1205 22:16:57.608144 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42zcn" event={"ID":"6004ff1a-33d1-4e88-ad69-6de1a34ca34d","Type":"ContainerDied","Data":"a9aa54ada150db88e8ab3a993ed4b1c1e3284d7efa9b66aa959f6e2b6dd9dc9a"} Dec 05 22:16:57 crc kubenswrapper[4747]: I1205 22:16:57.609618 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42zcn" event={"ID":"6004ff1a-33d1-4e88-ad69-6de1a34ca34d","Type":"ContainerStarted","Data":"24042c9c2d4c2067f1a1debd38675acca5f3e3d3f8eaf1d45fa1a8e1a5423caf"} Dec 05 22:16:58 crc kubenswrapper[4747]: I1205 22:16:58.620665 4747 generic.go:334] "Generic (PLEG): container finished" podID="6004ff1a-33d1-4e88-ad69-6de1a34ca34d" containerID="f87eb67e5bb0be65ac06c90d9325ace193a45be0b702da1a6c00975efb76c338" exitCode=0 Dec 05 22:16:58 crc kubenswrapper[4747]: I1205 22:16:58.620759 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42zcn" event={"ID":"6004ff1a-33d1-4e88-ad69-6de1a34ca34d","Type":"ContainerDied","Data":"f87eb67e5bb0be65ac06c90d9325ace193a45be0b702da1a6c00975efb76c338"} Dec 05 22:16:59 crc kubenswrapper[4747]: I1205 22:16:59.632022 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42zcn" event={"ID":"6004ff1a-33d1-4e88-ad69-6de1a34ca34d","Type":"ContainerStarted","Data":"5e5ca237b3ae66012a2276f14f1924dcdb6e84cac75c2c89e24dfd033d67b79c"} Dec 05 22:16:59 crc kubenswrapper[4747]: I1205 22:16:59.662488 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-42zcn" podStartSLOduration=3.221028079 podStartE2EDuration="4.662473594s" podCreationTimestamp="2025-12-05 22:16:55 +0000 UTC" firstStartedPulling="2025-12-05 22:16:57.611346529 +0000 UTC m=+5688.078654057" lastFinishedPulling="2025-12-05 22:16:59.052792074 +0000 UTC m=+5689.520099572" observedRunningTime="2025-12-05 22:16:59.65905406 +0000 UTC m=+5690.126361568" watchObservedRunningTime="2025-12-05 22:16:59.662473594 +0000 UTC m=+5690.129781082" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.605532 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.607505 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.613964 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.619106 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.809397 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.809537 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-config-data\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.809614 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.809670 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd6q6\" (UniqueName: \"kubernetes.io/projected/a59826d8-6177-41ae-9322-4c4800598ba2-kube-api-access-fd6q6\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.809754 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a59826d8-6177-41ae-9322-4c4800598ba2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.809802 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-scripts\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.910923 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-config-data\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.910980 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.911015 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fd6q6\" (UniqueName: \"kubernetes.io/projected/a59826d8-6177-41ae-9322-4c4800598ba2-kube-api-access-fd6q6\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.911069 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a59826d8-6177-41ae-9322-4c4800598ba2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.911097 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-scripts\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.911194 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.911528 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a59826d8-6177-41ae-9322-4c4800598ba2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.916627 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-config-data\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.917444 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-scripts\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.919727 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.935425 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.936548 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd6q6\" (UniqueName: \"kubernetes.io/projected/a59826d8-6177-41ae-9322-4c4800598ba2-kube-api-access-fd6q6\") pod \"cinder-scheduler-0\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " pod="openstack/cinder-scheduler-0" Dec 05 
22:17:01 crc kubenswrapper[4747]: I1205 22:17:01.941927 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 22:17:02 crc kubenswrapper[4747]: I1205 22:17:02.449730 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 22:17:02 crc kubenswrapper[4747]: W1205 22:17:02.458764 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda59826d8_6177_41ae_9322_4c4800598ba2.slice/crio-6b5a3d08f3f038765d0164d6852ca1fea463697ae122868a52c2e7d79aeb449e WatchSource:0}: Error finding container 6b5a3d08f3f038765d0164d6852ca1fea463697ae122868a52c2e7d79aeb449e: Status 404 returned error can't find the container with id 6b5a3d08f3f038765d0164d6852ca1fea463697ae122868a52c2e7d79aeb449e Dec 05 22:17:02 crc kubenswrapper[4747]: I1205 22:17:02.655831 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a59826d8-6177-41ae-9322-4c4800598ba2","Type":"ContainerStarted","Data":"6b5a3d08f3f038765d0164d6852ca1fea463697ae122868a52c2e7d79aeb449e"} Dec 05 22:17:03 crc kubenswrapper[4747]: I1205 22:17:03.013981 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 22:17:03 crc kubenswrapper[4747]: I1205 22:17:03.014613 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1984500d-bdf2-4311-a68d-ecbb5db267ff" containerName="cinder-api-log" containerID="cri-o://8408c923339660e810361bbc1aee6d78bebf1e67ac2e4a25afd29593b377f261" gracePeriod=30 Dec 05 22:17:03 crc kubenswrapper[4747]: I1205 22:17:03.014742 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1984500d-bdf2-4311-a68d-ecbb5db267ff" containerName="cinder-api" containerID="cri-o://1208317768f303c3af437093d0a92dbfb0ff3a40dc447abda50b526db6f0d1d1" gracePeriod=30 Dec 05 22:17:03 crc kubenswrapper[4747]: I1205 22:17:03.685441 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a59826d8-6177-41ae-9322-4c4800598ba2","Type":"ContainerStarted","Data":"9a0a18911da31c88e78b5ff98bdae4d7a1fd0c28d65bb824ad365d1847934278"} Dec 05 22:17:03 crc kubenswrapper[4747]: I1205 22:17:03.687347 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a59826d8-6177-41ae-9322-4c4800598ba2","Type":"ContainerStarted","Data":"937dd27f11c77d7cfb03b96dc47e66540886ea7c476f879350adb97fbc4ec5aa"} Dec 05 22:17:03 crc kubenswrapper[4747]: I1205 22:17:03.689615 4747 generic.go:334] "Generic (PLEG): container finished" podID="1984500d-bdf2-4311-a68d-ecbb5db267ff" containerID="8408c923339660e810361bbc1aee6d78bebf1e67ac2e4a25afd29593b377f261" exitCode=143 Dec 05 22:17:03 crc kubenswrapper[4747]: I1205 22:17:03.689718 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1984500d-bdf2-4311-a68d-ecbb5db267ff","Type":"ContainerDied","Data":"8408c923339660e810361bbc1aee6d78bebf1e67ac2e4a25afd29593b377f261"} Dec 05 22:17:03 crc kubenswrapper[4747]: I1205 22:17:03.707617 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.707579659 podStartE2EDuration="2.707579659s" podCreationTimestamp="2025-12-05 22:17:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-05 22:17:03.705076827 +0000 UTC m=+5694.172384335" watchObservedRunningTime="2025-12-05 22:17:03.707579659 +0000 UTC m=+5694.174887157" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.222804 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.223291 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.223375 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.224633 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.224754 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" gracePeriod=600 Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.302394 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.303080 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:17:06 crc kubenswrapper[4747]: E1205 22:17:06.357981 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.370287 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.670446 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.720467 4747 generic.go:334] "Generic (PLEG): container finished" podID="1984500d-bdf2-4311-a68d-ecbb5db267ff" containerID="1208317768f303c3af437093d0a92dbfb0ff3a40dc447abda50b526db6f0d1d1" exitCode=0 Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.720517 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.720591 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1984500d-bdf2-4311-a68d-ecbb5db267ff","Type":"ContainerDied","Data":"1208317768f303c3af437093d0a92dbfb0ff3a40dc447abda50b526db6f0d1d1"} Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.720620 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1984500d-bdf2-4311-a68d-ecbb5db267ff","Type":"ContainerDied","Data":"41e857ec1d1ac60f9b59f81276cc607e449690d71d51a70c2da173fdf3075e0d"} Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.720637 4747 scope.go:117] "RemoveContainer" containerID="1208317768f303c3af437093d0a92dbfb0ff3a40dc447abda50b526db6f0d1d1" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.723472 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" exitCode=0 Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.723568 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001"} Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.724124 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:17:06 crc kubenswrapper[4747]: E1205 22:17:06.724330 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.745128 4747 scope.go:117] "RemoveContainer" containerID="8408c923339660e810361bbc1aee6d78bebf1e67ac2e4a25afd29593b377f261" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.775203 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.777170 4747 scope.go:117] "RemoveContainer" containerID="1208317768f303c3af437093d0a92dbfb0ff3a40dc447abda50b526db6f0d1d1" Dec 05 22:17:06 crc kubenswrapper[4747]: E1205 22:17:06.777602 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1208317768f303c3af437093d0a92dbfb0ff3a40dc447abda50b526db6f0d1d1\": container with ID starting with 1208317768f303c3af437093d0a92dbfb0ff3a40dc447abda50b526db6f0d1d1 not found: ID does not exist" containerID="1208317768f303c3af437093d0a92dbfb0ff3a40dc447abda50b526db6f0d1d1" Dec 05 22:17:06 crc 
kubenswrapper[4747]: I1205 22:17:06.777636 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1208317768f303c3af437093d0a92dbfb0ff3a40dc447abda50b526db6f0d1d1"} err="failed to get container status \"1208317768f303c3af437093d0a92dbfb0ff3a40dc447abda50b526db6f0d1d1\": rpc error: code = NotFound desc = could not find container \"1208317768f303c3af437093d0a92dbfb0ff3a40dc447abda50b526db6f0d1d1\": container with ID starting with 1208317768f303c3af437093d0a92dbfb0ff3a40dc447abda50b526db6f0d1d1 not found: ID does not exist" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.777662 4747 scope.go:117] "RemoveContainer" containerID="8408c923339660e810361bbc1aee6d78bebf1e67ac2e4a25afd29593b377f261" Dec 05 22:17:06 crc kubenswrapper[4747]: E1205 22:17:06.778108 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8408c923339660e810361bbc1aee6d78bebf1e67ac2e4a25afd29593b377f261\": container with ID starting with 8408c923339660e810361bbc1aee6d78bebf1e67ac2e4a25afd29593b377f261 not found: ID does not exist" containerID="8408c923339660e810361bbc1aee6d78bebf1e67ac2e4a25afd29593b377f261" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.778141 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8408c923339660e810361bbc1aee6d78bebf1e67ac2e4a25afd29593b377f261"} err="failed to get container status \"8408c923339660e810361bbc1aee6d78bebf1e67ac2e4a25afd29593b377f261\": rpc error: code = NotFound desc = could not find container \"8408c923339660e810361bbc1aee6d78bebf1e67ac2e4a25afd29593b377f261\": container with ID starting with 8408c923339660e810361bbc1aee6d78bebf1e67ac2e4a25afd29593b377f261 not found: ID does not exist" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.778159 4747 scope.go:117] "RemoveContainer" containerID="8d02af375e89c74f19ccf61ae35fee2eef75c6582bfac65f00c356eb657c1f9d" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.802277 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-scripts\") pod \"1984500d-bdf2-4311-a68d-ecbb5db267ff\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.802705 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-config-data\") pod \"1984500d-bdf2-4311-a68d-ecbb5db267ff\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.802740 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1984500d-bdf2-4311-a68d-ecbb5db267ff-etc-machine-id\") pod \"1984500d-bdf2-4311-a68d-ecbb5db267ff\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.802818 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-public-tls-certs\") pod \"1984500d-bdf2-4311-a68d-ecbb5db267ff\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.802874 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x5jw\" 
(UniqueName: \"kubernetes.io/projected/1984500d-bdf2-4311-a68d-ecbb5db267ff-kube-api-access-4x5jw\") pod \"1984500d-bdf2-4311-a68d-ecbb5db267ff\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.802947 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-combined-ca-bundle\") pod \"1984500d-bdf2-4311-a68d-ecbb5db267ff\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.802991 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-internal-tls-certs\") pod \"1984500d-bdf2-4311-a68d-ecbb5db267ff\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.803044 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-config-data-custom\") pod \"1984500d-bdf2-4311-a68d-ecbb5db267ff\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.803082 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1984500d-bdf2-4311-a68d-ecbb5db267ff-logs\") pod \"1984500d-bdf2-4311-a68d-ecbb5db267ff\" (UID: \"1984500d-bdf2-4311-a68d-ecbb5db267ff\") " Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.803758 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1984500d-bdf2-4311-a68d-ecbb5db267ff-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1984500d-bdf2-4311-a68d-ecbb5db267ff" (UID: "1984500d-bdf2-4311-a68d-ecbb5db267ff"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.804241 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1984500d-bdf2-4311-a68d-ecbb5db267ff-logs" (OuterVolumeSpecName: "logs") pod "1984500d-bdf2-4311-a68d-ecbb5db267ff" (UID: "1984500d-bdf2-4311-a68d-ecbb5db267ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.813715 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1984500d-bdf2-4311-a68d-ecbb5db267ff" (UID: "1984500d-bdf2-4311-a68d-ecbb5db267ff"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.813763 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-scripts" (OuterVolumeSpecName: "scripts") pod "1984500d-bdf2-4311-a68d-ecbb5db267ff" (UID: "1984500d-bdf2-4311-a68d-ecbb5db267ff"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.822137 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1984500d-bdf2-4311-a68d-ecbb5db267ff-kube-api-access-4x5jw" (OuterVolumeSpecName: "kube-api-access-4x5jw") pod "1984500d-bdf2-4311-a68d-ecbb5db267ff" (UID: "1984500d-bdf2-4311-a68d-ecbb5db267ff"). InnerVolumeSpecName "kube-api-access-4x5jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.849092 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-42zcn"] Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.873957 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1984500d-bdf2-4311-a68d-ecbb5db267ff" (UID: "1984500d-bdf2-4311-a68d-ecbb5db267ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.876052 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-config-data" (OuterVolumeSpecName: "config-data") pod "1984500d-bdf2-4311-a68d-ecbb5db267ff" (UID: "1984500d-bdf2-4311-a68d-ecbb5db267ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.887481 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1984500d-bdf2-4311-a68d-ecbb5db267ff" (UID: "1984500d-bdf2-4311-a68d-ecbb5db267ff"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.892869 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1984500d-bdf2-4311-a68d-ecbb5db267ff" (UID: "1984500d-bdf2-4311-a68d-ecbb5db267ff"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.905407 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.905451 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.905461 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.905469 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1984500d-bdf2-4311-a68d-ecbb5db267ff-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.905478 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.905485 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.905493 4747 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1984500d-bdf2-4311-a68d-ecbb5db267ff-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.905502 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1984500d-bdf2-4311-a68d-ecbb5db267ff-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.905523 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x5jw\" (UniqueName: \"kubernetes.io/projected/1984500d-bdf2-4311-a68d-ecbb5db267ff-kube-api-access-4x5jw\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:06 crc kubenswrapper[4747]: I1205 22:17:06.943035 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.059054 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.075883 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.089790 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 22:17:07 crc kubenswrapper[4747]: E1205 22:17:07.090228 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1984500d-bdf2-4311-a68d-ecbb5db267ff" containerName="cinder-api-log" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.090251 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1984500d-bdf2-4311-a68d-ecbb5db267ff" containerName="cinder-api-log" Dec 05 22:17:07 crc kubenswrapper[4747]: E1205 22:17:07.090287 4747 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1984500d-bdf2-4311-a68d-ecbb5db267ff" containerName="cinder-api" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.090297 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1984500d-bdf2-4311-a68d-ecbb5db267ff" containerName="cinder-api" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.090498 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1984500d-bdf2-4311-a68d-ecbb5db267ff" containerName="cinder-api-log" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.090537 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1984500d-bdf2-4311-a68d-ecbb5db267ff" containerName="cinder-api" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.091685 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.094957 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.094981 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.096275 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.097155 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.213045 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.213119 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vdq2\" (UniqueName: \"kubernetes.io/projected/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-kube-api-access-7vdq2\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.213201 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.213269 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-logs\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.213370 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-config-data-custom\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.213410 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-scripts\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.213437 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-config-data\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.213627 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.213673 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.314992 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vdq2\" (UniqueName: \"kubernetes.io/projected/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-kube-api-access-7vdq2\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.315778 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.315862 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-logs\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.315909 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.316035 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-config-data-custom\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.316103 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-scripts\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.316143 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-config-data\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.316222 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.316256 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.316322 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.316366 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-logs\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.321753 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.322307 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.322505 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-config-data\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.323301 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.323339 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-config-data-custom\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.324470 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-scripts\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.340114 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vdq2\" (UniqueName: \"kubernetes.io/projected/925daa00-d5f8-40fb-8ac6-dcab357bdd5b-kube-api-access-7vdq2\") pod \"cinder-api-0\" (UID: \"925daa00-d5f8-40fb-8ac6-dcab357bdd5b\") " pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.406684 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.853264 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1984500d-bdf2-4311-a68d-ecbb5db267ff" path="/var/lib/kubelet/pods/1984500d-bdf2-4311-a68d-ecbb5db267ff/volumes" Dec 05 22:17:07 crc kubenswrapper[4747]: I1205 22:17:07.900135 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 22:17:07 crc kubenswrapper[4747]: W1205 22:17:07.908529 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod925daa00_d5f8_40fb_8ac6_dcab357bdd5b.slice/crio-15fb850adde58ed9705d8e2ea807c7cd113593a137657894e11b5b63d7badfa1 WatchSource:0}: Error finding container 15fb850adde58ed9705d8e2ea807c7cd113593a137657894e11b5b63d7badfa1: Status 404 returned error can't find the container with id 15fb850adde58ed9705d8e2ea807c7cd113593a137657894e11b5b63d7badfa1 Dec 05 22:17:08 crc kubenswrapper[4747]: I1205 22:17:08.749578 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-42zcn" podUID="6004ff1a-33d1-4e88-ad69-6de1a34ca34d" containerName="registry-server" containerID="cri-o://5e5ca237b3ae66012a2276f14f1924dcdb6e84cac75c2c89e24dfd033d67b79c" gracePeriod=2 Dec 05 22:17:08 crc kubenswrapper[4747]: I1205 22:17:08.749842 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"925daa00-d5f8-40fb-8ac6-dcab357bdd5b","Type":"ContainerStarted","Data":"2e932d8fbfd9585faf9845ecb0914d31523ed590ac34490a40db32b8a7d1b120"} Dec 05 22:17:08 crc kubenswrapper[4747]: I1205 22:17:08.749912 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"925daa00-d5f8-40fb-8ac6-dcab357bdd5b","Type":"ContainerStarted","Data":"15fb850adde58ed9705d8e2ea807c7cd113593a137657894e11b5b63d7badfa1"} Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.094721 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.253959 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z24rz\" (UniqueName: \"kubernetes.io/projected/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-kube-api-access-z24rz\") pod \"6004ff1a-33d1-4e88-ad69-6de1a34ca34d\" (UID: \"6004ff1a-33d1-4e88-ad69-6de1a34ca34d\") " Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.254171 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-catalog-content\") pod \"6004ff1a-33d1-4e88-ad69-6de1a34ca34d\" (UID: \"6004ff1a-33d1-4e88-ad69-6de1a34ca34d\") " Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.254251 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-utilities\") pod \"6004ff1a-33d1-4e88-ad69-6de1a34ca34d\" (UID: \"6004ff1a-33d1-4e88-ad69-6de1a34ca34d\") " Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.255923 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-utilities" (OuterVolumeSpecName: "utilities") pod "6004ff1a-33d1-4e88-ad69-6de1a34ca34d" (UID: "6004ff1a-33d1-4e88-ad69-6de1a34ca34d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.262033 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-kube-api-access-z24rz" (OuterVolumeSpecName: "kube-api-access-z24rz") pod "6004ff1a-33d1-4e88-ad69-6de1a34ca34d" (UID: "6004ff1a-33d1-4e88-ad69-6de1a34ca34d"). InnerVolumeSpecName "kube-api-access-z24rz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.271769 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6004ff1a-33d1-4e88-ad69-6de1a34ca34d" (UID: "6004ff1a-33d1-4e88-ad69-6de1a34ca34d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.357027 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.357052 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.357934 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z24rz\" (UniqueName: \"kubernetes.io/projected/6004ff1a-33d1-4e88-ad69-6de1a34ca34d-kube-api-access-z24rz\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.764182 4747 generic.go:334] "Generic (PLEG): container finished" podID="6004ff1a-33d1-4e88-ad69-6de1a34ca34d" containerID="5e5ca237b3ae66012a2276f14f1924dcdb6e84cac75c2c89e24dfd033d67b79c" exitCode=0 Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.764260 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42zcn" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.764288 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42zcn" event={"ID":"6004ff1a-33d1-4e88-ad69-6de1a34ca34d","Type":"ContainerDied","Data":"5e5ca237b3ae66012a2276f14f1924dcdb6e84cac75c2c89e24dfd033d67b79c"} Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.764329 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42zcn" event={"ID":"6004ff1a-33d1-4e88-ad69-6de1a34ca34d","Type":"ContainerDied","Data":"24042c9c2d4c2067f1a1debd38675acca5f3e3d3f8eaf1d45fa1a8e1a5423caf"} Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.764359 4747 scope.go:117] "RemoveContainer" containerID="5e5ca237b3ae66012a2276f14f1924dcdb6e84cac75c2c89e24dfd033d67b79c" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.768228 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"925daa00-d5f8-40fb-8ac6-dcab357bdd5b","Type":"ContainerStarted","Data":"33f935f32f4ba18dec6722e2bd5ea9835edc4a8c03955cf9aaa7fd7ce4fe30b1"} Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.768579 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.791997 4747 scope.go:117] "RemoveContainer" containerID="f87eb67e5bb0be65ac06c90d9325ace193a45be0b702da1a6c00975efb76c338" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.805086 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.8050255589999997 podStartE2EDuration="2.805025559s" podCreationTimestamp="2025-12-05 22:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:17:09.799659376 +0000 UTC m=+5700.266966894" watchObservedRunningTime="2025-12-05 22:17:09.805025559 +0000 UTC m=+5700.272333077" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.836998 4747 scope.go:117] "RemoveContainer" containerID="a9aa54ada150db88e8ab3a993ed4b1c1e3284d7efa9b66aa959f6e2b6dd9dc9a" Dec 05 22:17:09 crc 
kubenswrapper[4747]: I1205 22:17:09.889269 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-42zcn"] Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.902989 4747 scope.go:117] "RemoveContainer" containerID="5e5ca237b3ae66012a2276f14f1924dcdb6e84cac75c2c89e24dfd033d67b79c" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.904255 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-42zcn"] Dec 05 22:17:09 crc kubenswrapper[4747]: E1205 22:17:09.905316 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e5ca237b3ae66012a2276f14f1924dcdb6e84cac75c2c89e24dfd033d67b79c\": container with ID starting with 5e5ca237b3ae66012a2276f14f1924dcdb6e84cac75c2c89e24dfd033d67b79c not found: ID does not exist" containerID="5e5ca237b3ae66012a2276f14f1924dcdb6e84cac75c2c89e24dfd033d67b79c" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.905393 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e5ca237b3ae66012a2276f14f1924dcdb6e84cac75c2c89e24dfd033d67b79c"} err="failed to get container status \"5e5ca237b3ae66012a2276f14f1924dcdb6e84cac75c2c89e24dfd033d67b79c\": rpc error: code = NotFound desc = could not find container \"5e5ca237b3ae66012a2276f14f1924dcdb6e84cac75c2c89e24dfd033d67b79c\": container with ID starting with 5e5ca237b3ae66012a2276f14f1924dcdb6e84cac75c2c89e24dfd033d67b79c not found: ID does not exist" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.905427 4747 scope.go:117] "RemoveContainer" containerID="f87eb67e5bb0be65ac06c90d9325ace193a45be0b702da1a6c00975efb76c338" Dec 05 22:17:09 crc kubenswrapper[4747]: E1205 22:17:09.905845 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f87eb67e5bb0be65ac06c90d9325ace193a45be0b702da1a6c00975efb76c338\": container with ID starting with f87eb67e5bb0be65ac06c90d9325ace193a45be0b702da1a6c00975efb76c338 not found: ID does not exist" containerID="f87eb67e5bb0be65ac06c90d9325ace193a45be0b702da1a6c00975efb76c338" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.905887 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f87eb67e5bb0be65ac06c90d9325ace193a45be0b702da1a6c00975efb76c338"} err="failed to get container status \"f87eb67e5bb0be65ac06c90d9325ace193a45be0b702da1a6c00975efb76c338\": rpc error: code = NotFound desc = could not find container \"f87eb67e5bb0be65ac06c90d9325ace193a45be0b702da1a6c00975efb76c338\": container with ID starting with f87eb67e5bb0be65ac06c90d9325ace193a45be0b702da1a6c00975efb76c338 not found: ID does not exist" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.905917 4747 scope.go:117] "RemoveContainer" containerID="a9aa54ada150db88e8ab3a993ed4b1c1e3284d7efa9b66aa959f6e2b6dd9dc9a" Dec 05 22:17:09 crc kubenswrapper[4747]: E1205 22:17:09.906240 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9aa54ada150db88e8ab3a993ed4b1c1e3284d7efa9b66aa959f6e2b6dd9dc9a\": container with ID starting with a9aa54ada150db88e8ab3a993ed4b1c1e3284d7efa9b66aa959f6e2b6dd9dc9a not found: ID does not exist" containerID="a9aa54ada150db88e8ab3a993ed4b1c1e3284d7efa9b66aa959f6e2b6dd9dc9a" Dec 05 22:17:09 crc kubenswrapper[4747]: I1205 22:17:09.906277 4747 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9aa54ada150db88e8ab3a993ed4b1c1e3284d7efa9b66aa959f6e2b6dd9dc9a"} err="failed to get container status \"a9aa54ada150db88e8ab3a993ed4b1c1e3284d7efa9b66aa959f6e2b6dd9dc9a\": rpc error: code = NotFound desc = could not find container \"a9aa54ada150db88e8ab3a993ed4b1c1e3284d7efa9b66aa959f6e2b6dd9dc9a\": container with ID starting with a9aa54ada150db88e8ab3a993ed4b1c1e3284d7efa9b66aa959f6e2b6dd9dc9a not found: ID does not exist" Dec 05 22:17:11 crc kubenswrapper[4747]: I1205 22:17:11.861427 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6004ff1a-33d1-4e88-ad69-6de1a34ca34d" path="/var/lib/kubelet/pods/6004ff1a-33d1-4e88-ad69-6de1a34ca34d/volumes" Dec 05 22:17:12 crc kubenswrapper[4747]: I1205 22:17:12.209426 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 22:17:12 crc kubenswrapper[4747]: I1205 22:17:12.261597 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 22:17:12 crc kubenswrapper[4747]: I1205 22:17:12.799453 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a59826d8-6177-41ae-9322-4c4800598ba2" containerName="cinder-scheduler" containerID="cri-o://937dd27f11c77d7cfb03b96dc47e66540886ea7c476f879350adb97fbc4ec5aa" gracePeriod=30 Dec 05 22:17:12 crc kubenswrapper[4747]: I1205 22:17:12.799501 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a59826d8-6177-41ae-9322-4c4800598ba2" containerName="probe" containerID="cri-o://9a0a18911da31c88e78b5ff98bdae4d7a1fd0c28d65bb824ad365d1847934278" gracePeriod=30 Dec 05 22:17:13 crc kubenswrapper[4747]: I1205 22:17:13.809524 4747 generic.go:334] "Generic (PLEG): container finished" podID="a59826d8-6177-41ae-9322-4c4800598ba2" containerID="9a0a18911da31c88e78b5ff98bdae4d7a1fd0c28d65bb824ad365d1847934278" exitCode=0 Dec 05 22:17:13 crc kubenswrapper[4747]: I1205 22:17:13.809598 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a59826d8-6177-41ae-9322-4c4800598ba2","Type":"ContainerDied","Data":"9a0a18911da31c88e78b5ff98bdae4d7a1fd0c28d65bb824ad365d1847934278"} Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.385646 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.467894 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-config-data-custom\") pod \"a59826d8-6177-41ae-9322-4c4800598ba2\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.467954 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-scripts\") pod \"a59826d8-6177-41ae-9322-4c4800598ba2\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.467986 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd6q6\" (UniqueName: \"kubernetes.io/projected/a59826d8-6177-41ae-9322-4c4800598ba2-kube-api-access-fd6q6\") pod \"a59826d8-6177-41ae-9322-4c4800598ba2\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.468010 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-combined-ca-bundle\") pod \"a59826d8-6177-41ae-9322-4c4800598ba2\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.468062 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a59826d8-6177-41ae-9322-4c4800598ba2-etc-machine-id\") pod \"a59826d8-6177-41ae-9322-4c4800598ba2\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.468117 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-config-data\") pod \"a59826d8-6177-41ae-9322-4c4800598ba2\" (UID: \"a59826d8-6177-41ae-9322-4c4800598ba2\") " Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.468529 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a59826d8-6177-41ae-9322-4c4800598ba2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a59826d8-6177-41ae-9322-4c4800598ba2" (UID: "a59826d8-6177-41ae-9322-4c4800598ba2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.473634 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-scripts" (OuterVolumeSpecName: "scripts") pod "a59826d8-6177-41ae-9322-4c4800598ba2" (UID: "a59826d8-6177-41ae-9322-4c4800598ba2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.484957 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a59826d8-6177-41ae-9322-4c4800598ba2-kube-api-access-fd6q6" (OuterVolumeSpecName: "kube-api-access-fd6q6") pod "a59826d8-6177-41ae-9322-4c4800598ba2" (UID: "a59826d8-6177-41ae-9322-4c4800598ba2"). InnerVolumeSpecName "kube-api-access-fd6q6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.486829 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a59826d8-6177-41ae-9322-4c4800598ba2" (UID: "a59826d8-6177-41ae-9322-4c4800598ba2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.525924 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a59826d8-6177-41ae-9322-4c4800598ba2" (UID: "a59826d8-6177-41ae-9322-4c4800598ba2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.570357 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.570389 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.570399 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd6q6\" (UniqueName: \"kubernetes.io/projected/a59826d8-6177-41ae-9322-4c4800598ba2-kube-api-access-fd6q6\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.570410 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.570419 4747 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a59826d8-6177-41ae-9322-4c4800598ba2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.570938 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-config-data" (OuterVolumeSpecName: "config-data") pod "a59826d8-6177-41ae-9322-4c4800598ba2" (UID: "a59826d8-6177-41ae-9322-4c4800598ba2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.672255 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a59826d8-6177-41ae-9322-4c4800598ba2-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.822077 4747 generic.go:334] "Generic (PLEG): container finished" podID="a59826d8-6177-41ae-9322-4c4800598ba2" containerID="937dd27f11c77d7cfb03b96dc47e66540886ea7c476f879350adb97fbc4ec5aa" exitCode=0 Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.822179 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.822234 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a59826d8-6177-41ae-9322-4c4800598ba2","Type":"ContainerDied","Data":"937dd27f11c77d7cfb03b96dc47e66540886ea7c476f879350adb97fbc4ec5aa"} Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.822318 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a59826d8-6177-41ae-9322-4c4800598ba2","Type":"ContainerDied","Data":"6b5a3d08f3f038765d0164d6852ca1fea463697ae122868a52c2e7d79aeb449e"} Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.822380 4747 scope.go:117] "RemoveContainer" containerID="9a0a18911da31c88e78b5ff98bdae4d7a1fd0c28d65bb824ad365d1847934278" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.850275 4747 scope.go:117] "RemoveContainer" containerID="937dd27f11c77d7cfb03b96dc47e66540886ea7c476f879350adb97fbc4ec5aa" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.868294 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.877512 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.882390 4747 scope.go:117] "RemoveContainer" containerID="9a0a18911da31c88e78b5ff98bdae4d7a1fd0c28d65bb824ad365d1847934278" Dec 05 22:17:14 crc kubenswrapper[4747]: E1205 22:17:14.883777 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a0a18911da31c88e78b5ff98bdae4d7a1fd0c28d65bb824ad365d1847934278\": container with ID starting with 9a0a18911da31c88e78b5ff98bdae4d7a1fd0c28d65bb824ad365d1847934278 not found: ID does not exist" containerID="9a0a18911da31c88e78b5ff98bdae4d7a1fd0c28d65bb824ad365d1847934278" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.883843 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a0a18911da31c88e78b5ff98bdae4d7a1fd0c28d65bb824ad365d1847934278"} err="failed to get container status \"9a0a18911da31c88e78b5ff98bdae4d7a1fd0c28d65bb824ad365d1847934278\": rpc error: code = NotFound desc = could not find container \"9a0a18911da31c88e78b5ff98bdae4d7a1fd0c28d65bb824ad365d1847934278\": container with ID starting with 9a0a18911da31c88e78b5ff98bdae4d7a1fd0c28d65bb824ad365d1847934278 not found: ID does not exist" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.883888 4747 scope.go:117] "RemoveContainer" containerID="937dd27f11c77d7cfb03b96dc47e66540886ea7c476f879350adb97fbc4ec5aa" Dec 05 22:17:14 crc kubenswrapper[4747]: E1205 22:17:14.884863 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"937dd27f11c77d7cfb03b96dc47e66540886ea7c476f879350adb97fbc4ec5aa\": container with ID starting with 937dd27f11c77d7cfb03b96dc47e66540886ea7c476f879350adb97fbc4ec5aa not found: ID does not exist" containerID="937dd27f11c77d7cfb03b96dc47e66540886ea7c476f879350adb97fbc4ec5aa" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.884894 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"937dd27f11c77d7cfb03b96dc47e66540886ea7c476f879350adb97fbc4ec5aa"} err="failed to get container status \"937dd27f11c77d7cfb03b96dc47e66540886ea7c476f879350adb97fbc4ec5aa\": rpc 
error: code = NotFound desc = could not find container \"937dd27f11c77d7cfb03b96dc47e66540886ea7c476f879350adb97fbc4ec5aa\": container with ID starting with 937dd27f11c77d7cfb03b96dc47e66540886ea7c476f879350adb97fbc4ec5aa not found: ID does not exist" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.911762 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 22:17:14 crc kubenswrapper[4747]: E1205 22:17:14.912229 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6004ff1a-33d1-4e88-ad69-6de1a34ca34d" containerName="registry-server" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.912251 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6004ff1a-33d1-4e88-ad69-6de1a34ca34d" containerName="registry-server" Dec 05 22:17:14 crc kubenswrapper[4747]: E1205 22:17:14.912279 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59826d8-6177-41ae-9322-4c4800598ba2" containerName="cinder-scheduler" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.912288 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59826d8-6177-41ae-9322-4c4800598ba2" containerName="cinder-scheduler" Dec 05 22:17:14 crc kubenswrapper[4747]: E1205 22:17:14.912311 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6004ff1a-33d1-4e88-ad69-6de1a34ca34d" containerName="extract-content" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.912319 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6004ff1a-33d1-4e88-ad69-6de1a34ca34d" containerName="extract-content" Dec 05 22:17:14 crc kubenswrapper[4747]: E1205 22:17:14.912343 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6004ff1a-33d1-4e88-ad69-6de1a34ca34d" containerName="extract-utilities" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.912351 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6004ff1a-33d1-4e88-ad69-6de1a34ca34d" containerName="extract-utilities" Dec 05 22:17:14 crc kubenswrapper[4747]: E1205 22:17:14.912368 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59826d8-6177-41ae-9322-4c4800598ba2" containerName="probe" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.912376 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59826d8-6177-41ae-9322-4c4800598ba2" containerName="probe" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.912623 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6004ff1a-33d1-4e88-ad69-6de1a34ca34d" containerName="registry-server" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.912648 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59826d8-6177-41ae-9322-4c4800598ba2" containerName="probe" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.912660 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59826d8-6177-41ae-9322-4c4800598ba2" containerName="cinder-scheduler" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.913838 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.916751 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 22:17:14 crc kubenswrapper[4747]: I1205 22:17:14.920564 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.080081 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbqr4\" (UniqueName: \"kubernetes.io/projected/9116d646-20e0-4541-8233-7c6a101b5e40-kube-api-access-zbqr4\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.080229 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9116d646-20e0-4541-8233-7c6a101b5e40-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.080278 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9116d646-20e0-4541-8233-7c6a101b5e40-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.080309 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9116d646-20e0-4541-8233-7c6a101b5e40-scripts\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.080335 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9116d646-20e0-4541-8233-7c6a101b5e40-config-data\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.080429 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9116d646-20e0-4541-8233-7c6a101b5e40-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.182257 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbqr4\" (UniqueName: \"kubernetes.io/projected/9116d646-20e0-4541-8233-7c6a101b5e40-kube-api-access-zbqr4\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.182351 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9116d646-20e0-4541-8233-7c6a101b5e40-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.182392 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9116d646-20e0-4541-8233-7c6a101b5e40-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.182418 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9116d646-20e0-4541-8233-7c6a101b5e40-scripts\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.182442 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9116d646-20e0-4541-8233-7c6a101b5e40-config-data\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.182463 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9116d646-20e0-4541-8233-7c6a101b5e40-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.182527 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9116d646-20e0-4541-8233-7c6a101b5e40-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.187980 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9116d646-20e0-4541-8233-7c6a101b5e40-config-data\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.188172 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9116d646-20e0-4541-8233-7c6a101b5e40-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.192425 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9116d646-20e0-4541-8233-7c6a101b5e40-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.196221 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9116d646-20e0-4541-8233-7c6a101b5e40-scripts\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.200809 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbqr4\" (UniqueName: \"kubernetes.io/projected/9116d646-20e0-4541-8233-7c6a101b5e40-kube-api-access-zbqr4\") pod \"cinder-scheduler-0\" (UID: \"9116d646-20e0-4541-8233-7c6a101b5e40\") " pod="openstack/cinder-scheduler-0" Dec 
05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.264989 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.706735 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.835109 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9116d646-20e0-4541-8233-7c6a101b5e40","Type":"ContainerStarted","Data":"020be8f7946112d9a0d57e4493c72ee2d4d78109eb7f8e851bb9e0839effb284"} Dec 05 22:17:15 crc kubenswrapper[4747]: I1205 22:17:15.854127 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a59826d8-6177-41ae-9322-4c4800598ba2" path="/var/lib/kubelet/pods/a59826d8-6177-41ae-9322-4c4800598ba2/volumes" Dec 05 22:17:16 crc kubenswrapper[4747]: I1205 22:17:16.861141 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9116d646-20e0-4541-8233-7c6a101b5e40","Type":"ContainerStarted","Data":"28c89f5d50ebd3267b6554169ad1c26b364ddbe178e45cb39a331d21a7772bdf"} Dec 05 22:17:17 crc kubenswrapper[4747]: I1205 22:17:17.871950 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9116d646-20e0-4541-8233-7c6a101b5e40","Type":"ContainerStarted","Data":"24996ce17b21feb6f200929d9b9edfe578e2c69f9c4ade27bc6fc7b1d9b3d7c7"} Dec 05 22:17:17 crc kubenswrapper[4747]: I1205 22:17:17.905335 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.905312877 podStartE2EDuration="3.905312877s" podCreationTimestamp="2025-12-05 22:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:17:17.900056407 +0000 UTC m=+5708.367363895" watchObservedRunningTime="2025-12-05 22:17:17.905312877 +0000 UTC m=+5708.372620375" Dec 05 22:17:19 crc kubenswrapper[4747]: I1205 22:17:19.217296 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 05 22:17:19 crc kubenswrapper[4747]: I1205 22:17:19.847560 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:17:19 crc kubenswrapper[4747]: E1205 22:17:19.847870 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:17:20 crc kubenswrapper[4747]: I1205 22:17:20.265721 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 22:17:25 crc kubenswrapper[4747]: I1205 22:17:25.550393 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.024328 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-ndzms"] Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.025940 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ndzms" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.035978 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ndzms"] Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.129420 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljlkb\" (UniqueName: \"kubernetes.io/projected/f6483ce8-8e70-4b43-985e-3b631c83588f-kube-api-access-ljlkb\") pod \"glance-db-create-ndzms\" (UID: \"f6483ce8-8e70-4b43-985e-3b631c83588f\") " pod="openstack/glance-db-create-ndzms" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.129494 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6483ce8-8e70-4b43-985e-3b631c83588f-operator-scripts\") pod \"glance-db-create-ndzms\" (UID: \"f6483ce8-8e70-4b43-985e-3b631c83588f\") " pod="openstack/glance-db-create-ndzms" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.133203 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b40a-account-create-update-qvwpj"] Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.134899 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b40a-account-create-update-qvwpj" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.136949 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.158667 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b40a-account-create-update-qvwpj"] Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.231405 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e-operator-scripts\") pod \"glance-b40a-account-create-update-qvwpj\" (UID: \"403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e\") " pod="openstack/glance-b40a-account-create-update-qvwpj" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.231507 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljlkb\" (UniqueName: \"kubernetes.io/projected/f6483ce8-8e70-4b43-985e-3b631c83588f-kube-api-access-ljlkb\") pod \"glance-db-create-ndzms\" (UID: \"f6483ce8-8e70-4b43-985e-3b631c83588f\") " pod="openstack/glance-db-create-ndzms" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.231560 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6483ce8-8e70-4b43-985e-3b631c83588f-operator-scripts\") pod \"glance-db-create-ndzms\" (UID: \"f6483ce8-8e70-4b43-985e-3b631c83588f\") " pod="openstack/glance-db-create-ndzms" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.231609 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqjvh\" (UniqueName: \"kubernetes.io/projected/403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e-kube-api-access-qqjvh\") pod \"glance-b40a-account-create-update-qvwpj\" (UID: \"403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e\") " pod="openstack/glance-b40a-account-create-update-qvwpj" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.232731 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f6483ce8-8e70-4b43-985e-3b631c83588f-operator-scripts\") pod \"glance-db-create-ndzms\" (UID: \"f6483ce8-8e70-4b43-985e-3b631c83588f\") " pod="openstack/glance-db-create-ndzms" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.249416 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljlkb\" (UniqueName: \"kubernetes.io/projected/f6483ce8-8e70-4b43-985e-3b631c83588f-kube-api-access-ljlkb\") pod \"glance-db-create-ndzms\" (UID: \"f6483ce8-8e70-4b43-985e-3b631c83588f\") " pod="openstack/glance-db-create-ndzms" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.333350 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqjvh\" (UniqueName: \"kubernetes.io/projected/403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e-kube-api-access-qqjvh\") pod \"glance-b40a-account-create-update-qvwpj\" (UID: \"403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e\") " pod="openstack/glance-b40a-account-create-update-qvwpj" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.333500 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e-operator-scripts\") pod \"glance-b40a-account-create-update-qvwpj\" (UID: \"403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e\") " pod="openstack/glance-b40a-account-create-update-qvwpj" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.334361 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e-operator-scripts\") pod \"glance-b40a-account-create-update-qvwpj\" (UID: \"403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e\") " pod="openstack/glance-b40a-account-create-update-qvwpj" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.345530 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ndzms" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.355282 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqjvh\" (UniqueName: \"kubernetes.io/projected/403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e-kube-api-access-qqjvh\") pod \"glance-b40a-account-create-update-qvwpj\" (UID: \"403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e\") " pod="openstack/glance-b40a-account-create-update-qvwpj" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.458967 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b40a-account-create-update-qvwpj" Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.835513 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ndzms"] Dec 05 22:17:28 crc kubenswrapper[4747]: W1205 22:17:28.840632 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6483ce8_8e70_4b43_985e_3b631c83588f.slice/crio-97025efd6714c16b004c412d287a5b228a1da3b4e59c758b5bb965cad60ae19b WatchSource:0}: Error finding container 97025efd6714c16b004c412d287a5b228a1da3b4e59c758b5bb965cad60ae19b: Status 404 returned error can't find the container with id 97025efd6714c16b004c412d287a5b228a1da3b4e59c758b5bb965cad60ae19b Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.950796 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b40a-account-create-update-qvwpj"] Dec 05 22:17:28 crc kubenswrapper[4747]: W1205 22:17:28.963306 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod403e6b20_c7e9_47f0_8d6a_e1b9a5a6e57e.slice/crio-44b1ff9ca5be59c0c5e9b61d9a09f8802033c880d44688fc8c27774c15784181 WatchSource:0}: Error finding container 44b1ff9ca5be59c0c5e9b61d9a09f8802033c880d44688fc8c27774c15784181: Status 404 returned error can't find the container with id 44b1ff9ca5be59c0c5e9b61d9a09f8802033c880d44688fc8c27774c15784181 Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.974806 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ndzms" event={"ID":"f6483ce8-8e70-4b43-985e-3b631c83588f","Type":"ContainerStarted","Data":"97025efd6714c16b004c412d287a5b228a1da3b4e59c758b5bb965cad60ae19b"} Dec 05 22:17:28 crc kubenswrapper[4747]: I1205 22:17:28.976103 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b40a-account-create-update-qvwpj" event={"ID":"403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e","Type":"ContainerStarted","Data":"44b1ff9ca5be59c0c5e9b61d9a09f8802033c880d44688fc8c27774c15784181"} Dec 05 22:17:29 crc kubenswrapper[4747]: I1205 22:17:29.896270 4747 scope.go:117] "RemoveContainer" containerID="9f021ef4e23e1b338cc303cdafd313e622f6613e22b8426b355b12ed35aa410d" Dec 05 22:17:29 crc kubenswrapper[4747]: I1205 22:17:29.987848 4747 generic.go:334] "Generic (PLEG): container finished" podID="f6483ce8-8e70-4b43-985e-3b631c83588f" containerID="07e7f3cfca0a21fa1397f6d900099523df25a689fec7eb18818130c146287456" exitCode=0 Dec 05 22:17:29 crc kubenswrapper[4747]: I1205 22:17:29.987890 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ndzms" event={"ID":"f6483ce8-8e70-4b43-985e-3b631c83588f","Type":"ContainerDied","Data":"07e7f3cfca0a21fa1397f6d900099523df25a689fec7eb18818130c146287456"} Dec 05 22:17:29 crc kubenswrapper[4747]: I1205 22:17:29.989527 4747 generic.go:334] "Generic (PLEG): container finished" podID="403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e" containerID="5780bb4339707268fa0197f40f371dadececeb8c895835934ab6a083469d2fbe" exitCode=0 Dec 05 22:17:29 crc kubenswrapper[4747]: I1205 22:17:29.989556 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b40a-account-create-update-qvwpj" event={"ID":"403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e","Type":"ContainerDied","Data":"5780bb4339707268fa0197f40f371dadececeb8c895835934ab6a083469d2fbe"} Dec 05 22:17:31 crc kubenswrapper[4747]: I1205 22:17:31.476738 4747 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/glance-b40a-account-create-update-qvwpj" Dec 05 22:17:31 crc kubenswrapper[4747]: I1205 22:17:31.484096 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ndzms" Dec 05 22:17:31 crc kubenswrapper[4747]: I1205 22:17:31.616350 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljlkb\" (UniqueName: \"kubernetes.io/projected/f6483ce8-8e70-4b43-985e-3b631c83588f-kube-api-access-ljlkb\") pod \"f6483ce8-8e70-4b43-985e-3b631c83588f\" (UID: \"f6483ce8-8e70-4b43-985e-3b631c83588f\") " Dec 05 22:17:31 crc kubenswrapper[4747]: I1205 22:17:31.616469 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6483ce8-8e70-4b43-985e-3b631c83588f-operator-scripts\") pod \"f6483ce8-8e70-4b43-985e-3b631c83588f\" (UID: \"f6483ce8-8e70-4b43-985e-3b631c83588f\") " Dec 05 22:17:31 crc kubenswrapper[4747]: I1205 22:17:31.616491 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e-operator-scripts\") pod \"403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e\" (UID: \"403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e\") " Dec 05 22:17:31 crc kubenswrapper[4747]: I1205 22:17:31.616625 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqjvh\" (UniqueName: \"kubernetes.io/projected/403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e-kube-api-access-qqjvh\") pod \"403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e\" (UID: \"403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e\") " Dec 05 22:17:31 crc kubenswrapper[4747]: I1205 22:17:31.617366 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6483ce8-8e70-4b43-985e-3b631c83588f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6483ce8-8e70-4b43-985e-3b631c83588f" (UID: "f6483ce8-8e70-4b43-985e-3b631c83588f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:17:31 crc kubenswrapper[4747]: I1205 22:17:31.617543 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e" (UID: "403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:17:31 crc kubenswrapper[4747]: I1205 22:17:31.623572 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e-kube-api-access-qqjvh" (OuterVolumeSpecName: "kube-api-access-qqjvh") pod "403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e" (UID: "403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e"). InnerVolumeSpecName "kube-api-access-qqjvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:17:31 crc kubenswrapper[4747]: I1205 22:17:31.625037 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6483ce8-8e70-4b43-985e-3b631c83588f-kube-api-access-ljlkb" (OuterVolumeSpecName: "kube-api-access-ljlkb") pod "f6483ce8-8e70-4b43-985e-3b631c83588f" (UID: "f6483ce8-8e70-4b43-985e-3b631c83588f"). InnerVolumeSpecName "kube-api-access-ljlkb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:17:31 crc kubenswrapper[4747]: I1205 22:17:31.719560 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqjvh\" (UniqueName: \"kubernetes.io/projected/403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e-kube-api-access-qqjvh\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:31 crc kubenswrapper[4747]: I1205 22:17:31.719621 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljlkb\" (UniqueName: \"kubernetes.io/projected/f6483ce8-8e70-4b43-985e-3b631c83588f-kube-api-access-ljlkb\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:31 crc kubenswrapper[4747]: I1205 22:17:31.719668 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6483ce8-8e70-4b43-985e-3b631c83588f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:31 crc kubenswrapper[4747]: I1205 22:17:31.719682 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:32 crc kubenswrapper[4747]: I1205 22:17:32.019483 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b40a-account-create-update-qvwpj" Dec 05 22:17:32 crc kubenswrapper[4747]: I1205 22:17:32.019466 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b40a-account-create-update-qvwpj" event={"ID":"403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e","Type":"ContainerDied","Data":"44b1ff9ca5be59c0c5e9b61d9a09f8802033c880d44688fc8c27774c15784181"} Dec 05 22:17:32 crc kubenswrapper[4747]: I1205 22:17:32.019672 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44b1ff9ca5be59c0c5e9b61d9a09f8802033c880d44688fc8c27774c15784181" Dec 05 22:17:32 crc kubenswrapper[4747]: I1205 22:17:32.023057 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ndzms" event={"ID":"f6483ce8-8e70-4b43-985e-3b631c83588f","Type":"ContainerDied","Data":"97025efd6714c16b004c412d287a5b228a1da3b4e59c758b5bb965cad60ae19b"} Dec 05 22:17:32 crc kubenswrapper[4747]: I1205 22:17:32.023166 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ndzms" Dec 05 22:17:32 crc kubenswrapper[4747]: I1205 22:17:32.023168 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97025efd6714c16b004c412d287a5b228a1da3b4e59c758b5bb965cad60ae19b" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.330478 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-lh6ts"] Dec 05 22:17:33 crc kubenswrapper[4747]: E1205 22:17:33.331217 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e" containerName="mariadb-account-create-update" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.331233 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e" containerName="mariadb-account-create-update" Dec 05 22:17:33 crc kubenswrapper[4747]: E1205 22:17:33.331254 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6483ce8-8e70-4b43-985e-3b631c83588f" containerName="mariadb-database-create" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.331261 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6483ce8-8e70-4b43-985e-3b631c83588f" containerName="mariadb-database-create" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.331474 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6483ce8-8e70-4b43-985e-3b631c83588f" containerName="mariadb-database-create" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.331498 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e" containerName="mariadb-account-create-update" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.332203 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.335775 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dlgdh" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.339782 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.354394 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lh6ts"] Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.449583 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cgrs\" (UniqueName: \"kubernetes.io/projected/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-kube-api-access-4cgrs\") pod \"glance-db-sync-lh6ts\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.449890 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-db-sync-config-data\") pod \"glance-db-sync-lh6ts\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.450044 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-config-data\") pod \"glance-db-sync-lh6ts\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.450198 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-combined-ca-bundle\") pod \"glance-db-sync-lh6ts\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.551884 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-db-sync-config-data\") pod \"glance-db-sync-lh6ts\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.552021 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-config-data\") pod \"glance-db-sync-lh6ts\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.552059 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-combined-ca-bundle\") pod \"glance-db-sync-lh6ts\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.552087 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cgrs\" (UniqueName: \"kubernetes.io/projected/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-kube-api-access-4cgrs\") pod 
\"glance-db-sync-lh6ts\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.557354 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-combined-ca-bundle\") pod \"glance-db-sync-lh6ts\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.558253 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-db-sync-config-data\") pod \"glance-db-sync-lh6ts\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.558353 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-config-data\") pod \"glance-db-sync-lh6ts\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.572144 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cgrs\" (UniqueName: \"kubernetes.io/projected/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-kube-api-access-4cgrs\") pod \"glance-db-sync-lh6ts\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:33 crc kubenswrapper[4747]: I1205 22:17:33.669112 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:34 crc kubenswrapper[4747]: I1205 22:17:34.201881 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lh6ts"] Dec 05 22:17:34 crc kubenswrapper[4747]: W1205 22:17:34.202502 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87ba03f2_64bd_4d96_88de_1de3b7bdaa07.slice/crio-4cc70614fe7c78c6e3753329db9a322fa3f5b489fb28ee22f1af1ed4d8a57bbf WatchSource:0}: Error finding container 4cc70614fe7c78c6e3753329db9a322fa3f5b489fb28ee22f1af1ed4d8a57bbf: Status 404 returned error can't find the container with id 4cc70614fe7c78c6e3753329db9a322fa3f5b489fb28ee22f1af1ed4d8a57bbf Dec 05 22:17:34 crc kubenswrapper[4747]: I1205 22:17:34.839333 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:17:34 crc kubenswrapper[4747]: E1205 22:17:34.839867 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:17:35 crc kubenswrapper[4747]: I1205 22:17:35.055764 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lh6ts" event={"ID":"87ba03f2-64bd-4d96-88de-1de3b7bdaa07","Type":"ContainerStarted","Data":"9ffe59837d62ec8f4aa84d9e2e77ba1a6c99e76f1e1116e3c92c892b26204df1"} Dec 05 22:17:35 crc kubenswrapper[4747]: I1205 22:17:35.056018 4747 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-db-sync-lh6ts" event={"ID":"87ba03f2-64bd-4d96-88de-1de3b7bdaa07","Type":"ContainerStarted","Data":"4cc70614fe7c78c6e3753329db9a322fa3f5b489fb28ee22f1af1ed4d8a57bbf"} Dec 05 22:17:35 crc kubenswrapper[4747]: I1205 22:17:35.076836 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-lh6ts" podStartSLOduration=2.076810968 podStartE2EDuration="2.076810968s" podCreationTimestamp="2025-12-05 22:17:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:17:35.067766214 +0000 UTC m=+5725.535073702" watchObservedRunningTime="2025-12-05 22:17:35.076810968 +0000 UTC m=+5725.544118466" Dec 05 22:17:39 crc kubenswrapper[4747]: I1205 22:17:39.093030 4747 generic.go:334] "Generic (PLEG): container finished" podID="87ba03f2-64bd-4d96-88de-1de3b7bdaa07" containerID="9ffe59837d62ec8f4aa84d9e2e77ba1a6c99e76f1e1116e3c92c892b26204df1" exitCode=0 Dec 05 22:17:39 crc kubenswrapper[4747]: I1205 22:17:39.093177 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lh6ts" event={"ID":"87ba03f2-64bd-4d96-88de-1de3b7bdaa07","Type":"ContainerDied","Data":"9ffe59837d62ec8f4aa84d9e2e77ba1a6c99e76f1e1116e3c92c892b26204df1"} Dec 05 22:17:40 crc kubenswrapper[4747]: I1205 22:17:40.557316 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:40 crc kubenswrapper[4747]: I1205 22:17:40.696738 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-combined-ca-bundle\") pod \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " Dec 05 22:17:40 crc kubenswrapper[4747]: I1205 22:17:40.696879 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-db-sync-config-data\") pod \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " Dec 05 22:17:40 crc kubenswrapper[4747]: I1205 22:17:40.696988 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cgrs\" (UniqueName: \"kubernetes.io/projected/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-kube-api-access-4cgrs\") pod \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " Dec 05 22:17:40 crc kubenswrapper[4747]: I1205 22:17:40.697010 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-config-data\") pod \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\" (UID: \"87ba03f2-64bd-4d96-88de-1de3b7bdaa07\") " Dec 05 22:17:40 crc kubenswrapper[4747]: I1205 22:17:40.701630 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "87ba03f2-64bd-4d96-88de-1de3b7bdaa07" (UID: "87ba03f2-64bd-4d96-88de-1de3b7bdaa07"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:40 crc kubenswrapper[4747]: I1205 22:17:40.701708 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-kube-api-access-4cgrs" (OuterVolumeSpecName: "kube-api-access-4cgrs") pod "87ba03f2-64bd-4d96-88de-1de3b7bdaa07" (UID: "87ba03f2-64bd-4d96-88de-1de3b7bdaa07"). InnerVolumeSpecName "kube-api-access-4cgrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:17:40 crc kubenswrapper[4747]: I1205 22:17:40.730931 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87ba03f2-64bd-4d96-88de-1de3b7bdaa07" (UID: "87ba03f2-64bd-4d96-88de-1de3b7bdaa07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:40 crc kubenswrapper[4747]: I1205 22:17:40.751782 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-config-data" (OuterVolumeSpecName: "config-data") pod "87ba03f2-64bd-4d96-88de-1de3b7bdaa07" (UID: "87ba03f2-64bd-4d96-88de-1de3b7bdaa07"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:40 crc kubenswrapper[4747]: I1205 22:17:40.800037 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cgrs\" (UniqueName: \"kubernetes.io/projected/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-kube-api-access-4cgrs\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:40 crc kubenswrapper[4747]: I1205 22:17:40.800097 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:40 crc kubenswrapper[4747]: I1205 22:17:40.800118 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:40 crc kubenswrapper[4747]: I1205 22:17:40.800136 4747 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87ba03f2-64bd-4d96-88de-1de3b7bdaa07-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.114789 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lh6ts" event={"ID":"87ba03f2-64bd-4d96-88de-1de3b7bdaa07","Type":"ContainerDied","Data":"4cc70614fe7c78c6e3753329db9a322fa3f5b489fb28ee22f1af1ed4d8a57bbf"} Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.114830 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cc70614fe7c78c6e3753329db9a322fa3f5b489fb28ee22f1af1ed4d8a57bbf" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.115028 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lh6ts" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.448121 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 22:17:41 crc kubenswrapper[4747]: E1205 22:17:41.448539 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ba03f2-64bd-4d96-88de-1de3b7bdaa07" containerName="glance-db-sync" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.448558 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ba03f2-64bd-4d96-88de-1de3b7bdaa07" containerName="glance-db-sync" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.448839 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="87ba03f2-64bd-4d96-88de-1de3b7bdaa07" containerName="glance-db-sync" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.449755 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.453615 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.458867 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.458890 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dlgdh" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.471510 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.517883 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-scripts\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.517970 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.517994 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8076e610-5752-4cdc-91d1-93989f4ff726-logs\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.518014 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8076e610-5752-4cdc-91d1-93989f4ff726-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.518218 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.518295 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlfwf\" (UniqueName: \"kubernetes.io/projected/8076e610-5752-4cdc-91d1-93989f4ff726-kube-api-access-hlfwf\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.538722 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f5bc57f77-b9vqs"] Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.540777 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.563707 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f5bc57f77-b9vqs"] Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.619361 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8076e610-5752-4cdc-91d1-93989f4ff726-logs\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.619402 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8076e610-5752-4cdc-91d1-93989f4ff726-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.619434 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-dns-svc\") pod \"dnsmasq-dns-7f5bc57f77-b9vqs\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.619456 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8hzn\" (UniqueName: \"kubernetes.io/projected/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-kube-api-access-f8hzn\") pod \"dnsmasq-dns-7f5bc57f77-b9vqs\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.619512 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-config-data\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.619564 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlfwf\" (UniqueName: \"kubernetes.io/projected/8076e610-5752-4cdc-91d1-93989f4ff726-kube-api-access-hlfwf\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.619672 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-ovsdbserver-nb\") pod \"dnsmasq-dns-7f5bc57f77-b9vqs\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.619707 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-config\") pod \"dnsmasq-dns-7f5bc57f77-b9vqs\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.619738 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-scripts\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.619794 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-ovsdbserver-sb\") pod \"dnsmasq-dns-7f5bc57f77-b9vqs\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.619822 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.619907 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8076e610-5752-4cdc-91d1-93989f4ff726-logs\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.619987 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8076e610-5752-4cdc-91d1-93989f4ff726-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.625728 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-config-data\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.631056 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-scripts\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.632148 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.649229 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlfwf\" (UniqueName: \"kubernetes.io/projected/8076e610-5752-4cdc-91d1-93989f4ff726-kube-api-access-hlfwf\") pod \"glance-default-external-api-0\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.667700 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.673264 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.677180 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.694528 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.723033 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-ovsdbserver-sb\") pod \"dnsmasq-dns-7f5bc57f77-b9vqs\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.723098 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-dns-svc\") pod \"dnsmasq-dns-7f5bc57f77-b9vqs\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.723122 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8hzn\" (UniqueName: \"kubernetes.io/projected/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-kube-api-access-f8hzn\") pod \"dnsmasq-dns-7f5bc57f77-b9vqs\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.723147 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.723166 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.723265 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-ovsdbserver-nb\") pod 
\"dnsmasq-dns-7f5bc57f77-b9vqs\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.723288 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c23f8967-faa4-4680-871c-9360f9f65249-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.723311 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-config\") pod \"dnsmasq-dns-7f5bc57f77-b9vqs\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.723336 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c23f8967-faa4-4680-871c-9360f9f65249-logs\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.723359 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.723396 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qpn9\" (UniqueName: \"kubernetes.io/projected/c23f8967-faa4-4680-871c-9360f9f65249-kube-api-access-7qpn9\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.724536 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-ovsdbserver-sb\") pod \"dnsmasq-dns-7f5bc57f77-b9vqs\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.725223 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-config\") pod \"dnsmasq-dns-7f5bc57f77-b9vqs\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.725692 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-dns-svc\") pod \"dnsmasq-dns-7f5bc57f77-b9vqs\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.728538 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-ovsdbserver-nb\") pod \"dnsmasq-dns-7f5bc57f77-b9vqs\" (UID: 
\"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.742084 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8hzn\" (UniqueName: \"kubernetes.io/projected/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-kube-api-access-f8hzn\") pod \"dnsmasq-dns-7f5bc57f77-b9vqs\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.769126 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.825412 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.825472 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.825554 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c23f8967-faa4-4680-871c-9360f9f65249-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.825588 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c23f8967-faa4-4680-871c-9360f9f65249-logs\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.825605 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.825636 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qpn9\" (UniqueName: \"kubernetes.io/projected/c23f8967-faa4-4680-871c-9360f9f65249-kube-api-access-7qpn9\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.826372 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c23f8967-faa4-4680-871c-9360f9f65249-logs\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.826505 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c23f8967-faa4-4680-871c-9360f9f65249-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.830102 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.830702 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.833151 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.844374 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qpn9\" (UniqueName: \"kubernetes.io/projected/c23f8967-faa4-4680-871c-9360f9f65249-kube-api-access-7qpn9\") pod \"glance-default-internal-api-0\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:41 crc kubenswrapper[4747]: I1205 22:17:41.867047 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:42 crc kubenswrapper[4747]: I1205 22:17:42.012197 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 22:17:42 crc kubenswrapper[4747]: I1205 22:17:42.348030 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 22:17:42 crc kubenswrapper[4747]: I1205 22:17:42.475384 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f5bc57f77-b9vqs"] Dec 05 22:17:42 crc kubenswrapper[4747]: W1205 22:17:42.485760 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf9279bb_2f0c_48ea_a3dc_4f698b4b9a0c.slice/crio-c4e26ecadad5c679806b8ca3a00652b43554e52c65ab37747a7966bb21465f12 WatchSource:0}: Error finding container c4e26ecadad5c679806b8ca3a00652b43554e52c65ab37747a7966bb21465f12: Status 404 returned error can't find the container with id c4e26ecadad5c679806b8ca3a00652b43554e52c65ab37747a7966bb21465f12 Dec 05 22:17:42 crc kubenswrapper[4747]: I1205 22:17:42.777695 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 22:17:42 crc kubenswrapper[4747]: I1205 22:17:42.899524 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 22:17:43 crc kubenswrapper[4747]: I1205 22:17:43.143120 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c23f8967-faa4-4680-871c-9360f9f65249","Type":"ContainerStarted","Data":"64e1f40e8256abc289f7c55ece640890059eff5fac3c3dd6d7b58eb179361b94"} Dec 05 22:17:43 crc kubenswrapper[4747]: I1205 22:17:43.146502 4747 generic.go:334] "Generic (PLEG): container finished" podID="bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c" containerID="bd05469d2afed8d3ef952a3b52c80714582b0c03a54cf1faddcd868fa722076f" exitCode=0 Dec 05 22:17:43 crc kubenswrapper[4747]: I1205 22:17:43.146785 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" event={"ID":"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c","Type":"ContainerDied","Data":"bd05469d2afed8d3ef952a3b52c80714582b0c03a54cf1faddcd868fa722076f"} Dec 05 22:17:43 crc kubenswrapper[4747]: I1205 22:17:43.146850 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" event={"ID":"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c","Type":"ContainerStarted","Data":"c4e26ecadad5c679806b8ca3a00652b43554e52c65ab37747a7966bb21465f12"} Dec 05 22:17:43 crc kubenswrapper[4747]: I1205 22:17:43.148824 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8076e610-5752-4cdc-91d1-93989f4ff726","Type":"ContainerStarted","Data":"0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473"} Dec 05 22:17:43 crc kubenswrapper[4747]: I1205 22:17:43.148858 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8076e610-5752-4cdc-91d1-93989f4ff726","Type":"ContainerStarted","Data":"21df0ba4dc44bd75f7dd27dc93f1466bdc5b7676b33bf3676624fdbfffc40e67"} Dec 05 22:17:43 crc kubenswrapper[4747]: I1205 22:17:43.603682 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.159444 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"c23f8967-faa4-4680-871c-9360f9f65249","Type":"ContainerStarted","Data":"5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f"} Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.159944 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c23f8967-faa4-4680-871c-9360f9f65249","Type":"ContainerStarted","Data":"2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695"} Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.159669 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c23f8967-faa4-4680-871c-9360f9f65249" containerName="glance-log" containerID="cri-o://2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695" gracePeriod=30 Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.159614 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c23f8967-faa4-4680-871c-9360f9f65249" containerName="glance-httpd" containerID="cri-o://5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f" gracePeriod=30 Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.163654 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" event={"ID":"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c","Type":"ContainerStarted","Data":"400d88ecfbe4118ef55474a7cf555e41df0361e6746fdf68483077a8b777f618"} Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.163807 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.165686 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8076e610-5752-4cdc-91d1-93989f4ff726","Type":"ContainerStarted","Data":"178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05"} Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.165814 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8076e610-5752-4cdc-91d1-93989f4ff726" containerName="glance-log" containerID="cri-o://0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473" gracePeriod=30 Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.165903 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8076e610-5752-4cdc-91d1-93989f4ff726" containerName="glance-httpd" containerID="cri-o://178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05" gracePeriod=30 Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.189368 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.189352815 podStartE2EDuration="3.189352815s" podCreationTimestamp="2025-12-05 22:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:17:44.188181556 +0000 UTC m=+5734.655489044" watchObservedRunningTime="2025-12-05 22:17:44.189352815 +0000 UTC m=+5734.656660303" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.217405 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.2173872 podStartE2EDuration="3.2173872s" 
podCreationTimestamp="2025-12-05 22:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:17:44.215123074 +0000 UTC m=+5734.682430562" watchObservedRunningTime="2025-12-05 22:17:44.2173872 +0000 UTC m=+5734.684694688" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.239557 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" podStartSLOduration=3.239538379 podStartE2EDuration="3.239538379s" podCreationTimestamp="2025-12-05 22:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:17:44.231644573 +0000 UTC m=+5734.698952071" watchObservedRunningTime="2025-12-05 22:17:44.239538379 +0000 UTC m=+5734.706845867" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.834107 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.891028 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c23f8967-faa4-4680-871c-9360f9f65249-httpd-run\") pod \"c23f8967-faa4-4680-871c-9360f9f65249\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.891074 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-config-data\") pod \"c23f8967-faa4-4680-871c-9360f9f65249\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.891247 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c23f8967-faa4-4680-871c-9360f9f65249-logs\") pod \"c23f8967-faa4-4680-871c-9360f9f65249\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.891298 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-combined-ca-bundle\") pod \"c23f8967-faa4-4680-871c-9360f9f65249\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.891343 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-scripts\") pod \"c23f8967-faa4-4680-871c-9360f9f65249\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.891368 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qpn9\" (UniqueName: \"kubernetes.io/projected/c23f8967-faa4-4680-871c-9360f9f65249-kube-api-access-7qpn9\") pod \"c23f8967-faa4-4680-871c-9360f9f65249\" (UID: \"c23f8967-faa4-4680-871c-9360f9f65249\") " Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.892215 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c23f8967-faa4-4680-871c-9360f9f65249-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c23f8967-faa4-4680-871c-9360f9f65249" (UID: "c23f8967-faa4-4680-871c-9360f9f65249"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.892237 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c23f8967-faa4-4680-871c-9360f9f65249-logs" (OuterVolumeSpecName: "logs") pod "c23f8967-faa4-4680-871c-9360f9f65249" (UID: "c23f8967-faa4-4680-871c-9360f9f65249"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.903040 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-scripts" (OuterVolumeSpecName: "scripts") pod "c23f8967-faa4-4680-871c-9360f9f65249" (UID: "c23f8967-faa4-4680-871c-9360f9f65249"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.906375 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c23f8967-faa4-4680-871c-9360f9f65249-kube-api-access-7qpn9" (OuterVolumeSpecName: "kube-api-access-7qpn9") pod "c23f8967-faa4-4680-871c-9360f9f65249" (UID: "c23f8967-faa4-4680-871c-9360f9f65249"). InnerVolumeSpecName "kube-api-access-7qpn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.932437 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c23f8967-faa4-4680-871c-9360f9f65249" (UID: "c23f8967-faa4-4680-871c-9360f9f65249"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.946924 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-config-data" (OuterVolumeSpecName: "config-data") pod "c23f8967-faa4-4680-871c-9360f9f65249" (UID: "c23f8967-faa4-4680-871c-9360f9f65249"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.993689 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c23f8967-faa4-4680-871c-9360f9f65249-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.993720 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.993732 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c23f8967-faa4-4680-871c-9360f9f65249-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.993743 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.993755 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23f8967-faa4-4680-871c-9360f9f65249-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:44 crc kubenswrapper[4747]: I1205 22:17:44.993764 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qpn9\" (UniqueName: \"kubernetes.io/projected/c23f8967-faa4-4680-871c-9360f9f65249-kube-api-access-7qpn9\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.016278 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.095270 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8076e610-5752-4cdc-91d1-93989f4ff726-logs\") pod \"8076e610-5752-4cdc-91d1-93989f4ff726\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.095369 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-combined-ca-bundle\") pod \"8076e610-5752-4cdc-91d1-93989f4ff726\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.095421 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlfwf\" (UniqueName: \"kubernetes.io/projected/8076e610-5752-4cdc-91d1-93989f4ff726-kube-api-access-hlfwf\") pod \"8076e610-5752-4cdc-91d1-93989f4ff726\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.095450 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-scripts\") pod \"8076e610-5752-4cdc-91d1-93989f4ff726\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.095508 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8076e610-5752-4cdc-91d1-93989f4ff726-httpd-run\") pod \"8076e610-5752-4cdc-91d1-93989f4ff726\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") 
" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.095658 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-config-data\") pod \"8076e610-5752-4cdc-91d1-93989f4ff726\" (UID: \"8076e610-5752-4cdc-91d1-93989f4ff726\") " Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.095825 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8076e610-5752-4cdc-91d1-93989f4ff726-logs" (OuterVolumeSpecName: "logs") pod "8076e610-5752-4cdc-91d1-93989f4ff726" (UID: "8076e610-5752-4cdc-91d1-93989f4ff726"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.095838 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8076e610-5752-4cdc-91d1-93989f4ff726-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8076e610-5752-4cdc-91d1-93989f4ff726" (UID: "8076e610-5752-4cdc-91d1-93989f4ff726"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.096140 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8076e610-5752-4cdc-91d1-93989f4ff726-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.096162 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8076e610-5752-4cdc-91d1-93989f4ff726-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.100258 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-scripts" (OuterVolumeSpecName: "scripts") pod "8076e610-5752-4cdc-91d1-93989f4ff726" (UID: "8076e610-5752-4cdc-91d1-93989f4ff726"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.100284 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8076e610-5752-4cdc-91d1-93989f4ff726-kube-api-access-hlfwf" (OuterVolumeSpecName: "kube-api-access-hlfwf") pod "8076e610-5752-4cdc-91d1-93989f4ff726" (UID: "8076e610-5752-4cdc-91d1-93989f4ff726"). InnerVolumeSpecName "kube-api-access-hlfwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.123788 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8076e610-5752-4cdc-91d1-93989f4ff726" (UID: "8076e610-5752-4cdc-91d1-93989f4ff726"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.146535 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-config-data" (OuterVolumeSpecName: "config-data") pod "8076e610-5752-4cdc-91d1-93989f4ff726" (UID: "8076e610-5752-4cdc-91d1-93989f4ff726"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.178229 4747 generic.go:334] "Generic (PLEG): container finished" podID="c23f8967-faa4-4680-871c-9360f9f65249" containerID="5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f" exitCode=143 Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.178258 4747 generic.go:334] "Generic (PLEG): container finished" podID="c23f8967-faa4-4680-871c-9360f9f65249" containerID="2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695" exitCode=143 Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.178280 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c23f8967-faa4-4680-871c-9360f9f65249","Type":"ContainerDied","Data":"5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f"} Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.178313 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.178326 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c23f8967-faa4-4680-871c-9360f9f65249","Type":"ContainerDied","Data":"2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695"} Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.178337 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c23f8967-faa4-4680-871c-9360f9f65249","Type":"ContainerDied","Data":"64e1f40e8256abc289f7c55ece640890059eff5fac3c3dd6d7b58eb179361b94"} Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.178352 4747 scope.go:117] "RemoveContainer" containerID="5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.182541 4747 generic.go:334] "Generic (PLEG): container finished" podID="8076e610-5752-4cdc-91d1-93989f4ff726" containerID="178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05" exitCode=0 Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.182624 4747 generic.go:334] "Generic (PLEG): container finished" podID="8076e610-5752-4cdc-91d1-93989f4ff726" containerID="0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473" exitCode=143 Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.182706 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.182765 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8076e610-5752-4cdc-91d1-93989f4ff726","Type":"ContainerDied","Data":"178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05"} Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.182816 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8076e610-5752-4cdc-91d1-93989f4ff726","Type":"ContainerDied","Data":"0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473"} Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.182839 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8076e610-5752-4cdc-91d1-93989f4ff726","Type":"ContainerDied","Data":"21df0ba4dc44bd75f7dd27dc93f1466bdc5b7676b33bf3676624fdbfffc40e67"} Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.198892 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.198923 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlfwf\" (UniqueName: \"kubernetes.io/projected/8076e610-5752-4cdc-91d1-93989f4ff726-kube-api-access-hlfwf\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.198939 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.198950 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8076e610-5752-4cdc-91d1-93989f4ff726-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.204887 4747 scope.go:117] "RemoveContainer" containerID="2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.220969 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.230943 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.243034 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.245256 4747 scope.go:117] "RemoveContainer" containerID="5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f" Dec 05 22:17:45 crc kubenswrapper[4747]: E1205 22:17:45.245634 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f\": container with ID starting with 5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f not found: ID does not exist" containerID="5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.245674 4747 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f"} err="failed to get container status \"5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f\": rpc error: code = NotFound desc = could not find container \"5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f\": container with ID starting with 5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f not found: ID does not exist" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.245701 4747 scope.go:117] "RemoveContainer" containerID="2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695" Dec 05 22:17:45 crc kubenswrapper[4747]: E1205 22:17:45.246019 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695\": container with ID starting with 2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695 not found: ID does not exist" containerID="2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.246047 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695"} err="failed to get container status \"2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695\": rpc error: code = NotFound desc = could not find container \"2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695\": container with ID starting with 2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695 not found: ID does not exist" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.246066 4747 scope.go:117] "RemoveContainer" containerID="5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.246668 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f"} err="failed to get container status \"5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f\": rpc error: code = NotFound desc = could not find container \"5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f\": container with ID starting with 5d51a9dbc6239d1ed1ec199bf9334bc4ec2bde68c642c5acaba0295aa587914f not found: ID does not exist" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.246698 4747 scope.go:117] "RemoveContainer" containerID="2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.247691 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695"} err="failed to get container status \"2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695\": rpc error: code = NotFound desc = could not find container \"2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695\": container with ID starting with 2666882c2643bd42248b7eae82b148c49bc5e0f2fd65e36cad81d3baeceea695 not found: ID does not exist" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.247721 4747 scope.go:117] "RemoveContainer" containerID="178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.254546 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-external-api-0"] Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.280288 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 22:17:45 crc kubenswrapper[4747]: E1205 22:17:45.280753 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23f8967-faa4-4680-871c-9360f9f65249" containerName="glance-httpd" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.280768 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23f8967-faa4-4680-871c-9360f9f65249" containerName="glance-httpd" Dec 05 22:17:45 crc kubenswrapper[4747]: E1205 22:17:45.280780 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8076e610-5752-4cdc-91d1-93989f4ff726" containerName="glance-httpd" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.280789 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8076e610-5752-4cdc-91d1-93989f4ff726" containerName="glance-httpd" Dec 05 22:17:45 crc kubenswrapper[4747]: E1205 22:17:45.280813 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23f8967-faa4-4680-871c-9360f9f65249" containerName="glance-log" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.280820 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23f8967-faa4-4680-871c-9360f9f65249" containerName="glance-log" Dec 05 22:17:45 crc kubenswrapper[4747]: E1205 22:17:45.280840 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8076e610-5752-4cdc-91d1-93989f4ff726" containerName="glance-log" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.280847 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8076e610-5752-4cdc-91d1-93989f4ff726" containerName="glance-log" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.281048 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8076e610-5752-4cdc-91d1-93989f4ff726" containerName="glance-log" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.281062 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8076e610-5752-4cdc-91d1-93989f4ff726" containerName="glance-httpd" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.281076 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c23f8967-faa4-4680-871c-9360f9f65249" containerName="glance-httpd" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.281090 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c23f8967-faa4-4680-871c-9360f9f65249" containerName="glance-log" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.282255 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.284740 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.285022 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.285098 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dlgdh" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.287853 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.294086 4747 scope.go:117] "RemoveContainer" containerID="0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.303140 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.309912 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.322429 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.322986 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.334464 4747 scope.go:117] "RemoveContainer" containerID="178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05" Dec 05 22:17:45 crc kubenswrapper[4747]: E1205 22:17:45.338894 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05\": container with ID starting with 178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05 not found: ID does not exist" containerID="178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.338998 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05"} err="failed to get container status \"178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05\": rpc error: code = NotFound desc = could not find container \"178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05\": container with ID starting with 178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05 not found: ID does not exist" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.339039 4747 scope.go:117] "RemoveContainer" containerID="0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473" Dec 05 22:17:45 crc kubenswrapper[4747]: E1205 22:17:45.339720 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473\": container with ID starting with 0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473 not found: ID does not exist" containerID="0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473" Dec 05 22:17:45 crc 
kubenswrapper[4747]: I1205 22:17:45.339792 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473"} err="failed to get container status \"0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473\": rpc error: code = NotFound desc = could not find container \"0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473\": container with ID starting with 0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473 not found: ID does not exist" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.339828 4747 scope.go:117] "RemoveContainer" containerID="178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.343084 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05"} err="failed to get container status \"178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05\": rpc error: code = NotFound desc = could not find container \"178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05\": container with ID starting with 178225b4c54b014924f5182512c37063c3bb292beba17a2e1dcddac11ce8ad05 not found: ID does not exist" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.343127 4747 scope.go:117] "RemoveContainer" containerID="0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.343503 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473"} err="failed to get container status \"0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473\": rpc error: code = NotFound desc = could not find container \"0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473\": container with ID starting with 0a3e16d0c343b6d48916a95485636734a56f764f93f4f660812d67ff95cc6473 not found: ID does not exist" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.346720 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.372138 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.403742 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.403800 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.403829 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.403902 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.403919 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.403935 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2371f26a-47e9-40d3-ae77-cd7a4810fee3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.403968 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmsnp\" (UniqueName: \"kubernetes.io/projected/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-kube-api-access-hmsnp\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.403989 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn9jf\" (UniqueName: \"kubernetes.io/projected/2371f26a-47e9-40d3-ae77-cd7a4810fee3-kube-api-access-qn9jf\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.404027 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-scripts\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.404060 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.404090 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.404124 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.404151 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-config-data\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.404175 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2371f26a-47e9-40d3-ae77-cd7a4810fee3-logs\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.505456 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn9jf\" (UniqueName: \"kubernetes.io/projected/2371f26a-47e9-40d3-ae77-cd7a4810fee3-kube-api-access-qn9jf\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.505514 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-scripts\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.505545 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.505569 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.505609 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.506349 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-config-data\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.506402 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2371f26a-47e9-40d3-ae77-cd7a4810fee3-logs\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.506439 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.506489 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.506519 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.506612 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.506633 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.506657 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2371f26a-47e9-40d3-ae77-cd7a4810fee3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.506705 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmsnp\" (UniqueName: \"kubernetes.io/projected/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-kube-api-access-hmsnp\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.507017 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2371f26a-47e9-40d3-ae77-cd7a4810fee3-logs\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.507578 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.508174 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.508548 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2371f26a-47e9-40d3-ae77-cd7a4810fee3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.510723 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.511805 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.511969 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-scripts\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.512544 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.512875 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.514135 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.518171 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.520568 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-config-data\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.526499 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn9jf\" (UniqueName: \"kubernetes.io/projected/2371f26a-47e9-40d3-ae77-cd7a4810fee3-kube-api-access-qn9jf\") pod \"glance-default-external-api-0\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.540110 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmsnp\" (UniqueName: \"kubernetes.io/projected/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-kube-api-access-hmsnp\") pod \"glance-default-internal-api-0\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.603512 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.636244 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.856629 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8076e610-5752-4cdc-91d1-93989f4ff726" path="/var/lib/kubelet/pods/8076e610-5752-4cdc-91d1-93989f4ff726/volumes" Dec 05 22:17:45 crc kubenswrapper[4747]: I1205 22:17:45.858542 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c23f8967-faa4-4680-871c-9360f9f65249" path="/var/lib/kubelet/pods/c23f8967-faa4-4680-871c-9360f9f65249/volumes" Dec 05 22:17:46 crc kubenswrapper[4747]: I1205 22:17:46.923246 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 22:17:47 crc kubenswrapper[4747]: I1205 22:17:47.211131 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2371f26a-47e9-40d3-ae77-cd7a4810fee3","Type":"ContainerStarted","Data":"55b0ec33de6b756d63825aeddbe0ccb3e4c8bf016b840210a9d456324648f524"} Dec 05 22:17:47 crc kubenswrapper[4747]: I1205 22:17:47.714655 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 22:17:47 crc kubenswrapper[4747]: W1205 22:17:47.718034 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54381b7b_e3f0_4729_b0d6_5a6fb055e9bd.slice/crio-fab77c22f6475cabe7a8c0eee8fd75649891ccf3a0e39b506185865f5920cb44 WatchSource:0}: Error finding container fab77c22f6475cabe7a8c0eee8fd75649891ccf3a0e39b506185865f5920cb44: Status 404 returned error can't find the container with id fab77c22f6475cabe7a8c0eee8fd75649891ccf3a0e39b506185865f5920cb44 Dec 05 22:17:47 crc kubenswrapper[4747]: I1205 22:17:47.840612 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:17:47 crc kubenswrapper[4747]: E1205 22:17:47.840870 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:17:48 crc kubenswrapper[4747]: I1205 22:17:48.219403 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd","Type":"ContainerStarted","Data":"fab77c22f6475cabe7a8c0eee8fd75649891ccf3a0e39b506185865f5920cb44"} Dec 05 22:17:48 crc kubenswrapper[4747]: I1205 22:17:48.221349 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2371f26a-47e9-40d3-ae77-cd7a4810fee3","Type":"ContainerStarted","Data":"773c06d7432bd5bb33c2966d73800183d2665781d36ab2f70880ea1c644c220b"} Dec 05 22:17:48 crc kubenswrapper[4747]: I1205 22:17:48.221386 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2371f26a-47e9-40d3-ae77-cd7a4810fee3","Type":"ContainerStarted","Data":"f66a336b0a6ac3bad02b0a3fca4beb05022ad4585d93c67c49d4e5b2bd0df5e4"} Dec 05 22:17:48 crc kubenswrapper[4747]: I1205 22:17:48.245429 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.24540768 podStartE2EDuration="3.24540768s" podCreationTimestamp="2025-12-05 22:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:17:48.239132595 +0000 UTC m=+5738.706440083" watchObservedRunningTime="2025-12-05 22:17:48.24540768 +0000 UTC m=+5738.712715168" Dec 05 22:17:49 crc kubenswrapper[4747]: I1205 22:17:49.236345 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd","Type":"ContainerStarted","Data":"a995c30578be67c81ce2c11d12d51e2655a163a8034402e517cca58f37735866"} Dec 05 22:17:49 crc kubenswrapper[4747]: I1205 22:17:49.236788 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd","Type":"ContainerStarted","Data":"5f2f6e50caf6b0c63bfc6deb165a46f6301ee7209445d4f62bad5c2da65fa3bb"} Dec 05 22:17:49 crc kubenswrapper[4747]: I1205 22:17:49.276213 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.276181537 podStartE2EDuration="4.276181537s" podCreationTimestamp="2025-12-05 22:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:17:49.266343853 +0000 UTC m=+5739.733651361" watchObservedRunningTime="2025-12-05 22:17:49.276181537 +0000 UTC m=+5739.743489065" Dec 05 22:17:51 crc kubenswrapper[4747]: I1205 22:17:51.868873 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:17:51 crc kubenswrapper[4747]: I1205 22:17:51.977337 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6755cfdfb7-vn6cj"] Dec 05 22:17:51 crc kubenswrapper[4747]: I1205 22:17:51.977651 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" 
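
[annotation] The machine-config-daemon line above is unrelated background noise from the same node: a container in CrashLoopBackOff that has already reached the kubelet's maximum restart delay. The delay starts at 10s and doubles per failed restart, capped at 5m — exactly the "back-off 5m0s" quoted:

package main

import (
	"fmt"
	"time"
)

func main() {
	delay, maxDelay := 10*time.Second, 5*time.Minute
	for restart := 1; restart <= 7; restart++ {
		fmt.Printf("restart %d: back-off %v\n", restart, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay // 10s, 20s, 40s, 1m20s, 2m40s, then pinned at 5m0s
		}
	}
}
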
podUID="84801e85-4aa3-4695-9cae-c28fcd1211fe" containerName="dnsmasq-dns" containerID="cri-o://25e64646a8fe3ced8eae324d583f33ec41233ec1baa6ebedae388b8200d49183" gracePeriod=10 Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.271702 4747 generic.go:334] "Generic (PLEG): container finished" podID="84801e85-4aa3-4695-9cae-c28fcd1211fe" containerID="25e64646a8fe3ced8eae324d583f33ec41233ec1baa6ebedae388b8200d49183" exitCode=0 Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.271836 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" event={"ID":"84801e85-4aa3-4695-9cae-c28fcd1211fe","Type":"ContainerDied","Data":"25e64646a8fe3ced8eae324d583f33ec41233ec1baa6ebedae388b8200d49183"} Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.525951 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.550331 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-dns-svc\") pod \"84801e85-4aa3-4695-9cae-c28fcd1211fe\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.550519 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ccqd\" (UniqueName: \"kubernetes.io/projected/84801e85-4aa3-4695-9cae-c28fcd1211fe-kube-api-access-5ccqd\") pod \"84801e85-4aa3-4695-9cae-c28fcd1211fe\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.550620 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-config\") pod \"84801e85-4aa3-4695-9cae-c28fcd1211fe\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.550657 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-ovsdbserver-nb\") pod \"84801e85-4aa3-4695-9cae-c28fcd1211fe\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.550732 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-ovsdbserver-sb\") pod \"84801e85-4aa3-4695-9cae-c28fcd1211fe\" (UID: \"84801e85-4aa3-4695-9cae-c28fcd1211fe\") " Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.558281 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84801e85-4aa3-4695-9cae-c28fcd1211fe-kube-api-access-5ccqd" (OuterVolumeSpecName: "kube-api-access-5ccqd") pod "84801e85-4aa3-4695-9cae-c28fcd1211fe" (UID: "84801e85-4aa3-4695-9cae-c28fcd1211fe"). InnerVolumeSpecName "kube-api-access-5ccqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.608635 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "84801e85-4aa3-4695-9cae-c28fcd1211fe" (UID: "84801e85-4aa3-4695-9cae-c28fcd1211fe"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.623829 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-config" (OuterVolumeSpecName: "config") pod "84801e85-4aa3-4695-9cae-c28fcd1211fe" (UID: "84801e85-4aa3-4695-9cae-c28fcd1211fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.629604 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "84801e85-4aa3-4695-9cae-c28fcd1211fe" (UID: "84801e85-4aa3-4695-9cae-c28fcd1211fe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.634147 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "84801e85-4aa3-4695-9cae-c28fcd1211fe" (UID: "84801e85-4aa3-4695-9cae-c28fcd1211fe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.652429 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ccqd\" (UniqueName: \"kubernetes.io/projected/84801e85-4aa3-4695-9cae-c28fcd1211fe-kube-api-access-5ccqd\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.652474 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.652487 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.652498 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:52 crc kubenswrapper[4747]: I1205 22:17:52.652509 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84801e85-4aa3-4695-9cae-c28fcd1211fe-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 22:17:53 crc kubenswrapper[4747]: I1205 22:17:53.286800 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" event={"ID":"84801e85-4aa3-4695-9cae-c28fcd1211fe","Type":"ContainerDied","Data":"ba0870f350818fae72bc459b4d58f801c78161a278b6bb82b0851e9b4e1f7f57"} Dec 05 22:17:53 crc kubenswrapper[4747]: I1205 22:17:53.286867 4747 scope.go:117] "RemoveContainer" containerID="25e64646a8fe3ced8eae324d583f33ec41233ec1baa6ebedae388b8200d49183" Dec 05 22:17:53 crc kubenswrapper[4747]: I1205 22:17:53.287096 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6755cfdfb7-vn6cj" Dec 05 22:17:53 crc kubenswrapper[4747]: I1205 22:17:53.342745 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6755cfdfb7-vn6cj"] Dec 05 22:17:53 crc kubenswrapper[4747]: I1205 22:17:53.343543 4747 scope.go:117] "RemoveContainer" containerID="118e61ce0a05d5548b10e6cf0f35230dc06e30c61496e00a546f3cd0f5078614" Dec 05 22:17:53 crc kubenswrapper[4747]: I1205 22:17:53.349943 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6755cfdfb7-vn6cj"] Dec 05 22:17:53 crc kubenswrapper[4747]: I1205 22:17:53.861921 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84801e85-4aa3-4695-9cae-c28fcd1211fe" path="/var/lib/kubelet/pods/84801e85-4aa3-4695-9cae-c28fcd1211fe/volumes" Dec 05 22:17:55 crc kubenswrapper[4747]: I1205 22:17:55.604437 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 22:17:55 crc kubenswrapper[4747]: I1205 22:17:55.604480 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 22:17:55 crc kubenswrapper[4747]: I1205 22:17:55.637794 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 22:17:55 crc kubenswrapper[4747]: I1205 22:17:55.637854 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 22:17:55 crc kubenswrapper[4747]: I1205 22:17:55.651314 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 22:17:55 crc kubenswrapper[4747]: I1205 22:17:55.674534 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 22:17:55 crc kubenswrapper[4747]: I1205 22:17:55.690172 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 22:17:55 crc kubenswrapper[4747]: I1205 22:17:55.702264 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 22:17:56 crc kubenswrapper[4747]: I1205 22:17:56.327547 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 22:17:56 crc kubenswrapper[4747]: I1205 22:17:56.327881 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 22:17:56 crc kubenswrapper[4747]: I1205 22:17:56.328006 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 22:17:56 crc kubenswrapper[4747]: I1205 22:17:56.328060 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 22:17:58 crc kubenswrapper[4747]: I1205 22:17:58.193832 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 22:17:58 crc kubenswrapper[4747]: I1205 22:17:58.267311 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 22:17:58 crc kubenswrapper[4747]: I1205 22:17:58.270550 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Dec 05 22:17:58 crc kubenswrapper[4747]: I1205 22:17:58.298678 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 22:18:00 crc kubenswrapper[4747]: I1205 22:18:00.840807 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:18:00 crc kubenswrapper[4747]: E1205 22:18:00.842403 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.393825 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rfdwp"] Dec 05 22:18:06 crc kubenswrapper[4747]: E1205 22:18:06.394753 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84801e85-4aa3-4695-9cae-c28fcd1211fe" containerName="dnsmasq-dns" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.394769 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="84801e85-4aa3-4695-9cae-c28fcd1211fe" containerName="dnsmasq-dns" Dec 05 22:18:06 crc kubenswrapper[4747]: E1205 22:18:06.394786 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84801e85-4aa3-4695-9cae-c28fcd1211fe" containerName="init" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.394794 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="84801e85-4aa3-4695-9cae-c28fcd1211fe" containerName="init" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.395019 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="84801e85-4aa3-4695-9cae-c28fcd1211fe" containerName="dnsmasq-dns" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.395740 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rfdwp" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.420359 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rfdwp"] Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.470613 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fzjx\" (UniqueName: \"kubernetes.io/projected/dbd5a427-7846-4f98-bdef-c3b116a97cbb-kube-api-access-6fzjx\") pod \"placement-db-create-rfdwp\" (UID: \"dbd5a427-7846-4f98-bdef-c3b116a97cbb\") " pod="openstack/placement-db-create-rfdwp" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.470795 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbd5a427-7846-4f98-bdef-c3b116a97cbb-operator-scripts\") pod \"placement-db-create-rfdwp\" (UID: \"dbd5a427-7846-4f98-bdef-c3b116a97cbb\") " pod="openstack/placement-db-create-rfdwp" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.502706 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-874a-account-create-update-2pb5w"] Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.504048 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-874a-account-create-update-2pb5w" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.506294 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.519907 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-874a-account-create-update-2pb5w"] Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.573191 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbd5a427-7846-4f98-bdef-c3b116a97cbb-operator-scripts\") pod \"placement-db-create-rfdwp\" (UID: \"dbd5a427-7846-4f98-bdef-c3b116a97cbb\") " pod="openstack/placement-db-create-rfdwp" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.573248 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2d8l\" (UniqueName: \"kubernetes.io/projected/0b08df84-d2c1-4977-af04-d21e169dfefc-kube-api-access-t2d8l\") pod \"placement-874a-account-create-update-2pb5w\" (UID: \"0b08df84-d2c1-4977-af04-d21e169dfefc\") " pod="openstack/placement-874a-account-create-update-2pb5w" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.573294 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b08df84-d2c1-4977-af04-d21e169dfefc-operator-scripts\") pod \"placement-874a-account-create-update-2pb5w\" (UID: \"0b08df84-d2c1-4977-af04-d21e169dfefc\") " pod="openstack/placement-874a-account-create-update-2pb5w" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.573407 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fzjx\" (UniqueName: \"kubernetes.io/projected/dbd5a427-7846-4f98-bdef-c3b116a97cbb-kube-api-access-6fzjx\") pod \"placement-db-create-rfdwp\" (UID: \"dbd5a427-7846-4f98-bdef-c3b116a97cbb\") " pod="openstack/placement-db-create-rfdwp" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.574518 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbd5a427-7846-4f98-bdef-c3b116a97cbb-operator-scripts\") pod \"placement-db-create-rfdwp\" (UID: \"dbd5a427-7846-4f98-bdef-c3b116a97cbb\") " pod="openstack/placement-db-create-rfdwp" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.618791 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fzjx\" (UniqueName: \"kubernetes.io/projected/dbd5a427-7846-4f98-bdef-c3b116a97cbb-kube-api-access-6fzjx\") pod \"placement-db-create-rfdwp\" (UID: \"dbd5a427-7846-4f98-bdef-c3b116a97cbb\") " pod="openstack/placement-db-create-rfdwp" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.674916 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b08df84-d2c1-4977-af04-d21e169dfefc-operator-scripts\") pod \"placement-874a-account-create-update-2pb5w\" (UID: \"0b08df84-d2c1-4977-af04-d21e169dfefc\") " pod="openstack/placement-874a-account-create-update-2pb5w" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.675105 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2d8l\" (UniqueName: 
\"kubernetes.io/projected/0b08df84-d2c1-4977-af04-d21e169dfefc-kube-api-access-t2d8l\") pod \"placement-874a-account-create-update-2pb5w\" (UID: \"0b08df84-d2c1-4977-af04-d21e169dfefc\") " pod="openstack/placement-874a-account-create-update-2pb5w" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.684177 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b08df84-d2c1-4977-af04-d21e169dfefc-operator-scripts\") pod \"placement-874a-account-create-update-2pb5w\" (UID: \"0b08df84-d2c1-4977-af04-d21e169dfefc\") " pod="openstack/placement-874a-account-create-update-2pb5w" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.711034 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2d8l\" (UniqueName: \"kubernetes.io/projected/0b08df84-d2c1-4977-af04-d21e169dfefc-kube-api-access-t2d8l\") pod \"placement-874a-account-create-update-2pb5w\" (UID: \"0b08df84-d2c1-4977-af04-d21e169dfefc\") " pod="openstack/placement-874a-account-create-update-2pb5w" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.716424 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rfdwp" Dec 05 22:18:06 crc kubenswrapper[4747]: I1205 22:18:06.820875 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-874a-account-create-update-2pb5w" Dec 05 22:18:07 crc kubenswrapper[4747]: I1205 22:18:07.158176 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rfdwp"] Dec 05 22:18:07 crc kubenswrapper[4747]: W1205 22:18:07.158890 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbd5a427_7846_4f98_bdef_c3b116a97cbb.slice/crio-a92245c19734a34b97ae2af783ec1c1d4498b98898da02a4fd1060747b374929 WatchSource:0}: Error finding container a92245c19734a34b97ae2af783ec1c1d4498b98898da02a4fd1060747b374929: Status 404 returned error can't find the container with id a92245c19734a34b97ae2af783ec1c1d4498b98898da02a4fd1060747b374929 Dec 05 22:18:07 crc kubenswrapper[4747]: I1205 22:18:07.314324 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-874a-account-create-update-2pb5w"] Dec 05 22:18:07 crc kubenswrapper[4747]: W1205 22:18:07.322611 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b08df84_d2c1_4977_af04_d21e169dfefc.slice/crio-50068c0ea86b6495a06d3cf06611586535be07e72bcdd94ccc47c3dd370e659c WatchSource:0}: Error finding container 50068c0ea86b6495a06d3cf06611586535be07e72bcdd94ccc47c3dd370e659c: Status 404 returned error can't find the container with id 50068c0ea86b6495a06d3cf06611586535be07e72bcdd94ccc47c3dd370e659c Dec 05 22:18:07 crc kubenswrapper[4747]: I1205 22:18:07.438414 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rfdwp" event={"ID":"dbd5a427-7846-4f98-bdef-c3b116a97cbb","Type":"ContainerStarted","Data":"a294e56c662222030f2f29a7c59796902d82ef2a9c4a5fb15e7655f8dfbc548a"} Dec 05 22:18:07 crc kubenswrapper[4747]: I1205 22:18:07.438456 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rfdwp" event={"ID":"dbd5a427-7846-4f98-bdef-c3b116a97cbb","Type":"ContainerStarted","Data":"a92245c19734a34b97ae2af783ec1c1d4498b98898da02a4fd1060747b374929"} Dec 05 22:18:07 crc kubenswrapper[4747]: I1205 
22:18:07.442101 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-874a-account-create-update-2pb5w" event={"ID":"0b08df84-d2c1-4977-af04-d21e169dfefc","Type":"ContainerStarted","Data":"50068c0ea86b6495a06d3cf06611586535be07e72bcdd94ccc47c3dd370e659c"} Dec 05 22:18:07 crc kubenswrapper[4747]: I1205 22:18:07.453694 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-rfdwp" podStartSLOduration=1.4536714609999999 podStartE2EDuration="1.453671461s" podCreationTimestamp="2025-12-05 22:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:18:07.450360168 +0000 UTC m=+5757.917667656" watchObservedRunningTime="2025-12-05 22:18:07.453671461 +0000 UTC m=+5757.920978949" Dec 05 22:18:08 crc kubenswrapper[4747]: I1205 22:18:08.452909 4747 generic.go:334] "Generic (PLEG): container finished" podID="dbd5a427-7846-4f98-bdef-c3b116a97cbb" containerID="a294e56c662222030f2f29a7c59796902d82ef2a9c4a5fb15e7655f8dfbc548a" exitCode=0 Dec 05 22:18:08 crc kubenswrapper[4747]: I1205 22:18:08.452998 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rfdwp" event={"ID":"dbd5a427-7846-4f98-bdef-c3b116a97cbb","Type":"ContainerDied","Data":"a294e56c662222030f2f29a7c59796902d82ef2a9c4a5fb15e7655f8dfbc548a"} Dec 05 22:18:08 crc kubenswrapper[4747]: I1205 22:18:08.454948 4747 generic.go:334] "Generic (PLEG): container finished" podID="0b08df84-d2c1-4977-af04-d21e169dfefc" containerID="6093288aa5dedcd69c55f5aafb9a09e1af0eb72d68aede62c3ada1d066881e24" exitCode=0 Dec 05 22:18:08 crc kubenswrapper[4747]: I1205 22:18:08.454983 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-874a-account-create-update-2pb5w" event={"ID":"0b08df84-d2c1-4977-af04-d21e169dfefc","Type":"ContainerDied","Data":"6093288aa5dedcd69c55f5aafb9a09e1af0eb72d68aede62c3ada1d066881e24"} Dec 05 22:18:09 crc kubenswrapper[4747]: I1205 22:18:09.883365 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-874a-account-create-update-2pb5w" Dec 05 22:18:09 crc kubenswrapper[4747]: I1205 22:18:09.892535 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rfdwp" Dec 05 22:18:09 crc kubenswrapper[4747]: I1205 22:18:09.937265 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2d8l\" (UniqueName: \"kubernetes.io/projected/0b08df84-d2c1-4977-af04-d21e169dfefc-kube-api-access-t2d8l\") pod \"0b08df84-d2c1-4977-af04-d21e169dfefc\" (UID: \"0b08df84-d2c1-4977-af04-d21e169dfefc\") " Dec 05 22:18:09 crc kubenswrapper[4747]: I1205 22:18:09.937320 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fzjx\" (UniqueName: \"kubernetes.io/projected/dbd5a427-7846-4f98-bdef-c3b116a97cbb-kube-api-access-6fzjx\") pod \"dbd5a427-7846-4f98-bdef-c3b116a97cbb\" (UID: \"dbd5a427-7846-4f98-bdef-c3b116a97cbb\") " Dec 05 22:18:09 crc kubenswrapper[4747]: I1205 22:18:09.937470 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b08df84-d2c1-4977-af04-d21e169dfefc-operator-scripts\") pod \"0b08df84-d2c1-4977-af04-d21e169dfefc\" (UID: \"0b08df84-d2c1-4977-af04-d21e169dfefc\") " Dec 05 22:18:09 crc kubenswrapper[4747]: I1205 22:18:09.937507 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbd5a427-7846-4f98-bdef-c3b116a97cbb-operator-scripts\") pod \"dbd5a427-7846-4f98-bdef-c3b116a97cbb\" (UID: \"dbd5a427-7846-4f98-bdef-c3b116a97cbb\") " Dec 05 22:18:09 crc kubenswrapper[4747]: I1205 22:18:09.938942 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b08df84-d2c1-4977-af04-d21e169dfefc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b08df84-d2c1-4977-af04-d21e169dfefc" (UID: "0b08df84-d2c1-4977-af04-d21e169dfefc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:18:09 crc kubenswrapper[4747]: I1205 22:18:09.939068 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd5a427-7846-4f98-bdef-c3b116a97cbb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbd5a427-7846-4f98-bdef-c3b116a97cbb" (UID: "dbd5a427-7846-4f98-bdef-c3b116a97cbb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:18:09 crc kubenswrapper[4747]: I1205 22:18:09.947774 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b08df84-d2c1-4977-af04-d21e169dfefc-kube-api-access-t2d8l" (OuterVolumeSpecName: "kube-api-access-t2d8l") pod "0b08df84-d2c1-4977-af04-d21e169dfefc" (UID: "0b08df84-d2c1-4977-af04-d21e169dfefc"). InnerVolumeSpecName "kube-api-access-t2d8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:18:09 crc kubenswrapper[4747]: I1205 22:18:09.952979 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd5a427-7846-4f98-bdef-c3b116a97cbb-kube-api-access-6fzjx" (OuterVolumeSpecName: "kube-api-access-6fzjx") pod "dbd5a427-7846-4f98-bdef-c3b116a97cbb" (UID: "dbd5a427-7846-4f98-bdef-c3b116a97cbb"). InnerVolumeSpecName "kube-api-access-6fzjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:18:10 crc kubenswrapper[4747]: I1205 22:18:10.039895 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2d8l\" (UniqueName: \"kubernetes.io/projected/0b08df84-d2c1-4977-af04-d21e169dfefc-kube-api-access-t2d8l\") on node \"crc\" DevicePath \"\"" Dec 05 22:18:10 crc kubenswrapper[4747]: I1205 22:18:10.039928 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fzjx\" (UniqueName: \"kubernetes.io/projected/dbd5a427-7846-4f98-bdef-c3b116a97cbb-kube-api-access-6fzjx\") on node \"crc\" DevicePath \"\"" Dec 05 22:18:10 crc kubenswrapper[4747]: I1205 22:18:10.039938 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b08df84-d2c1-4977-af04-d21e169dfefc-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:18:10 crc kubenswrapper[4747]: I1205 22:18:10.039947 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbd5a427-7846-4f98-bdef-c3b116a97cbb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:18:10 crc kubenswrapper[4747]: I1205 22:18:10.475683 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rfdwp" event={"ID":"dbd5a427-7846-4f98-bdef-c3b116a97cbb","Type":"ContainerDied","Data":"a92245c19734a34b97ae2af783ec1c1d4498b98898da02a4fd1060747b374929"} Dec 05 22:18:10 crc kubenswrapper[4747]: I1205 22:18:10.475735 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a92245c19734a34b97ae2af783ec1c1d4498b98898da02a4fd1060747b374929" Dec 05 22:18:10 crc kubenswrapper[4747]: I1205 22:18:10.475831 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rfdwp" Dec 05 22:18:10 crc kubenswrapper[4747]: I1205 22:18:10.482310 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-874a-account-create-update-2pb5w" event={"ID":"0b08df84-d2c1-4977-af04-d21e169dfefc","Type":"ContainerDied","Data":"50068c0ea86b6495a06d3cf06611586535be07e72bcdd94ccc47c3dd370e659c"} Dec 05 22:18:10 crc kubenswrapper[4747]: I1205 22:18:10.482356 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50068c0ea86b6495a06d3cf06611586535be07e72bcdd94ccc47c3dd370e659c" Dec 05 22:18:10 crc kubenswrapper[4747]: I1205 22:18:10.482421 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-874a-account-create-update-2pb5w" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.773196 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-765569f867-gzd9x"] Dec 05 22:18:11 crc kubenswrapper[4747]: E1205 22:18:11.773729 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd5a427-7846-4f98-bdef-c3b116a97cbb" containerName="mariadb-database-create" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.773748 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd5a427-7846-4f98-bdef-c3b116a97cbb" containerName="mariadb-database-create" Dec 05 22:18:11 crc kubenswrapper[4747]: E1205 22:18:11.773759 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b08df84-d2c1-4977-af04-d21e169dfefc" containerName="mariadb-account-create-update" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.773768 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b08df84-d2c1-4977-af04-d21e169dfefc" containerName="mariadb-account-create-update" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.773998 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b08df84-d2c1-4977-af04-d21e169dfefc" containerName="mariadb-account-create-update" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.774030 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd5a427-7846-4f98-bdef-c3b116a97cbb" containerName="mariadb-database-create" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.775231 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.797655 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-765569f867-gzd9x"] Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.849814 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-srhb7"] Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.850844 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.856884 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.857050 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-f6qs4" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.857161 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.868317 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-srhb7"] Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.878740 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-dns-svc\") pod \"dnsmasq-dns-765569f867-gzd9x\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.878857 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-config\") pod \"dnsmasq-dns-765569f867-gzd9x\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.878898 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-ovsdbserver-nb\") pod \"dnsmasq-dns-765569f867-gzd9x\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.879000 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cggcg\" (UniqueName: \"kubernetes.io/projected/072d8f67-dbec-4764-8817-7948bb98f600-kube-api-access-cggcg\") pod \"dnsmasq-dns-765569f867-gzd9x\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.879301 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-ovsdbserver-sb\") pod \"dnsmasq-dns-765569f867-gzd9x\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.981203 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-ovsdbserver-sb\") pod \"dnsmasq-dns-765569f867-gzd9x\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.981276 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-config-data\") pod \"placement-db-sync-srhb7\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:11 crc 
kubenswrapper[4747]: I1205 22:18:11.981333 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-dns-svc\") pod \"dnsmasq-dns-765569f867-gzd9x\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.981353 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-combined-ca-bundle\") pod \"placement-db-sync-srhb7\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.981379 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-config\") pod \"dnsmasq-dns-765569f867-gzd9x\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.981398 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-ovsdbserver-nb\") pod \"dnsmasq-dns-765569f867-gzd9x\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.981427 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cggcg\" (UniqueName: \"kubernetes.io/projected/072d8f67-dbec-4764-8817-7948bb98f600-kube-api-access-cggcg\") pod \"dnsmasq-dns-765569f867-gzd9x\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.981459 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-scripts\") pod \"placement-db-sync-srhb7\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.981477 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-logs\") pod \"placement-db-sync-srhb7\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.981514 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbcfg\" (UniqueName: \"kubernetes.io/projected/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-kube-api-access-lbcfg\") pod \"placement-db-sync-srhb7\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.982417 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-ovsdbserver-nb\") pod \"dnsmasq-dns-765569f867-gzd9x\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.982526 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-config\") pod \"dnsmasq-dns-765569f867-gzd9x\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.983119 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-dns-svc\") pod \"dnsmasq-dns-765569f867-gzd9x\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:11 crc kubenswrapper[4747]: I1205 22:18:11.983210 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-ovsdbserver-sb\") pod \"dnsmasq-dns-765569f867-gzd9x\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:12 crc kubenswrapper[4747]: I1205 22:18:12.000092 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cggcg\" (UniqueName: \"kubernetes.io/projected/072d8f67-dbec-4764-8817-7948bb98f600-kube-api-access-cggcg\") pod \"dnsmasq-dns-765569f867-gzd9x\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:12 crc kubenswrapper[4747]: I1205 22:18:12.083367 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-config-data\") pod \"placement-db-sync-srhb7\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:12 crc kubenswrapper[4747]: I1205 22:18:12.083471 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-combined-ca-bundle\") pod \"placement-db-sync-srhb7\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:12 crc kubenswrapper[4747]: I1205 22:18:12.083538 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-scripts\") pod \"placement-db-sync-srhb7\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:12 crc kubenswrapper[4747]: I1205 22:18:12.083557 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-logs\") pod \"placement-db-sync-srhb7\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:12 crc kubenswrapper[4747]: I1205 22:18:12.083658 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbcfg\" (UniqueName: \"kubernetes.io/projected/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-kube-api-access-lbcfg\") pod \"placement-db-sync-srhb7\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:12 crc kubenswrapper[4747]: I1205 22:18:12.085163 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-logs\") pod 
\"placement-db-sync-srhb7\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:12 crc kubenswrapper[4747]: I1205 22:18:12.088052 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-scripts\") pod \"placement-db-sync-srhb7\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:12 crc kubenswrapper[4747]: I1205 22:18:12.088238 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-config-data\") pod \"placement-db-sync-srhb7\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:12 crc kubenswrapper[4747]: I1205 22:18:12.090897 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-combined-ca-bundle\") pod \"placement-db-sync-srhb7\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:12 crc kubenswrapper[4747]: I1205 22:18:12.092906 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:12 crc kubenswrapper[4747]: I1205 22:18:12.110114 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbcfg\" (UniqueName: \"kubernetes.io/projected/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-kube-api-access-lbcfg\") pod \"placement-db-sync-srhb7\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:12 crc kubenswrapper[4747]: I1205 22:18:12.174787 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:12 crc kubenswrapper[4747]: I1205 22:18:12.622122 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-765569f867-gzd9x"] Dec 05 22:18:12 crc kubenswrapper[4747]: W1205 22:18:12.625014 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod072d8f67_dbec_4764_8817_7948bb98f600.slice/crio-becf81963b7c57ef7138be5a658257e225ecf799e3a8b22c31c074750c9f8cf2 WatchSource:0}: Error finding container becf81963b7c57ef7138be5a658257e225ecf799e3a8b22c31c074750c9f8cf2: Status 404 returned error can't find the container with id becf81963b7c57ef7138be5a658257e225ecf799e3a8b22c31c074750c9f8cf2 Dec 05 22:18:12 crc kubenswrapper[4747]: W1205 22:18:12.722647 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3b81b11_0fb6_48aa_ad62_8129aeedbd7d.slice/crio-76c32f7fde27d3dc67a5e1d623198e4ae355f41505a03d6b7c029f0fadc8b778 WatchSource:0}: Error finding container 76c32f7fde27d3dc67a5e1d623198e4ae355f41505a03d6b7c029f0fadc8b778: Status 404 returned error can't find the container with id 76c32f7fde27d3dc67a5e1d623198e4ae355f41505a03d6b7c029f0fadc8b778 Dec 05 22:18:12 crc kubenswrapper[4747]: I1205 22:18:12.723207 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-srhb7"] Dec 05 22:18:13 crc kubenswrapper[4747]: I1205 22:18:13.506620 4747 generic.go:334] "Generic (PLEG): container finished" podID="072d8f67-dbec-4764-8817-7948bb98f600" containerID="d6f0b8fdfb1f11e78b7cb7cf26ebc2b9578d47f20d98361caf5058f554f30cf4" exitCode=0 Dec 05 22:18:13 crc kubenswrapper[4747]: I1205 22:18:13.506796 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-765569f867-gzd9x" event={"ID":"072d8f67-dbec-4764-8817-7948bb98f600","Type":"ContainerDied","Data":"d6f0b8fdfb1f11e78b7cb7cf26ebc2b9578d47f20d98361caf5058f554f30cf4"} Dec 05 22:18:13 crc kubenswrapper[4747]: I1205 22:18:13.506965 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-765569f867-gzd9x" event={"ID":"072d8f67-dbec-4764-8817-7948bb98f600","Type":"ContainerStarted","Data":"becf81963b7c57ef7138be5a658257e225ecf799e3a8b22c31c074750c9f8cf2"} Dec 05 22:18:13 crc kubenswrapper[4747]: I1205 22:18:13.508788 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srhb7" event={"ID":"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d","Type":"ContainerStarted","Data":"d2aef1d5fbaf49a6cfb48c9312ef3a14a04d8bac9ac9e5add62ab3db2a2f104d"} Dec 05 22:18:13 crc kubenswrapper[4747]: I1205 22:18:13.508809 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srhb7" event={"ID":"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d","Type":"ContainerStarted","Data":"76c32f7fde27d3dc67a5e1d623198e4ae355f41505a03d6b7c029f0fadc8b778"} Dec 05 22:18:13 crc kubenswrapper[4747]: I1205 22:18:13.574717 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-srhb7" podStartSLOduration=2.574702565 podStartE2EDuration="2.574702565s" podCreationTimestamp="2025-12-05 22:18:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:18:13.572767477 +0000 UTC m=+5764.040074965" watchObservedRunningTime="2025-12-05 22:18:13.574702565 +0000 UTC m=+5764.042010053" Dec 05 22:18:14 
crc kubenswrapper[4747]: I1205 22:18:14.520888 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-765569f867-gzd9x" event={"ID":"072d8f67-dbec-4764-8817-7948bb98f600","Type":"ContainerStarted","Data":"ebdf0dc680dd9ce06ed465a2f78ea69d3861f395c0d634104d76f1da822c9aa7"} Dec 05 22:18:14 crc kubenswrapper[4747]: I1205 22:18:14.520995 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:14 crc kubenswrapper[4747]: I1205 22:18:14.522688 4747 generic.go:334] "Generic (PLEG): container finished" podID="c3b81b11-0fb6-48aa-ad62-8129aeedbd7d" containerID="d2aef1d5fbaf49a6cfb48c9312ef3a14a04d8bac9ac9e5add62ab3db2a2f104d" exitCode=0 Dec 05 22:18:14 crc kubenswrapper[4747]: I1205 22:18:14.522765 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srhb7" event={"ID":"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d","Type":"ContainerDied","Data":"d2aef1d5fbaf49a6cfb48c9312ef3a14a04d8bac9ac9e5add62ab3db2a2f104d"} Dec 05 22:18:14 crc kubenswrapper[4747]: I1205 22:18:14.540660 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-765569f867-gzd9x" podStartSLOduration=3.5406393449999998 podStartE2EDuration="3.540639345s" podCreationTimestamp="2025-12-05 22:18:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:18:14.538791919 +0000 UTC m=+5765.006099417" watchObservedRunningTime="2025-12-05 22:18:14.540639345 +0000 UTC m=+5765.007946833" Dec 05 22:18:15 crc kubenswrapper[4747]: I1205 22:18:15.840189 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:18:15 crc kubenswrapper[4747]: E1205 22:18:15.840962 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:18:15 crc kubenswrapper[4747]: I1205 22:18:15.933802 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:15 crc kubenswrapper[4747]: I1205 22:18:15.962047 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-logs\") pod \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " Dec 05 22:18:15 crc kubenswrapper[4747]: I1205 22:18:15.962769 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-logs" (OuterVolumeSpecName: "logs") pod "c3b81b11-0fb6-48aa-ad62-8129aeedbd7d" (UID: "c3b81b11-0fb6-48aa-ad62-8129aeedbd7d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:18:15 crc kubenswrapper[4747]: I1205 22:18:15.972805 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-scripts\") pod \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " Dec 05 22:18:15 crc kubenswrapper[4747]: I1205 22:18:15.972957 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbcfg\" (UniqueName: \"kubernetes.io/projected/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-kube-api-access-lbcfg\") pod \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " Dec 05 22:18:15 crc kubenswrapper[4747]: I1205 22:18:15.973026 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-combined-ca-bundle\") pod \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " Dec 05 22:18:15 crc kubenswrapper[4747]: I1205 22:18:15.973052 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-config-data\") pod \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\" (UID: \"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d\") " Dec 05 22:18:15 crc kubenswrapper[4747]: I1205 22:18:15.973765 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:18:15 crc kubenswrapper[4747]: I1205 22:18:15.979047 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-kube-api-access-lbcfg" (OuterVolumeSpecName: "kube-api-access-lbcfg") pod "c3b81b11-0fb6-48aa-ad62-8129aeedbd7d" (UID: "c3b81b11-0fb6-48aa-ad62-8129aeedbd7d"). InnerVolumeSpecName "kube-api-access-lbcfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:18:15 crc kubenswrapper[4747]: I1205 22:18:15.984739 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-scripts" (OuterVolumeSpecName: "scripts") pod "c3b81b11-0fb6-48aa-ad62-8129aeedbd7d" (UID: "c3b81b11-0fb6-48aa-ad62-8129aeedbd7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:18:16 crc kubenswrapper[4747]: I1205 22:18:16.003766 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3b81b11-0fb6-48aa-ad62-8129aeedbd7d" (UID: "c3b81b11-0fb6-48aa-ad62-8129aeedbd7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:18:16 crc kubenswrapper[4747]: I1205 22:18:16.015274 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-config-data" (OuterVolumeSpecName: "config-data") pod "c3b81b11-0fb6-48aa-ad62-8129aeedbd7d" (UID: "c3b81b11-0fb6-48aa-ad62-8129aeedbd7d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:18:16 crc kubenswrapper[4747]: I1205 22:18:16.075752 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbcfg\" (UniqueName: \"kubernetes.io/projected/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-kube-api-access-lbcfg\") on node \"crc\" DevicePath \"\"" Dec 05 22:18:16 crc kubenswrapper[4747]: I1205 22:18:16.075802 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:18:16 crc kubenswrapper[4747]: I1205 22:18:16.075819 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:18:16 crc kubenswrapper[4747]: I1205 22:18:16.075833 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:18:16 crc kubenswrapper[4747]: I1205 22:18:16.538956 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srhb7" event={"ID":"c3b81b11-0fb6-48aa-ad62-8129aeedbd7d","Type":"ContainerDied","Data":"76c32f7fde27d3dc67a5e1d623198e4ae355f41505a03d6b7c029f0fadc8b778"} Dec 05 22:18:16 crc kubenswrapper[4747]: I1205 22:18:16.538996 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76c32f7fde27d3dc67a5e1d623198e4ae355f41505a03d6b7c029f0fadc8b778" Dec 05 22:18:16 crc kubenswrapper[4747]: I1205 22:18:16.539009 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-srhb7" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.034170 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b7bf9cb-q8n7g"] Dec 05 22:18:17 crc kubenswrapper[4747]: E1205 22:18:17.035681 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b81b11-0fb6-48aa-ad62-8129aeedbd7d" containerName="placement-db-sync" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.035710 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b81b11-0fb6-48aa-ad62-8129aeedbd7d" containerName="placement-db-sync" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.035938 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b81b11-0fb6-48aa-ad62-8129aeedbd7d" containerName="placement-db-sync" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.037111 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.045856 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.046141 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-f6qs4" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.049369 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.049805 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.050007 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.100192 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29277589-fa87-483b-9c76-03ca12b29a20-config-data\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.100255 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29277589-fa87-483b-9c76-03ca12b29a20-public-tls-certs\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.100309 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29277589-fa87-483b-9c76-03ca12b29a20-internal-tls-certs\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.100419 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29277589-fa87-483b-9c76-03ca12b29a20-combined-ca-bundle\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.100661 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29277589-fa87-483b-9c76-03ca12b29a20-logs\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.100703 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fktx\" (UniqueName: \"kubernetes.io/projected/29277589-fa87-483b-9c76-03ca12b29a20-kube-api-access-2fktx\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.100743 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/29277589-fa87-483b-9c76-03ca12b29a20-scripts\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.100929 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b7bf9cb-q8n7g"] Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.202668 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29277589-fa87-483b-9c76-03ca12b29a20-public-tls-certs\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.202732 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29277589-fa87-483b-9c76-03ca12b29a20-internal-tls-certs\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.202782 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29277589-fa87-483b-9c76-03ca12b29a20-combined-ca-bundle\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.202887 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29277589-fa87-483b-9c76-03ca12b29a20-logs\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.202919 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fktx\" (UniqueName: \"kubernetes.io/projected/29277589-fa87-483b-9c76-03ca12b29a20-kube-api-access-2fktx\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.202947 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29277589-fa87-483b-9c76-03ca12b29a20-scripts\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.202985 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29277589-fa87-483b-9c76-03ca12b29a20-config-data\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.204793 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29277589-fa87-483b-9c76-03ca12b29a20-logs\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.208744 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/29277589-fa87-483b-9c76-03ca12b29a20-public-tls-certs\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.208961 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29277589-fa87-483b-9c76-03ca12b29a20-scripts\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.209486 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29277589-fa87-483b-9c76-03ca12b29a20-config-data\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.210712 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29277589-fa87-483b-9c76-03ca12b29a20-combined-ca-bundle\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.216320 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29277589-fa87-483b-9c76-03ca12b29a20-internal-tls-certs\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.225776 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fktx\" (UniqueName: \"kubernetes.io/projected/29277589-fa87-483b-9c76-03ca12b29a20-kube-api-access-2fktx\") pod \"placement-b7bf9cb-q8n7g\" (UID: \"29277589-fa87-483b-9c76-03ca12b29a20\") " pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.361902 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:17 crc kubenswrapper[4747]: I1205 22:18:17.928214 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b7bf9cb-q8n7g"] Dec 05 22:18:17 crc kubenswrapper[4747]: W1205 22:18:17.932030 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29277589_fa87_483b_9c76_03ca12b29a20.slice/crio-8f0c6de2ca74cef0eb4cf13b16e27ddd1e07b208c0cade73cd3620dfd88c1e57 WatchSource:0}: Error finding container 8f0c6de2ca74cef0eb4cf13b16e27ddd1e07b208c0cade73cd3620dfd88c1e57: Status 404 returned error can't find the container with id 8f0c6de2ca74cef0eb4cf13b16e27ddd1e07b208c0cade73cd3620dfd88c1e57 Dec 05 22:18:18 crc kubenswrapper[4747]: I1205 22:18:18.572264 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b7bf9cb-q8n7g" event={"ID":"29277589-fa87-483b-9c76-03ca12b29a20","Type":"ContainerStarted","Data":"340fb0f20aeafad23277dfe2347bb3fe55afbe344712714ea6751912df9d273f"} Dec 05 22:18:18 crc kubenswrapper[4747]: I1205 22:18:18.572605 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b7bf9cb-q8n7g" event={"ID":"29277589-fa87-483b-9c76-03ca12b29a20","Type":"ContainerStarted","Data":"0d82a16e19a66f1a8b5e2f3602712c182a3534cccdd1a5b880a7c63735f8f7a0"} Dec 05 22:18:18 crc kubenswrapper[4747]: I1205 22:18:18.572625 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b7bf9cb-q8n7g" event={"ID":"29277589-fa87-483b-9c76-03ca12b29a20","Type":"ContainerStarted","Data":"8f0c6de2ca74cef0eb4cf13b16e27ddd1e07b208c0cade73cd3620dfd88c1e57"} Dec 05 22:18:18 crc kubenswrapper[4747]: I1205 22:18:18.572782 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:18 crc kubenswrapper[4747]: I1205 22:18:18.597429 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-b7bf9cb-q8n7g" podStartSLOduration=1.597411578 podStartE2EDuration="1.597411578s" podCreationTimestamp="2025-12-05 22:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:18:18.59303922 +0000 UTC m=+5769.060346728" watchObservedRunningTime="2025-12-05 22:18:18.597411578 +0000 UTC m=+5769.064719076" Dec 05 22:18:19 crc kubenswrapper[4747]: I1205 22:18:19.582963 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.095209 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.216903 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f5bc57f77-b9vqs"] Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.217115 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" podUID="bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c" containerName="dnsmasq-dns" containerID="cri-o://400d88ecfbe4118ef55474a7cf555e41df0361e6746fdf68483077a8b777f618" gracePeriod=10 Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.608133 4747 generic.go:334] "Generic (PLEG): container finished" podID="bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c" containerID="400d88ecfbe4118ef55474a7cf555e41df0361e6746fdf68483077a8b777f618" 
exitCode=0 Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.608206 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" event={"ID":"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c","Type":"ContainerDied","Data":"400d88ecfbe4118ef55474a7cf555e41df0361e6746fdf68483077a8b777f618"} Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.752180 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.811907 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-ovsdbserver-sb\") pod \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.811996 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-config\") pod \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.812077 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8hzn\" (UniqueName: \"kubernetes.io/projected/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-kube-api-access-f8hzn\") pod \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.812152 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-ovsdbserver-nb\") pod \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.812215 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-dns-svc\") pod \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\" (UID: \"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c\") " Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.818798 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-kube-api-access-f8hzn" (OuterVolumeSpecName: "kube-api-access-f8hzn") pod "bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c" (UID: "bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c"). InnerVolumeSpecName "kube-api-access-f8hzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.866784 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-config" (OuterVolumeSpecName: "config") pod "bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c" (UID: "bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.873036 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c" (UID: "bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.889387 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c" (UID: "bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.904133 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c" (UID: "bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.914128 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.914163 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.914178 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8hzn\" (UniqueName: \"kubernetes.io/projected/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-kube-api-access-f8hzn\") on node \"crc\" DevicePath \"\"" Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.914190 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 22:18:22 crc kubenswrapper[4747]: I1205 22:18:22.914201 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 22:18:23 crc kubenswrapper[4747]: I1205 22:18:23.618024 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" event={"ID":"bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c","Type":"ContainerDied","Data":"c4e26ecadad5c679806b8ca3a00652b43554e52c65ab37747a7966bb21465f12"} Dec 05 22:18:23 crc kubenswrapper[4747]: I1205 22:18:23.618321 4747 scope.go:117] "RemoveContainer" containerID="400d88ecfbe4118ef55474a7cf555e41df0361e6746fdf68483077a8b777f618" Dec 05 22:18:23 crc kubenswrapper[4747]: I1205 22:18:23.618084 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f5bc57f77-b9vqs" Dec 05 22:18:23 crc kubenswrapper[4747]: I1205 22:18:23.652571 4747 scope.go:117] "RemoveContainer" containerID="bd05469d2afed8d3ef952a3b52c80714582b0c03a54cf1faddcd868fa722076f" Dec 05 22:18:23 crc kubenswrapper[4747]: I1205 22:18:23.657883 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f5bc57f77-b9vqs"] Dec 05 22:18:23 crc kubenswrapper[4747]: I1205 22:18:23.682408 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f5bc57f77-b9vqs"] Dec 05 22:18:23 crc kubenswrapper[4747]: I1205 22:18:23.852383 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c" path="/var/lib/kubelet/pods/bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c/volumes" Dec 05 22:18:28 crc kubenswrapper[4747]: I1205 22:18:28.841091 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:18:28 crc kubenswrapper[4747]: E1205 22:18:28.842342 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:18:42 crc kubenswrapper[4747]: I1205 22:18:42.839841 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:18:42 crc kubenswrapper[4747]: E1205 22:18:42.842741 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:18:48 crc kubenswrapper[4747]: I1205 22:18:48.408396 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:48 crc kubenswrapper[4747]: I1205 22:18:48.410506 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b7bf9cb-q8n7g" Dec 05 22:18:54 crc kubenswrapper[4747]: I1205 22:18:54.840203 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:18:54 crc kubenswrapper[4747]: E1205 22:18:54.841387 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:19:08 crc kubenswrapper[4747]: I1205 22:19:08.840536 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:19:08 crc kubenswrapper[4747]: E1205 22:19:08.843455 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.012337 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-v48zv"] Dec 05 22:19:12 crc kubenswrapper[4747]: E1205 22:19:12.013434 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c" containerName="dnsmasq-dns" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.013453 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c" containerName="dnsmasq-dns" Dec 05 22:19:12 crc kubenswrapper[4747]: E1205 22:19:12.013477 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c" containerName="init" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.013485 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c" containerName="init" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.013679 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf9279bb-2f0c-48ea-a3dc-4f698b4b9a0c" containerName="dnsmasq-dns" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.019410 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-v48zv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.037453 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-v48zv"] Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.108502 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-s8w5s"] Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.109789 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-s8w5s" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.122801 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-s8w5s"] Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.204819 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sfq5\" (UniqueName: \"kubernetes.io/projected/adf83e06-10f7-44aa-b33b-36c305016e03-kube-api-access-9sfq5\") pod \"nova-api-db-create-v48zv\" (UID: \"adf83e06-10f7-44aa-b33b-36c305016e03\") " pod="openstack/nova-api-db-create-v48zv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.205045 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf83e06-10f7-44aa-b33b-36c305016e03-operator-scripts\") pod \"nova-api-db-create-v48zv\" (UID: \"adf83e06-10f7-44aa-b33b-36c305016e03\") " pod="openstack/nova-api-db-create-v48zv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.205730 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-31a9-account-create-update-dwhfh"] Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.206912 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-31a9-account-create-update-dwhfh" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.209506 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.217725 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-31a9-account-create-update-dwhfh"] Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.302421 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-kffrg"] Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.303454 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kffrg" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.306704 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sfq5\" (UniqueName: \"kubernetes.io/projected/adf83e06-10f7-44aa-b33b-36c305016e03-kube-api-access-9sfq5\") pod \"nova-api-db-create-v48zv\" (UID: \"adf83e06-10f7-44aa-b33b-36c305016e03\") " pod="openstack/nova-api-db-create-v48zv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.306826 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2309ee7-7430-4873-b34e-0428cb08f00c-operator-scripts\") pod \"nova-api-31a9-account-create-update-dwhfh\" (UID: \"f2309ee7-7430-4873-b34e-0428cb08f00c\") " pod="openstack/nova-api-31a9-account-create-update-dwhfh" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.306863 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v5xf\" (UniqueName: \"kubernetes.io/projected/6d2a3210-8646-450a-9f16-cb6471ca857f-kube-api-access-2v5xf\") pod \"nova-cell0-db-create-s8w5s\" (UID: \"6d2a3210-8646-450a-9f16-cb6471ca857f\") " pod="openstack/nova-cell0-db-create-s8w5s" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.306888 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc5qv\" (UniqueName: \"kubernetes.io/projected/f2309ee7-7430-4873-b34e-0428cb08f00c-kube-api-access-bc5qv\") pod \"nova-api-31a9-account-create-update-dwhfh\" (UID: \"f2309ee7-7430-4873-b34e-0428cb08f00c\") " pod="openstack/nova-api-31a9-account-create-update-dwhfh" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.307026 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf83e06-10f7-44aa-b33b-36c305016e03-operator-scripts\") pod \"nova-api-db-create-v48zv\" (UID: \"adf83e06-10f7-44aa-b33b-36c305016e03\") " pod="openstack/nova-api-db-create-v48zv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.307063 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d2a3210-8646-450a-9f16-cb6471ca857f-operator-scripts\") pod \"nova-cell0-db-create-s8w5s\" (UID: \"6d2a3210-8646-450a-9f16-cb6471ca857f\") " pod="openstack/nova-cell0-db-create-s8w5s" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.307855 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf83e06-10f7-44aa-b33b-36c305016e03-operator-scripts\") pod \"nova-api-db-create-v48zv\" (UID: 
\"adf83e06-10f7-44aa-b33b-36c305016e03\") " pod="openstack/nova-api-db-create-v48zv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.311225 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kffrg"] Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.332399 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sfq5\" (UniqueName: \"kubernetes.io/projected/adf83e06-10f7-44aa-b33b-36c305016e03-kube-api-access-9sfq5\") pod \"nova-api-db-create-v48zv\" (UID: \"adf83e06-10f7-44aa-b33b-36c305016e03\") " pod="openstack/nova-api-db-create-v48zv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.338043 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-v48zv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.404334 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-edba-account-create-update-v69dv"] Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.418465 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d2a3210-8646-450a-9f16-cb6471ca857f-operator-scripts\") pod \"nova-cell0-db-create-s8w5s\" (UID: \"6d2a3210-8646-450a-9f16-cb6471ca857f\") " pod="openstack/nova-cell0-db-create-s8w5s" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.421446 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-edba-account-create-update-v69dv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.421896 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d2a3210-8646-450a-9f16-cb6471ca857f-operator-scripts\") pod \"nova-cell0-db-create-s8w5s\" (UID: \"6d2a3210-8646-450a-9f16-cb6471ca857f\") " pod="openstack/nova-cell0-db-create-s8w5s" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.423711 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.424804 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v75nf\" (UniqueName: \"kubernetes.io/projected/24104846-9490-409f-b1be-cb3b84eb2d11-kube-api-access-v75nf\") pod \"nova-cell1-db-create-kffrg\" (UID: \"24104846-9490-409f-b1be-cb3b84eb2d11\") " pod="openstack/nova-cell1-db-create-kffrg" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.425110 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2309ee7-7430-4873-b34e-0428cb08f00c-operator-scripts\") pod \"nova-api-31a9-account-create-update-dwhfh\" (UID: \"f2309ee7-7430-4873-b34e-0428cb08f00c\") " pod="openstack/nova-api-31a9-account-create-update-dwhfh" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.425151 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc5qv\" (UniqueName: \"kubernetes.io/projected/f2309ee7-7430-4873-b34e-0428cb08f00c-kube-api-access-bc5qv\") pod \"nova-api-31a9-account-create-update-dwhfh\" (UID: \"f2309ee7-7430-4873-b34e-0428cb08f00c\") " pod="openstack/nova-api-31a9-account-create-update-dwhfh" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.425178 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2v5xf\" (UniqueName: \"kubernetes.io/projected/6d2a3210-8646-450a-9f16-cb6471ca857f-kube-api-access-2v5xf\") pod \"nova-cell0-db-create-s8w5s\" (UID: \"6d2a3210-8646-450a-9f16-cb6471ca857f\") " pod="openstack/nova-cell0-db-create-s8w5s" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.425229 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24104846-9490-409f-b1be-cb3b84eb2d11-operator-scripts\") pod \"nova-cell1-db-create-kffrg\" (UID: \"24104846-9490-409f-b1be-cb3b84eb2d11\") " pod="openstack/nova-cell1-db-create-kffrg" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.428065 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2309ee7-7430-4873-b34e-0428cb08f00c-operator-scripts\") pod \"nova-api-31a9-account-create-update-dwhfh\" (UID: \"f2309ee7-7430-4873-b34e-0428cb08f00c\") " pod="openstack/nova-api-31a9-account-create-update-dwhfh" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.449320 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-edba-account-create-update-v69dv"] Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.464327 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v5xf\" (UniqueName: \"kubernetes.io/projected/6d2a3210-8646-450a-9f16-cb6471ca857f-kube-api-access-2v5xf\") pod \"nova-cell0-db-create-s8w5s\" (UID: \"6d2a3210-8646-450a-9f16-cb6471ca857f\") " pod="openstack/nova-cell0-db-create-s8w5s" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.464624 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc5qv\" (UniqueName: \"kubernetes.io/projected/f2309ee7-7430-4873-b34e-0428cb08f00c-kube-api-access-bc5qv\") pod \"nova-api-31a9-account-create-update-dwhfh\" (UID: \"f2309ee7-7430-4873-b34e-0428cb08f00c\") " pod="openstack/nova-api-31a9-account-create-update-dwhfh" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.523969 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-31a9-account-create-update-dwhfh" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.526655 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24104846-9490-409f-b1be-cb3b84eb2d11-operator-scripts\") pod \"nova-cell1-db-create-kffrg\" (UID: \"24104846-9490-409f-b1be-cb3b84eb2d11\") " pod="openstack/nova-cell1-db-create-kffrg" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.526759 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb1de861-ab51-437a-a5f7-a6d5454f0360-operator-scripts\") pod \"nova-cell0-edba-account-create-update-v69dv\" (UID: \"bb1de861-ab51-437a-a5f7-a6d5454f0360\") " pod="openstack/nova-cell0-edba-account-create-update-v69dv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.526811 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6vv9\" (UniqueName: \"kubernetes.io/projected/bb1de861-ab51-437a-a5f7-a6d5454f0360-kube-api-access-z6vv9\") pod \"nova-cell0-edba-account-create-update-v69dv\" (UID: \"bb1de861-ab51-437a-a5f7-a6d5454f0360\") " pod="openstack/nova-cell0-edba-account-create-update-v69dv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.526879 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v75nf\" (UniqueName: \"kubernetes.io/projected/24104846-9490-409f-b1be-cb3b84eb2d11-kube-api-access-v75nf\") pod \"nova-cell1-db-create-kffrg\" (UID: \"24104846-9490-409f-b1be-cb3b84eb2d11\") " pod="openstack/nova-cell1-db-create-kffrg" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.527572 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24104846-9490-409f-b1be-cb3b84eb2d11-operator-scripts\") pod \"nova-cell1-db-create-kffrg\" (UID: \"24104846-9490-409f-b1be-cb3b84eb2d11\") " pod="openstack/nova-cell1-db-create-kffrg" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.544859 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v75nf\" (UniqueName: \"kubernetes.io/projected/24104846-9490-409f-b1be-cb3b84eb2d11-kube-api-access-v75nf\") pod \"nova-cell1-db-create-kffrg\" (UID: \"24104846-9490-409f-b1be-cb3b84eb2d11\") " pod="openstack/nova-cell1-db-create-kffrg" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.614941 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-15d6-account-create-update-9wmxq"] Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.616537 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-15d6-account-create-update-9wmxq" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.619866 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.625474 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-15d6-account-create-update-9wmxq"] Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.625747 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kffrg" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.627881 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb1de861-ab51-437a-a5f7-a6d5454f0360-operator-scripts\") pod \"nova-cell0-edba-account-create-update-v69dv\" (UID: \"bb1de861-ab51-437a-a5f7-a6d5454f0360\") " pod="openstack/nova-cell0-edba-account-create-update-v69dv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.627941 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6vv9\" (UniqueName: \"kubernetes.io/projected/bb1de861-ab51-437a-a5f7-a6d5454f0360-kube-api-access-z6vv9\") pod \"nova-cell0-edba-account-create-update-v69dv\" (UID: \"bb1de861-ab51-437a-a5f7-a6d5454f0360\") " pod="openstack/nova-cell0-edba-account-create-update-v69dv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.628939 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb1de861-ab51-437a-a5f7-a6d5454f0360-operator-scripts\") pod \"nova-cell0-edba-account-create-update-v69dv\" (UID: \"bb1de861-ab51-437a-a5f7-a6d5454f0360\") " pod="openstack/nova-cell0-edba-account-create-update-v69dv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.659346 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6vv9\" (UniqueName: \"kubernetes.io/projected/bb1de861-ab51-437a-a5f7-a6d5454f0360-kube-api-access-z6vv9\") pod \"nova-cell0-edba-account-create-update-v69dv\" (UID: \"bb1de861-ab51-437a-a5f7-a6d5454f0360\") " pod="openstack/nova-cell0-edba-account-create-update-v69dv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.729325 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d77f7ec-5822-4fd0-b935-09f0f77ce1ee-operator-scripts\") pod \"nova-cell1-15d6-account-create-update-9wmxq\" (UID: \"8d77f7ec-5822-4fd0-b935-09f0f77ce1ee\") " pod="openstack/nova-cell1-15d6-account-create-update-9wmxq" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.729377 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5wnx\" (UniqueName: \"kubernetes.io/projected/8d77f7ec-5822-4fd0-b935-09f0f77ce1ee-kube-api-access-k5wnx\") pod \"nova-cell1-15d6-account-create-update-9wmxq\" (UID: \"8d77f7ec-5822-4fd0-b935-09f0f77ce1ee\") " pod="openstack/nova-cell1-15d6-account-create-update-9wmxq" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.729913 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-s8w5s" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.838868 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d77f7ec-5822-4fd0-b935-09f0f77ce1ee-operator-scripts\") pod \"nova-cell1-15d6-account-create-update-9wmxq\" (UID: \"8d77f7ec-5822-4fd0-b935-09f0f77ce1ee\") " pod="openstack/nova-cell1-15d6-account-create-update-9wmxq" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.838961 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5wnx\" (UniqueName: \"kubernetes.io/projected/8d77f7ec-5822-4fd0-b935-09f0f77ce1ee-kube-api-access-k5wnx\") pod \"nova-cell1-15d6-account-create-update-9wmxq\" (UID: \"8d77f7ec-5822-4fd0-b935-09f0f77ce1ee\") " pod="openstack/nova-cell1-15d6-account-create-update-9wmxq" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.839836 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d77f7ec-5822-4fd0-b935-09f0f77ce1ee-operator-scripts\") pod \"nova-cell1-15d6-account-create-update-9wmxq\" (UID: \"8d77f7ec-5822-4fd0-b935-09f0f77ce1ee\") " pod="openstack/nova-cell1-15d6-account-create-update-9wmxq" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.859919 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-v48zv"] Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.864633 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5wnx\" (UniqueName: \"kubernetes.io/projected/8d77f7ec-5822-4fd0-b935-09f0f77ce1ee-kube-api-access-k5wnx\") pod \"nova-cell1-15d6-account-create-update-9wmxq\" (UID: \"8d77f7ec-5822-4fd0-b935-09f0f77ce1ee\") " pod="openstack/nova-cell1-15d6-account-create-update-9wmxq" Dec 05 22:19:12 crc kubenswrapper[4747]: W1205 22:19:12.911065 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadf83e06_10f7_44aa_b33b_36c305016e03.slice/crio-2c71b7bb2c91301cdc382c8800d445ec52a0fc8c15e91a8541fac07ed697360d WatchSource:0}: Error finding container 2c71b7bb2c91301cdc382c8800d445ec52a0fc8c15e91a8541fac07ed697360d: Status 404 returned error can't find the container with id 2c71b7bb2c91301cdc382c8800d445ec52a0fc8c15e91a8541fac07ed697360d Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.917651 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-edba-account-create-update-v69dv" Dec 05 22:19:12 crc kubenswrapper[4747]: I1205 22:19:12.941840 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-15d6-account-create-update-9wmxq" Dec 05 22:19:13 crc kubenswrapper[4747]: I1205 22:19:13.073755 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-31a9-account-create-update-dwhfh"] Dec 05 22:19:13 crc kubenswrapper[4747]: I1205 22:19:13.129813 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kffrg"] Dec 05 22:19:13 crc kubenswrapper[4747]: W1205 22:19:13.137494 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2309ee7_7430_4873_b34e_0428cb08f00c.slice/crio-6a7d7f54aef4a689f5cbe341ca11791da2cdc42a82a0a7e524e2010b28bbae90 WatchSource:0}: Error finding container 6a7d7f54aef4a689f5cbe341ca11791da2cdc42a82a0a7e524e2010b28bbae90: Status 404 returned error can't find the container with id 6a7d7f54aef4a689f5cbe341ca11791da2cdc42a82a0a7e524e2010b28bbae90 Dec 05 22:19:13 crc kubenswrapper[4747]: I1205 22:19:13.147901 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-v48zv" event={"ID":"adf83e06-10f7-44aa-b33b-36c305016e03","Type":"ContainerStarted","Data":"2c71b7bb2c91301cdc382c8800d445ec52a0fc8c15e91a8541fac07ed697360d"} Dec 05 22:19:13 crc kubenswrapper[4747]: W1205 22:19:13.172570 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24104846_9490_409f_b1be_cb3b84eb2d11.slice/crio-a6cb1efc1790683c817e27033b73add5d89caa59e6fb6861c7530a3aca24e4d3 WatchSource:0}: Error finding container a6cb1efc1790683c817e27033b73add5d89caa59e6fb6861c7530a3aca24e4d3: Status 404 returned error can't find the container with id a6cb1efc1790683c817e27033b73add5d89caa59e6fb6861c7530a3aca24e4d3 Dec 05 22:19:13 crc kubenswrapper[4747]: I1205 22:19:13.363921 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-s8w5s"] Dec 05 22:19:13 crc kubenswrapper[4747]: W1205 22:19:13.393722 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d2a3210_8646_450a_9f16_cb6471ca857f.slice/crio-56689888467a7ca3c427bfe80c640c9e02950d495f0ce1148ed9b2d4d14b7660 WatchSource:0}: Error finding container 56689888467a7ca3c427bfe80c640c9e02950d495f0ce1148ed9b2d4d14b7660: Status 404 returned error can't find the container with id 56689888467a7ca3c427bfe80c640c9e02950d495f0ce1148ed9b2d4d14b7660 Dec 05 22:19:13 crc kubenswrapper[4747]: I1205 22:19:13.464633 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-15d6-account-create-update-9wmxq"] Dec 05 22:19:13 crc kubenswrapper[4747]: W1205 22:19:13.559826 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d77f7ec_5822_4fd0_b935_09f0f77ce1ee.slice/crio-f4e59f76f04c05bba68a53bd6ed7e021a6507b9c39f7429eb23aa9506b62a3a4 WatchSource:0}: Error finding container f4e59f76f04c05bba68a53bd6ed7e021a6507b9c39f7429eb23aa9506b62a3a4: Status 404 returned error can't find the container with id f4e59f76f04c05bba68a53bd6ed7e021a6507b9c39f7429eb23aa9506b62a3a4 Dec 05 22:19:13 crc kubenswrapper[4747]: I1205 22:19:13.594107 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-edba-account-create-update-v69dv"] Dec 05 22:19:13 crc kubenswrapper[4747]: W1205 22:19:13.628170 4747 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb1de861_ab51_437a_a5f7_a6d5454f0360.slice/crio-0a42415c63808a5ac72c3e3defb6280656b20798082607bdd68e612ec5aa7249 WatchSource:0}: Error finding container 0a42415c63808a5ac72c3e3defb6280656b20798082607bdd68e612ec5aa7249: Status 404 returned error can't find the container with id 0a42415c63808a5ac72c3e3defb6280656b20798082607bdd68e612ec5aa7249 Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.159012 4747 generic.go:334] "Generic (PLEG): container finished" podID="f2309ee7-7430-4873-b34e-0428cb08f00c" containerID="0b7baed7145b33c3ef0a520cea9743fded9c1145020746ee40fcdd3cf12db16c" exitCode=0 Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.159088 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-31a9-account-create-update-dwhfh" event={"ID":"f2309ee7-7430-4873-b34e-0428cb08f00c","Type":"ContainerDied","Data":"0b7baed7145b33c3ef0a520cea9743fded9c1145020746ee40fcdd3cf12db16c"} Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.159151 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-31a9-account-create-update-dwhfh" event={"ID":"f2309ee7-7430-4873-b34e-0428cb08f00c","Type":"ContainerStarted","Data":"6a7d7f54aef4a689f5cbe341ca11791da2cdc42a82a0a7e524e2010b28bbae90"} Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.162140 4747 generic.go:334] "Generic (PLEG): container finished" podID="6d2a3210-8646-450a-9f16-cb6471ca857f" containerID="d4325bffe5e4c1c2f3c0f79ef3441e66c5359506f83cb0cca4da869ebcb0621a" exitCode=0 Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.162243 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-s8w5s" event={"ID":"6d2a3210-8646-450a-9f16-cb6471ca857f","Type":"ContainerDied","Data":"d4325bffe5e4c1c2f3c0f79ef3441e66c5359506f83cb0cca4da869ebcb0621a"} Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.162277 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-s8w5s" event={"ID":"6d2a3210-8646-450a-9f16-cb6471ca857f","Type":"ContainerStarted","Data":"56689888467a7ca3c427bfe80c640c9e02950d495f0ce1148ed9b2d4d14b7660"} Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.164504 4747 generic.go:334] "Generic (PLEG): container finished" podID="adf83e06-10f7-44aa-b33b-36c305016e03" containerID="367a0c30a026f2df077f5996790acc7826c2158a42090f03d5d853e2af58d572" exitCode=0 Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.164617 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-v48zv" event={"ID":"adf83e06-10f7-44aa-b33b-36c305016e03","Type":"ContainerDied","Data":"367a0c30a026f2df077f5996790acc7826c2158a42090f03d5d853e2af58d572"} Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.166828 4747 generic.go:334] "Generic (PLEG): container finished" podID="8d77f7ec-5822-4fd0-b935-09f0f77ce1ee" containerID="186bf1dbc0ef085f067b7f9bd83bab2c3ecaaee73075b570ee00ffc38865ad6c" exitCode=0 Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.166930 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-15d6-account-create-update-9wmxq" event={"ID":"8d77f7ec-5822-4fd0-b935-09f0f77ce1ee","Type":"ContainerDied","Data":"186bf1dbc0ef085f067b7f9bd83bab2c3ecaaee73075b570ee00ffc38865ad6c"} Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.166969 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-15d6-account-create-update-9wmxq" 
event={"ID":"8d77f7ec-5822-4fd0-b935-09f0f77ce1ee","Type":"ContainerStarted","Data":"f4e59f76f04c05bba68a53bd6ed7e021a6507b9c39f7429eb23aa9506b62a3a4"} Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.169063 4747 generic.go:334] "Generic (PLEG): container finished" podID="bb1de861-ab51-437a-a5f7-a6d5454f0360" containerID="d9c78024e04894812a9542f17ff9904234e2f413595c7030ffdfdaaf47107b9e" exitCode=0 Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.169113 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-edba-account-create-update-v69dv" event={"ID":"bb1de861-ab51-437a-a5f7-a6d5454f0360","Type":"ContainerDied","Data":"d9c78024e04894812a9542f17ff9904234e2f413595c7030ffdfdaaf47107b9e"} Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.169133 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-edba-account-create-update-v69dv" event={"ID":"bb1de861-ab51-437a-a5f7-a6d5454f0360","Type":"ContainerStarted","Data":"0a42415c63808a5ac72c3e3defb6280656b20798082607bdd68e612ec5aa7249"} Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.171314 4747 generic.go:334] "Generic (PLEG): container finished" podID="24104846-9490-409f-b1be-cb3b84eb2d11" containerID="ae7f4b9e9e876686b8ae22e6ee55cb869b9a39e8ced944c20a6a58eb168c7d14" exitCode=0 Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.171341 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kffrg" event={"ID":"24104846-9490-409f-b1be-cb3b84eb2d11","Type":"ContainerDied","Data":"ae7f4b9e9e876686b8ae22e6ee55cb869b9a39e8ced944c20a6a58eb168c7d14"} Dec 05 22:19:14 crc kubenswrapper[4747]: I1205 22:19:14.171357 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kffrg" event={"ID":"24104846-9490-409f-b1be-cb3b84eb2d11","Type":"ContainerStarted","Data":"a6cb1efc1790683c817e27033b73add5d89caa59e6fb6861c7530a3aca24e4d3"} Dec 05 22:19:15 crc kubenswrapper[4747]: I1205 22:19:15.652177 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-15d6-account-create-update-9wmxq" Dec 05 22:19:15 crc kubenswrapper[4747]: I1205 22:19:15.805081 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5wnx\" (UniqueName: \"kubernetes.io/projected/8d77f7ec-5822-4fd0-b935-09f0f77ce1ee-kube-api-access-k5wnx\") pod \"8d77f7ec-5822-4fd0-b935-09f0f77ce1ee\" (UID: \"8d77f7ec-5822-4fd0-b935-09f0f77ce1ee\") " Dec 05 22:19:15 crc kubenswrapper[4747]: I1205 22:19:15.805163 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d77f7ec-5822-4fd0-b935-09f0f77ce1ee-operator-scripts\") pod \"8d77f7ec-5822-4fd0-b935-09f0f77ce1ee\" (UID: \"8d77f7ec-5822-4fd0-b935-09f0f77ce1ee\") " Dec 05 22:19:15 crc kubenswrapper[4747]: I1205 22:19:15.806327 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d77f7ec-5822-4fd0-b935-09f0f77ce1ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d77f7ec-5822-4fd0-b935-09f0f77ce1ee" (UID: "8d77f7ec-5822-4fd0-b935-09f0f77ce1ee"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:19:15 crc kubenswrapper[4747]: I1205 22:19:15.810652 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d77f7ec-5822-4fd0-b935-09f0f77ce1ee-kube-api-access-k5wnx" (OuterVolumeSpecName: "kube-api-access-k5wnx") pod "8d77f7ec-5822-4fd0-b935-09f0f77ce1ee" (UID: "8d77f7ec-5822-4fd0-b935-09f0f77ce1ee"). InnerVolumeSpecName "kube-api-access-k5wnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:19:15 crc kubenswrapper[4747]: I1205 22:19:15.851045 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-s8w5s" Dec 05 22:19:15 crc kubenswrapper[4747]: I1205 22:19:15.862722 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-edba-account-create-update-v69dv" Dec 05 22:19:15 crc kubenswrapper[4747]: I1205 22:19:15.907771 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5wnx\" (UniqueName: \"kubernetes.io/projected/8d77f7ec-5822-4fd0-b935-09f0f77ce1ee-kube-api-access-k5wnx\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:15 crc kubenswrapper[4747]: I1205 22:19:15.907798 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d77f7ec-5822-4fd0-b935-09f0f77ce1ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:15 crc kubenswrapper[4747]: I1205 22:19:15.916736 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-31a9-account-create-update-dwhfh" Dec 05 22:19:15 crc kubenswrapper[4747]: I1205 22:19:15.925885 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kffrg" Dec 05 22:19:15 crc kubenswrapper[4747]: I1205 22:19:15.935861 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-v48zv" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.008964 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24104846-9490-409f-b1be-cb3b84eb2d11-operator-scripts\") pod \"24104846-9490-409f-b1be-cb3b84eb2d11\" (UID: \"24104846-9490-409f-b1be-cb3b84eb2d11\") " Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.009067 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v5xf\" (UniqueName: \"kubernetes.io/projected/6d2a3210-8646-450a-9f16-cb6471ca857f-kube-api-access-2v5xf\") pod \"6d2a3210-8646-450a-9f16-cb6471ca857f\" (UID: \"6d2a3210-8646-450a-9f16-cb6471ca857f\") " Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.009134 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2309ee7-7430-4873-b34e-0428cb08f00c-operator-scripts\") pod \"f2309ee7-7430-4873-b34e-0428cb08f00c\" (UID: \"f2309ee7-7430-4873-b34e-0428cb08f00c\") " Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.009168 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d2a3210-8646-450a-9f16-cb6471ca857f-operator-scripts\") pod \"6d2a3210-8646-450a-9f16-cb6471ca857f\" (UID: \"6d2a3210-8646-450a-9f16-cb6471ca857f\") " Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.009222 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc5qv\" (UniqueName: \"kubernetes.io/projected/f2309ee7-7430-4873-b34e-0428cb08f00c-kube-api-access-bc5qv\") pod \"f2309ee7-7430-4873-b34e-0428cb08f00c\" (UID: \"f2309ee7-7430-4873-b34e-0428cb08f00c\") " Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.009257 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6vv9\" (UniqueName: \"kubernetes.io/projected/bb1de861-ab51-437a-a5f7-a6d5454f0360-kube-api-access-z6vv9\") pod \"bb1de861-ab51-437a-a5f7-a6d5454f0360\" (UID: \"bb1de861-ab51-437a-a5f7-a6d5454f0360\") " Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.009418 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb1de861-ab51-437a-a5f7-a6d5454f0360-operator-scripts\") pod \"bb1de861-ab51-437a-a5f7-a6d5454f0360\" (UID: \"bb1de861-ab51-437a-a5f7-a6d5454f0360\") " Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.009445 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v75nf\" (UniqueName: \"kubernetes.io/projected/24104846-9490-409f-b1be-cb3b84eb2d11-kube-api-access-v75nf\") pod \"24104846-9490-409f-b1be-cb3b84eb2d11\" (UID: \"24104846-9490-409f-b1be-cb3b84eb2d11\") " Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.010914 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb1de861-ab51-437a-a5f7-a6d5454f0360-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb1de861-ab51-437a-a5f7-a6d5454f0360" (UID: "bb1de861-ab51-437a-a5f7-a6d5454f0360"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.011019 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d2a3210-8646-450a-9f16-cb6471ca857f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d2a3210-8646-450a-9f16-cb6471ca857f" (UID: "6d2a3210-8646-450a-9f16-cb6471ca857f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.011069 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24104846-9490-409f-b1be-cb3b84eb2d11-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24104846-9490-409f-b1be-cb3b84eb2d11" (UID: "24104846-9490-409f-b1be-cb3b84eb2d11"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.011314 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2309ee7-7430-4873-b34e-0428cb08f00c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2309ee7-7430-4873-b34e-0428cb08f00c" (UID: "f2309ee7-7430-4873-b34e-0428cb08f00c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.013936 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24104846-9490-409f-b1be-cb3b84eb2d11-kube-api-access-v75nf" (OuterVolumeSpecName: "kube-api-access-v75nf") pod "24104846-9490-409f-b1be-cb3b84eb2d11" (UID: "24104846-9490-409f-b1be-cb3b84eb2d11"). InnerVolumeSpecName "kube-api-access-v75nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.013989 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb1de861-ab51-437a-a5f7-a6d5454f0360-kube-api-access-z6vv9" (OuterVolumeSpecName: "kube-api-access-z6vv9") pod "bb1de861-ab51-437a-a5f7-a6d5454f0360" (UID: "bb1de861-ab51-437a-a5f7-a6d5454f0360"). InnerVolumeSpecName "kube-api-access-z6vv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.014347 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d2a3210-8646-450a-9f16-cb6471ca857f-kube-api-access-2v5xf" (OuterVolumeSpecName: "kube-api-access-2v5xf") pod "6d2a3210-8646-450a-9f16-cb6471ca857f" (UID: "6d2a3210-8646-450a-9f16-cb6471ca857f"). InnerVolumeSpecName "kube-api-access-2v5xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.014879 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2309ee7-7430-4873-b34e-0428cb08f00c-kube-api-access-bc5qv" (OuterVolumeSpecName: "kube-api-access-bc5qv") pod "f2309ee7-7430-4873-b34e-0428cb08f00c" (UID: "f2309ee7-7430-4873-b34e-0428cb08f00c"). InnerVolumeSpecName "kube-api-access-bc5qv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.111373 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sfq5\" (UniqueName: \"kubernetes.io/projected/adf83e06-10f7-44aa-b33b-36c305016e03-kube-api-access-9sfq5\") pod \"adf83e06-10f7-44aa-b33b-36c305016e03\" (UID: \"adf83e06-10f7-44aa-b33b-36c305016e03\") " Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.111568 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf83e06-10f7-44aa-b33b-36c305016e03-operator-scripts\") pod \"adf83e06-10f7-44aa-b33b-36c305016e03\" (UID: \"adf83e06-10f7-44aa-b33b-36c305016e03\") " Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.112125 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb1de861-ab51-437a-a5f7-a6d5454f0360-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.112155 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v75nf\" (UniqueName: \"kubernetes.io/projected/24104846-9490-409f-b1be-cb3b84eb2d11-kube-api-access-v75nf\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.112169 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24104846-9490-409f-b1be-cb3b84eb2d11-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.112181 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v5xf\" (UniqueName: \"kubernetes.io/projected/6d2a3210-8646-450a-9f16-cb6471ca857f-kube-api-access-2v5xf\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.112196 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2309ee7-7430-4873-b34e-0428cb08f00c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.112208 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d2a3210-8646-450a-9f16-cb6471ca857f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.112219 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc5qv\" (UniqueName: \"kubernetes.io/projected/f2309ee7-7430-4873-b34e-0428cb08f00c-kube-api-access-bc5qv\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.112211 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf83e06-10f7-44aa-b33b-36c305016e03-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "adf83e06-10f7-44aa-b33b-36c305016e03" (UID: "adf83e06-10f7-44aa-b33b-36c305016e03"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.112229 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6vv9\" (UniqueName: \"kubernetes.io/projected/bb1de861-ab51-437a-a5f7-a6d5454f0360-kube-api-access-z6vv9\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.117865 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf83e06-10f7-44aa-b33b-36c305016e03-kube-api-access-9sfq5" (OuterVolumeSpecName: "kube-api-access-9sfq5") pod "adf83e06-10f7-44aa-b33b-36c305016e03" (UID: "adf83e06-10f7-44aa-b33b-36c305016e03"). InnerVolumeSpecName "kube-api-access-9sfq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.200676 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-s8w5s" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.200693 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-s8w5s" event={"ID":"6d2a3210-8646-450a-9f16-cb6471ca857f","Type":"ContainerDied","Data":"56689888467a7ca3c427bfe80c640c9e02950d495f0ce1148ed9b2d4d14b7660"} Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.200764 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56689888467a7ca3c427bfe80c640c9e02950d495f0ce1148ed9b2d4d14b7660" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.203104 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-v48zv" event={"ID":"adf83e06-10f7-44aa-b33b-36c305016e03","Type":"ContainerDied","Data":"2c71b7bb2c91301cdc382c8800d445ec52a0fc8c15e91a8541fac07ed697360d"} Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.203128 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-v48zv" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.203155 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c71b7bb2c91301cdc382c8800d445ec52a0fc8c15e91a8541fac07ed697360d" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.206766 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-15d6-account-create-update-9wmxq" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.206800 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-15d6-account-create-update-9wmxq" event={"ID":"8d77f7ec-5822-4fd0-b935-09f0f77ce1ee","Type":"ContainerDied","Data":"f4e59f76f04c05bba68a53bd6ed7e021a6507b9c39f7429eb23aa9506b62a3a4"} Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.206854 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4e59f76f04c05bba68a53bd6ed7e021a6507b9c39f7429eb23aa9506b62a3a4" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.209296 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-edba-account-create-update-v69dv" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.209286 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-edba-account-create-update-v69dv" event={"ID":"bb1de861-ab51-437a-a5f7-a6d5454f0360","Type":"ContainerDied","Data":"0a42415c63808a5ac72c3e3defb6280656b20798082607bdd68e612ec5aa7249"} Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.209960 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a42415c63808a5ac72c3e3defb6280656b20798082607bdd68e612ec5aa7249" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.214496 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kffrg" event={"ID":"24104846-9490-409f-b1be-cb3b84eb2d11","Type":"ContainerDied","Data":"a6cb1efc1790683c817e27033b73add5d89caa59e6fb6861c7530a3aca24e4d3"} Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.214562 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6cb1efc1790683c817e27033b73add5d89caa59e6fb6861c7530a3aca24e4d3" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.214514 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf83e06-10f7-44aa-b33b-36c305016e03-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.214661 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sfq5\" (UniqueName: \"kubernetes.io/projected/adf83e06-10f7-44aa-b33b-36c305016e03-kube-api-access-9sfq5\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.214749 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kffrg" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.219108 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-31a9-account-create-update-dwhfh" event={"ID":"f2309ee7-7430-4873-b34e-0428cb08f00c","Type":"ContainerDied","Data":"6a7d7f54aef4a689f5cbe341ca11791da2cdc42a82a0a7e524e2010b28bbae90"} Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.219163 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a7d7f54aef4a689f5cbe341ca11791da2cdc42a82a0a7e524e2010b28bbae90" Dec 05 22:19:16 crc kubenswrapper[4747]: I1205 22:19:16.219233 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-31a9-account-create-update-dwhfh" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.760754 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-trx7x"] Dec 05 22:19:17 crc kubenswrapper[4747]: E1205 22:19:17.762074 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24104846-9490-409f-b1be-cb3b84eb2d11" containerName="mariadb-database-create" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.762143 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="24104846-9490-409f-b1be-cb3b84eb2d11" containerName="mariadb-database-create" Dec 05 22:19:17 crc kubenswrapper[4747]: E1205 22:19:17.762256 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2a3210-8646-450a-9f16-cb6471ca857f" containerName="mariadb-database-create" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.762307 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2a3210-8646-450a-9f16-cb6471ca857f" containerName="mariadb-database-create" Dec 05 22:19:17 crc kubenswrapper[4747]: E1205 22:19:17.762362 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf83e06-10f7-44aa-b33b-36c305016e03" containerName="mariadb-database-create" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.762410 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf83e06-10f7-44aa-b33b-36c305016e03" containerName="mariadb-database-create" Dec 05 22:19:17 crc kubenswrapper[4747]: E1205 22:19:17.762464 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2309ee7-7430-4873-b34e-0428cb08f00c" containerName="mariadb-account-create-update" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.762511 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2309ee7-7430-4873-b34e-0428cb08f00c" containerName="mariadb-account-create-update" Dec 05 22:19:17 crc kubenswrapper[4747]: E1205 22:19:17.762621 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1de861-ab51-437a-a5f7-a6d5454f0360" containerName="mariadb-account-create-update" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.762741 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1de861-ab51-437a-a5f7-a6d5454f0360" containerName="mariadb-account-create-update" Dec 05 22:19:17 crc kubenswrapper[4747]: E1205 22:19:17.762798 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d77f7ec-5822-4fd0-b935-09f0f77ce1ee" containerName="mariadb-account-create-update" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.762845 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d77f7ec-5822-4fd0-b935-09f0f77ce1ee" containerName="mariadb-account-create-update" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.763066 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf83e06-10f7-44aa-b33b-36c305016e03" containerName="mariadb-database-create" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.763130 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d2a3210-8646-450a-9f16-cb6471ca857f" containerName="mariadb-database-create" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.763192 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb1de861-ab51-437a-a5f7-a6d5454f0360" containerName="mariadb-account-create-update" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.763244 4747 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f2309ee7-7430-4873-b34e-0428cb08f00c" containerName="mariadb-account-create-update" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.763306 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="24104846-9490-409f-b1be-cb3b84eb2d11" containerName="mariadb-database-create" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.763360 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d77f7ec-5822-4fd0-b935-09f0f77ce1ee" containerName="mariadb-account-create-update" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.764039 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.766281 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.766976 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.767300 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-44ngb" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.780448 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-trx7x"] Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.848636 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg45r\" (UniqueName: \"kubernetes.io/projected/0726b010-6142-410a-a353-d9c89ea8f724-kube-api-access-dg45r\") pod \"nova-cell0-conductor-db-sync-trx7x\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.848729 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-config-data\") pod \"nova-cell0-conductor-db-sync-trx7x\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.849054 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-scripts\") pod \"nova-cell0-conductor-db-sync-trx7x\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.849175 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-trx7x\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.950685 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg45r\" (UniqueName: \"kubernetes.io/projected/0726b010-6142-410a-a353-d9c89ea8f724-kube-api-access-dg45r\") pod \"nova-cell0-conductor-db-sync-trx7x\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.950791 
4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-config-data\") pod \"nova-cell0-conductor-db-sync-trx7x\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.950933 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-scripts\") pod \"nova-cell0-conductor-db-sync-trx7x\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.950977 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-trx7x\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.955111 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-scripts\") pod \"nova-cell0-conductor-db-sync-trx7x\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.955380 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-config-data\") pod \"nova-cell0-conductor-db-sync-trx7x\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.955950 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-trx7x\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:17 crc kubenswrapper[4747]: I1205 22:19:17.968780 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg45r\" (UniqueName: \"kubernetes.io/projected/0726b010-6142-410a-a353-d9c89ea8f724-kube-api-access-dg45r\") pod \"nova-cell0-conductor-db-sync-trx7x\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:18 crc kubenswrapper[4747]: I1205 22:19:18.086871 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:18 crc kubenswrapper[4747]: I1205 22:19:18.591748 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-trx7x"] Dec 05 22:19:18 crc kubenswrapper[4747]: W1205 22:19:18.609492 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0726b010_6142_410a_a353_d9c89ea8f724.slice/crio-ae7b8c366b9b9695ff34ce6118814b08281f2c247b11bf11e7bcd6e30c5d7ab9 WatchSource:0}: Error finding container ae7b8c366b9b9695ff34ce6118814b08281f2c247b11bf11e7bcd6e30c5d7ab9: Status 404 returned error can't find the container with id ae7b8c366b9b9695ff34ce6118814b08281f2c247b11bf11e7bcd6e30c5d7ab9 Dec 05 22:19:19 crc kubenswrapper[4747]: I1205 22:19:19.258046 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-trx7x" event={"ID":"0726b010-6142-410a-a353-d9c89ea8f724","Type":"ContainerStarted","Data":"4b2064791b36d4533cf9e76e370f1a3b342bfe727189bb06b13926554f127987"} Dec 05 22:19:19 crc kubenswrapper[4747]: I1205 22:19:19.258371 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-trx7x" event={"ID":"0726b010-6142-410a-a353-d9c89ea8f724","Type":"ContainerStarted","Data":"ae7b8c366b9b9695ff34ce6118814b08281f2c247b11bf11e7bcd6e30c5d7ab9"} Dec 05 22:19:19 crc kubenswrapper[4747]: I1205 22:19:19.279511 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-trx7x" podStartSLOduration=2.279479688 podStartE2EDuration="2.279479688s" podCreationTimestamp="2025-12-05 22:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:19:19.276085524 +0000 UTC m=+5829.743393032" watchObservedRunningTime="2025-12-05 22:19:19.279479688 +0000 UTC m=+5829.746787176" Dec 05 22:19:21 crc kubenswrapper[4747]: I1205 22:19:21.840712 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:19:21 crc kubenswrapper[4747]: E1205 22:19:21.841691 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:19:24 crc kubenswrapper[4747]: I1205 22:19:24.353318 4747 generic.go:334] "Generic (PLEG): container finished" podID="0726b010-6142-410a-a353-d9c89ea8f724" containerID="4b2064791b36d4533cf9e76e370f1a3b342bfe727189bb06b13926554f127987" exitCode=0 Dec 05 22:19:24 crc kubenswrapper[4747]: I1205 22:19:24.353383 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-trx7x" event={"ID":"0726b010-6142-410a-a353-d9c89ea8f724","Type":"ContainerDied","Data":"4b2064791b36d4533cf9e76e370f1a3b342bfe727189bb06b13926554f127987"} Dec 05 22:19:25 crc kubenswrapper[4747]: I1205 22:19:25.720815 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:25 crc kubenswrapper[4747]: I1205 22:19:25.732127 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg45r\" (UniqueName: \"kubernetes.io/projected/0726b010-6142-410a-a353-d9c89ea8f724-kube-api-access-dg45r\") pod \"0726b010-6142-410a-a353-d9c89ea8f724\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " Dec 05 22:19:25 crc kubenswrapper[4747]: I1205 22:19:25.732286 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-combined-ca-bundle\") pod \"0726b010-6142-410a-a353-d9c89ea8f724\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " Dec 05 22:19:25 crc kubenswrapper[4747]: I1205 22:19:25.732342 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-scripts\") pod \"0726b010-6142-410a-a353-d9c89ea8f724\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " Dec 05 22:19:25 crc kubenswrapper[4747]: I1205 22:19:25.732393 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-config-data\") pod \"0726b010-6142-410a-a353-d9c89ea8f724\" (UID: \"0726b010-6142-410a-a353-d9c89ea8f724\") " Dec 05 22:19:25 crc kubenswrapper[4747]: I1205 22:19:25.740211 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-scripts" (OuterVolumeSpecName: "scripts") pod "0726b010-6142-410a-a353-d9c89ea8f724" (UID: "0726b010-6142-410a-a353-d9c89ea8f724"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:25 crc kubenswrapper[4747]: I1205 22:19:25.745378 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0726b010-6142-410a-a353-d9c89ea8f724-kube-api-access-dg45r" (OuterVolumeSpecName: "kube-api-access-dg45r") pod "0726b010-6142-410a-a353-d9c89ea8f724" (UID: "0726b010-6142-410a-a353-d9c89ea8f724"). InnerVolumeSpecName "kube-api-access-dg45r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:19:25 crc kubenswrapper[4747]: I1205 22:19:25.778509 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-config-data" (OuterVolumeSpecName: "config-data") pod "0726b010-6142-410a-a353-d9c89ea8f724" (UID: "0726b010-6142-410a-a353-d9c89ea8f724"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:25 crc kubenswrapper[4747]: I1205 22:19:25.801735 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0726b010-6142-410a-a353-d9c89ea8f724" (UID: "0726b010-6142-410a-a353-d9c89ea8f724"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:25 crc kubenswrapper[4747]: I1205 22:19:25.834460 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg45r\" (UniqueName: \"kubernetes.io/projected/0726b010-6142-410a-a353-d9c89ea8f724-kube-api-access-dg45r\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:25 crc kubenswrapper[4747]: I1205 22:19:25.834506 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:25 crc kubenswrapper[4747]: I1205 22:19:25.834519 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:25 crc kubenswrapper[4747]: I1205 22:19:25.834531 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0726b010-6142-410a-a353-d9c89ea8f724-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.378380 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-trx7x" event={"ID":"0726b010-6142-410a-a353-d9c89ea8f724","Type":"ContainerDied","Data":"ae7b8c366b9b9695ff34ce6118814b08281f2c247b11bf11e7bcd6e30c5d7ab9"} Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.378461 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae7b8c366b9b9695ff34ce6118814b08281f2c247b11bf11e7bcd6e30c5d7ab9" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.378508 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-trx7x" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.467511 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 22:19:26 crc kubenswrapper[4747]: E1205 22:19:26.467889 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0726b010-6142-410a-a353-d9c89ea8f724" containerName="nova-cell0-conductor-db-sync" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.467904 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0726b010-6142-410a-a353-d9c89ea8f724" containerName="nova-cell0-conductor-db-sync" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.468075 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0726b010-6142-410a-a353-d9c89ea8f724" containerName="nova-cell0-conductor-db-sync" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.468675 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.470764 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.470940 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-44ngb" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.523136 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.546916 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"46ffb070-be50-4f11-a1a6-ab2c958f1fb1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.546999 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvmjg\" (UniqueName: \"kubernetes.io/projected/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-kube-api-access-fvmjg\") pod \"nova-cell0-conductor-0\" (UID: \"46ffb070-be50-4f11-a1a6-ab2c958f1fb1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.547111 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"46ffb070-be50-4f11-a1a6-ab2c958f1fb1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.650671 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"46ffb070-be50-4f11-a1a6-ab2c958f1fb1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.650837 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvmjg\" (UniqueName: \"kubernetes.io/projected/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-kube-api-access-fvmjg\") pod \"nova-cell0-conductor-0\" (UID: \"46ffb070-be50-4f11-a1a6-ab2c958f1fb1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.650926 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"46ffb070-be50-4f11-a1a6-ab2c958f1fb1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.656102 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"46ffb070-be50-4f11-a1a6-ab2c958f1fb1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.663926 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"46ffb070-be50-4f11-a1a6-ab2c958f1fb1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.667244 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvmjg\" (UniqueName: \"kubernetes.io/projected/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-kube-api-access-fvmjg\") pod \"nova-cell0-conductor-0\" (UID: \"46ffb070-be50-4f11-a1a6-ab2c958f1fb1\") " pod="openstack/nova-cell0-conductor-0" Dec 05 22:19:26 crc kubenswrapper[4747]: I1205 22:19:26.831387 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 22:19:27 crc kubenswrapper[4747]: I1205 22:19:27.313983 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 22:19:27 crc kubenswrapper[4747]: I1205 22:19:27.389728 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"46ffb070-be50-4f11-a1a6-ab2c958f1fb1","Type":"ContainerStarted","Data":"2240df827285bdd5668868ad97c3418945219f11ed96dc942bd3d01efee0f121"} Dec 05 22:19:28 crc kubenswrapper[4747]: I1205 22:19:28.408724 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"46ffb070-be50-4f11-a1a6-ab2c958f1fb1","Type":"ContainerStarted","Data":"f54319f260021d4cfd8b47e01a0e119217532abc6e284ee4505758961ce2bea9"} Dec 05 22:19:28 crc kubenswrapper[4747]: I1205 22:19:28.409130 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 05 22:19:28 crc kubenswrapper[4747]: I1205 22:19:28.434320 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.434286852 podStartE2EDuration="2.434286852s" podCreationTimestamp="2025-12-05 22:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:19:28.429696358 +0000 UTC m=+5838.897003866" watchObservedRunningTime="2025-12-05 22:19:28.434286852 +0000 UTC m=+5838.901594390" Dec 05 22:19:34 crc kubenswrapper[4747]: I1205 22:19:34.840067 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:19:34 crc kubenswrapper[4747]: E1205 22:19:34.841308 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:19:36 crc kubenswrapper[4747]: I1205 22:19:36.878476 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.382551 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-k8mwz"] Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.385307 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.390606 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.390859 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.392067 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-k8mwz"] Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.481640 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-scripts\") pod \"nova-cell0-cell-mapping-k8mwz\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.481861 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-config-data\") pod \"nova-cell0-cell-mapping-k8mwz\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.481969 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-k8mwz\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.482140 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwtbn\" (UniqueName: \"kubernetes.io/projected/d8688ad0-be6e-4597-9557-5c9405b2c2a8-kube-api-access-fwtbn\") pod \"nova-cell0-cell-mapping-k8mwz\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.522133 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.523699 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.531389 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.538019 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.585530 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-scripts\") pod \"nova-cell0-cell-mapping-k8mwz\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.585937 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-config-data\") pod \"nova-cell0-cell-mapping-k8mwz\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.586058 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-k8mwz\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.586173 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8a4816-7cdb-40db-86d6-e56911f783c8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cd8a4816-7cdb-40db-86d6-e56911f783c8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.586305 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbtqx\" (UniqueName: \"kubernetes.io/projected/cd8a4816-7cdb-40db-86d6-e56911f783c8-kube-api-access-vbtqx\") pod \"nova-cell1-novncproxy-0\" (UID: \"cd8a4816-7cdb-40db-86d6-e56911f783c8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.586471 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwtbn\" (UniqueName: \"kubernetes.io/projected/d8688ad0-be6e-4597-9557-5c9405b2c2a8-kube-api-access-fwtbn\") pod \"nova-cell0-cell-mapping-k8mwz\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.586632 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8a4816-7cdb-40db-86d6-e56911f783c8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cd8a4816-7cdb-40db-86d6-e56911f783c8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.597399 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-config-data\") pod \"nova-cell0-cell-mapping-k8mwz\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:37 
crc kubenswrapper[4747]: I1205 22:19:37.602362 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-k8mwz\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.618168 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-scripts\") pod \"nova-cell0-cell-mapping-k8mwz\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.639380 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwtbn\" (UniqueName: \"kubernetes.io/projected/d8688ad0-be6e-4597-9557-5c9405b2c2a8-kube-api-access-fwtbn\") pod \"nova-cell0-cell-mapping-k8mwz\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.665407 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.666988 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.679097 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.688391 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8a4816-7cdb-40db-86d6-e56911f783c8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cd8a4816-7cdb-40db-86d6-e56911f783c8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.688451 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbtqx\" (UniqueName: \"kubernetes.io/projected/cd8a4816-7cdb-40db-86d6-e56911f783c8-kube-api-access-vbtqx\") pod \"nova-cell1-novncproxy-0\" (UID: \"cd8a4816-7cdb-40db-86d6-e56911f783c8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.688501 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e56368a4-3747-4184-ade6-e1224146cdb2-config-data\") pod \"nova-scheduler-0\" (UID: \"e56368a4-3747-4184-ade6-e1224146cdb2\") " pod="openstack/nova-scheduler-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.688555 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8a4816-7cdb-40db-86d6-e56911f783c8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cd8a4816-7cdb-40db-86d6-e56911f783c8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.688593 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56368a4-3747-4184-ade6-e1224146cdb2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e56368a4-3747-4184-ade6-e1224146cdb2\") " pod="openstack/nova-scheduler-0" Dec 05 22:19:37 
crc kubenswrapper[4747]: I1205 22:19:37.688714 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k5dp\" (UniqueName: \"kubernetes.io/projected/e56368a4-3747-4184-ade6-e1224146cdb2-kube-api-access-9k5dp\") pod \"nova-scheduler-0\" (UID: \"e56368a4-3747-4184-ade6-e1224146cdb2\") " pod="openstack/nova-scheduler-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.692428 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8a4816-7cdb-40db-86d6-e56911f783c8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cd8a4816-7cdb-40db-86d6-e56911f783c8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.695145 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8a4816-7cdb-40db-86d6-e56911f783c8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cd8a4816-7cdb-40db-86d6-e56911f783c8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.704128 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.715011 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.753328 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbtqx\" (UniqueName: \"kubernetes.io/projected/cd8a4816-7cdb-40db-86d6-e56911f783c8-kube-api-access-vbtqx\") pod \"nova-cell1-novncproxy-0\" (UID: \"cd8a4816-7cdb-40db-86d6-e56911f783c8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.790243 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56368a4-3747-4184-ade6-e1224146cdb2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e56368a4-3747-4184-ade6-e1224146cdb2\") " pod="openstack/nova-scheduler-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.790688 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k5dp\" (UniqueName: \"kubernetes.io/projected/e56368a4-3747-4184-ade6-e1224146cdb2-kube-api-access-9k5dp\") pod \"nova-scheduler-0\" (UID: \"e56368a4-3747-4184-ade6-e1224146cdb2\") " pod="openstack/nova-scheduler-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.790791 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e56368a4-3747-4184-ade6-e1224146cdb2-config-data\") pod \"nova-scheduler-0\" (UID: \"e56368a4-3747-4184-ade6-e1224146cdb2\") " pod="openstack/nova-scheduler-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.796341 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56368a4-3747-4184-ade6-e1224146cdb2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e56368a4-3747-4184-ade6-e1224146cdb2\") " pod="openstack/nova-scheduler-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.803219 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e56368a4-3747-4184-ade6-e1224146cdb2-config-data\") pod 
\"nova-scheduler-0\" (UID: \"e56368a4-3747-4184-ade6-e1224146cdb2\") " pod="openstack/nova-scheduler-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.847306 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.908403 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k5dp\" (UniqueName: \"kubernetes.io/projected/e56368a4-3747-4184-ade6-e1224146cdb2-kube-api-access-9k5dp\") pod \"nova-scheduler-0\" (UID: \"e56368a4-3747-4184-ade6-e1224146cdb2\") " pod="openstack/nova-scheduler-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.947726 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.957104 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.983358 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.983456 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.989772 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:19:37 crc kubenswrapper[4747]: I1205 22:19:37.994294 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.000193 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.000789 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.035616 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.109547 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-869b45bd99-2zfkl"] Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.111238 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.115732 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588f49ba-66cf-4855-a84b-341e5b7b7b78-config-data\") pod \"nova-api-0\" (UID: \"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " pod="openstack/nova-api-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.115781 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/588f49ba-66cf-4855-a84b-341e5b7b7b78-logs\") pod \"nova-api-0\" (UID: \"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " pod="openstack/nova-api-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.115815 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frr4d\" (UniqueName: \"kubernetes.io/projected/7a9d3e33-541c-429a-83e9-8042cf36b499-kube-api-access-frr4d\") pod \"nova-metadata-0\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " pod="openstack/nova-metadata-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.115833 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a9d3e33-541c-429a-83e9-8042cf36b499-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " pod="openstack/nova-metadata-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.115881 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkfp9\" (UniqueName: \"kubernetes.io/projected/588f49ba-66cf-4855-a84b-341e5b7b7b78-kube-api-access-mkfp9\") pod \"nova-api-0\" (UID: \"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " pod="openstack/nova-api-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.115923 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588f49ba-66cf-4855-a84b-341e5b7b7b78-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " pod="openstack/nova-api-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.115955 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a9d3e33-541c-429a-83e9-8042cf36b499-logs\") pod \"nova-metadata-0\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " pod="openstack/nova-metadata-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.116003 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a9d3e33-541c-429a-83e9-8042cf36b499-config-data\") pod \"nova-metadata-0\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " pod="openstack/nova-metadata-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.139357 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869b45bd99-2zfkl"] Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.220011 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588f49ba-66cf-4855-a84b-341e5b7b7b78-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " pod="openstack/nova-api-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.220064 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a9d3e33-541c-429a-83e9-8042cf36b499-logs\") pod \"nova-metadata-0\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " pod="openstack/nova-metadata-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.220120 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4tzr\" (UniqueName: \"kubernetes.io/projected/f7b89dd1-f703-45f9-8a1a-47263f26a301-kube-api-access-d4tzr\") pod \"dnsmasq-dns-869b45bd99-2zfkl\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") " pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.220145 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a9d3e33-541c-429a-83e9-8042cf36b499-config-data\") pod \"nova-metadata-0\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " pod="openstack/nova-metadata-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.220165 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-ovsdbserver-sb\") pod \"dnsmasq-dns-869b45bd99-2zfkl\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") " pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.220205 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-ovsdbserver-nb\") pod \"dnsmasq-dns-869b45bd99-2zfkl\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") " pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.220244 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588f49ba-66cf-4855-a84b-341e5b7b7b78-config-data\") pod \"nova-api-0\" (UID: \"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " pod="openstack/nova-api-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.220262 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/588f49ba-66cf-4855-a84b-341e5b7b7b78-logs\") pod \"nova-api-0\" (UID: \"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " pod="openstack/nova-api-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.220284 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frr4d\" (UniqueName: \"kubernetes.io/projected/7a9d3e33-541c-429a-83e9-8042cf36b499-kube-api-access-frr4d\") pod \"nova-metadata-0\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " pod="openstack/nova-metadata-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.220301 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a9d3e33-541c-429a-83e9-8042cf36b499-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " pod="openstack/nova-metadata-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.220327 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-dns-svc\") pod \"dnsmasq-dns-869b45bd99-2zfkl\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") " pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.220351 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkfp9\" (UniqueName: \"kubernetes.io/projected/588f49ba-66cf-4855-a84b-341e5b7b7b78-kube-api-access-mkfp9\") pod \"nova-api-0\" (UID: \"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " pod="openstack/nova-api-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.220382 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-config\") pod \"dnsmasq-dns-869b45bd99-2zfkl\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") " pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.222991 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a9d3e33-541c-429a-83e9-8042cf36b499-logs\") pod \"nova-metadata-0\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " pod="openstack/nova-metadata-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.223406 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/588f49ba-66cf-4855-a84b-341e5b7b7b78-logs\") pod \"nova-api-0\" (UID: \"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " pod="openstack/nova-api-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.225496 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588f49ba-66cf-4855-a84b-341e5b7b7b78-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " pod="openstack/nova-api-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.226119 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a9d3e33-541c-429a-83e9-8042cf36b499-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " pod="openstack/nova-metadata-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.227446 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a9d3e33-541c-429a-83e9-8042cf36b499-config-data\") pod \"nova-metadata-0\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " pod="openstack/nova-metadata-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.228681 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588f49ba-66cf-4855-a84b-341e5b7b7b78-config-data\") pod \"nova-api-0\" (UID: \"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " pod="openstack/nova-api-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.243769 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frr4d\" (UniqueName: \"kubernetes.io/projected/7a9d3e33-541c-429a-83e9-8042cf36b499-kube-api-access-frr4d\") pod \"nova-metadata-0\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " pod="openstack/nova-metadata-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.244413 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkfp9\" (UniqueName: \"kubernetes.io/projected/588f49ba-66cf-4855-a84b-341e5b7b7b78-kube-api-access-mkfp9\") pod \"nova-api-0\" (UID: \"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " pod="openstack/nova-api-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.317888 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.321759 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4tzr\" (UniqueName: \"kubernetes.io/projected/f7b89dd1-f703-45f9-8a1a-47263f26a301-kube-api-access-d4tzr\") pod \"dnsmasq-dns-869b45bd99-2zfkl\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") " pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.321808 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-ovsdbserver-sb\") pod \"dnsmasq-dns-869b45bd99-2zfkl\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") " pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.321850 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-ovsdbserver-nb\") pod \"dnsmasq-dns-869b45bd99-2zfkl\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") " pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.321893 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-dns-svc\") pod \"dnsmasq-dns-869b45bd99-2zfkl\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") " pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.321932 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-config\") pod \"dnsmasq-dns-869b45bd99-2zfkl\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") " pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.323760 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-ovsdbserver-sb\") pod \"dnsmasq-dns-869b45bd99-2zfkl\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") " pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.323820 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-config\") pod \"dnsmasq-dns-869b45bd99-2zfkl\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") " pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.323970 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-ovsdbserver-nb\") pod \"dnsmasq-dns-869b45bd99-2zfkl\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") " pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: 
I1205 22:19:38.324270 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-dns-svc\") pod \"dnsmasq-dns-869b45bd99-2zfkl\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") " pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.331092 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.352065 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4tzr\" (UniqueName: \"kubernetes.io/projected/f7b89dd1-f703-45f9-8a1a-47263f26a301-kube-api-access-d4tzr\") pod \"dnsmasq-dns-869b45bd99-2zfkl\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") " pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.447049 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.486964 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-k8mwz"] Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.548265 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k8mwz" event={"ID":"d8688ad0-be6e-4597-9557-5c9405b2c2a8","Type":"ContainerStarted","Data":"19a375b15b7ac3c697816aa7c30959e7d2aafaa1466cc08e3bf57551e9867a43"} Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.659742 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 22:19:38 crc kubenswrapper[4747]: W1205 22:19:38.683453 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd8a4816_7cdb_40db_86d6_e56911f783c8.slice/crio-c6dc3dc04559532daec712a2335d267b6e6fc9e6cc4c79cbac3e5d3b79c344cb WatchSource:0}: Error finding container c6dc3dc04559532daec712a2335d267b6e6fc9e6cc4c79cbac3e5d3b79c344cb: Status 404 returned error can't find the container with id c6dc3dc04559532daec712a2335d267b6e6fc9e6cc4c79cbac3e5d3b79c344cb Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.731264 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 22:19:38 crc kubenswrapper[4747]: W1205 22:19:38.758088 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode56368a4_3747_4184_ade6_e1224146cdb2.slice/crio-1f1c626ee17249e40d1911edcf6fbbd249421c8b7ebd8cc12d44263edad00c1d WatchSource:0}: Error finding container 1f1c626ee17249e40d1911edcf6fbbd249421c8b7ebd8cc12d44263edad00c1d: Status 404 returned error can't find the container with id 1f1c626ee17249e40d1911edcf6fbbd249421c8b7ebd8cc12d44263edad00c1d Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.764080 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-85bb5"] Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.771273 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.775378 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.775703 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.806629 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-85bb5"] Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.937052 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-config-data\") pod \"nova-cell1-conductor-db-sync-85bb5\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.937124 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-85bb5\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.937156 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-scripts\") pod \"nova-cell1-conductor-db-sync-85bb5\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.937201 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8lwc\" (UniqueName: \"kubernetes.io/projected/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-kube-api-access-m8lwc\") pod \"nova-cell1-conductor-db-sync-85bb5\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.966454 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 22:19:38 crc kubenswrapper[4747]: I1205 22:19:38.982539 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.039172 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-config-data\") pod \"nova-cell1-conductor-db-sync-85bb5\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.039299 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-85bb5\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.039388 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-scripts\") pod \"nova-cell1-conductor-db-sync-85bb5\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.039468 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8lwc\" (UniqueName: \"kubernetes.io/projected/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-kube-api-access-m8lwc\") pod \"nova-cell1-conductor-db-sync-85bb5\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.045976 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-85bb5\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.046140 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-config-data\") pod \"nova-cell1-conductor-db-sync-85bb5\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.046845 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-scripts\") pod \"nova-cell1-conductor-db-sync-85bb5\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.067687 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8lwc\" (UniqueName: \"kubernetes.io/projected/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-kube-api-access-m8lwc\") pod \"nova-cell1-conductor-db-sync-85bb5\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.117632 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869b45bd99-2zfkl"] Dec 05 22:19:39 crc kubenswrapper[4747]: W1205 22:19:39.118719 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7b89dd1_f703_45f9_8a1a_47263f26a301.slice/crio-de538b2e091e47149ccb0fff740cec6e0fb2307a8920ff4f58096f86bdaf167a WatchSource:0}: Error finding container de538b2e091e47149ccb0fff740cec6e0fb2307a8920ff4f58096f86bdaf167a: Status 404 returned error can't find the container with id de538b2e091e47149ccb0fff740cec6e0fb2307a8920ff4f58096f86bdaf167a Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.163715 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.570464 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e56368a4-3747-4184-ade6-e1224146cdb2","Type":"ContainerStarted","Data":"5e46c6e687155dc6d0c77d64d2ce50da7424e67938411267722bd9677624b3a0"} Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.571203 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e56368a4-3747-4184-ade6-e1224146cdb2","Type":"ContainerStarted","Data":"1f1c626ee17249e40d1911edcf6fbbd249421c8b7ebd8cc12d44263edad00c1d"} Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.579507 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"588f49ba-66cf-4855-a84b-341e5b7b7b78","Type":"ContainerStarted","Data":"5f15be8984645f526ff4a9244bb16a7027464c168ef16c84d57b3ff473de2fe4"} Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.579602 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"588f49ba-66cf-4855-a84b-341e5b7b7b78","Type":"ContainerStarted","Data":"56ddedcf05ab31e2e458fc56d7333e1c6f41cedd15c427f2e1724672f8c9ecca"} Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.579617 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"588f49ba-66cf-4855-a84b-341e5b7b7b78","Type":"ContainerStarted","Data":"1868f13456b8bdbd7148532588bf3314cd343f35725c47dfa3a3657ae2927df6"} Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.583132 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cd8a4816-7cdb-40db-86d6-e56911f783c8","Type":"ContainerStarted","Data":"4cc4b23e304acdc48857ee8561556697fc952931ae71b83ce9c38e5ee82da8d0"} Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.583175 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cd8a4816-7cdb-40db-86d6-e56911f783c8","Type":"ContainerStarted","Data":"c6dc3dc04559532daec712a2335d267b6e6fc9e6cc4c79cbac3e5d3b79c344cb"} Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.591619 4747 generic.go:334] "Generic (PLEG): container finished" podID="f7b89dd1-f703-45f9-8a1a-47263f26a301" containerID="adb75134d7ee93ec78dc90415839a2b47272457c62fe2ce2a7b17601e84a5769" exitCode=0 Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.591735 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" event={"ID":"f7b89dd1-f703-45f9-8a1a-47263f26a301","Type":"ContainerDied","Data":"adb75134d7ee93ec78dc90415839a2b47272457c62fe2ce2a7b17601e84a5769"} Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.591768 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" event={"ID":"f7b89dd1-f703-45f9-8a1a-47263f26a301","Type":"ContainerStarted","Data":"de538b2e091e47149ccb0fff740cec6e0fb2307a8920ff4f58096f86bdaf167a"} Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.597901 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.597878322 podStartE2EDuration="2.597878322s" podCreationTimestamp="2025-12-05 22:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:19:39.592782976 +0000 UTC m=+5850.060090464" 
watchObservedRunningTime="2025-12-05 22:19:39.597878322 +0000 UTC m=+5850.065185830" Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.603981 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k8mwz" event={"ID":"d8688ad0-be6e-4597-9557-5c9405b2c2a8","Type":"ContainerStarted","Data":"14af5fea711bf2cb847f9d0ce29f78277e6069d1808621cbdd0445faaed4b373"} Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.606792 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a9d3e33-541c-429a-83e9-8042cf36b499","Type":"ContainerStarted","Data":"f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db"} Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.606821 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a9d3e33-541c-429a-83e9-8042cf36b499","Type":"ContainerStarted","Data":"7e863db875955ce99adf91371678e8e8754fef673848f64cb366185716f37c02"} Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.617424 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.6174070560000002 podStartE2EDuration="2.617407056s" podCreationTimestamp="2025-12-05 22:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:19:39.612979276 +0000 UTC m=+5850.080286764" watchObservedRunningTime="2025-12-05 22:19:39.617407056 +0000 UTC m=+5850.084714544" Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.725065 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-85bb5"] Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.725417 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.725399913 podStartE2EDuration="2.725399913s" podCreationTimestamp="2025-12-05 22:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:19:39.71518669 +0000 UTC m=+5850.182494178" watchObservedRunningTime="2025-12-05 22:19:39.725399913 +0000 UTC m=+5850.192707401" Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.785165 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-k8mwz" podStartSLOduration=2.785140453 podStartE2EDuration="2.785140453s" podCreationTimestamp="2025-12-05 22:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:19:39.739848101 +0000 UTC m=+5850.207155599" watchObservedRunningTime="2025-12-05 22:19:39.785140453 +0000 UTC m=+5850.252447941" Dec 05 22:19:39 crc kubenswrapper[4747]: I1205 22:19:39.901349 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.901319703 podStartE2EDuration="2.901319703s" podCreationTimestamp="2025-12-05 22:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:19:39.792316031 +0000 UTC m=+5850.259623519" watchObservedRunningTime="2025-12-05 22:19:39.901319703 +0000 UTC m=+5850.368627191" Dec 05 22:19:40 crc kubenswrapper[4747]: I1205 22:19:40.618055 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" event={"ID":"f7b89dd1-f703-45f9-8a1a-47263f26a301","Type":"ContainerStarted","Data":"605c7ea5265f53e9117b4d118bd0ff140e9967b65ef6a278a5f1d3195fb38c23"} Dec 05 22:19:40 crc kubenswrapper[4747]: I1205 22:19:40.618363 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:40 crc kubenswrapper[4747]: I1205 22:19:40.620506 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-85bb5" event={"ID":"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a","Type":"ContainerStarted","Data":"48128a43c59fe40eab9735127a94de1219f7001fb84a63e45f5dfbf8e66ad266"} Dec 05 22:19:40 crc kubenswrapper[4747]: I1205 22:19:40.620539 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-85bb5" event={"ID":"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a","Type":"ContainerStarted","Data":"a55181b3fabba8784e383988f399f46ffc9327e9befbb5c43f50d3c11a2b11ad"} Dec 05 22:19:40 crc kubenswrapper[4747]: I1205 22:19:40.623010 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a9d3e33-541c-429a-83e9-8042cf36b499","Type":"ContainerStarted","Data":"b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79"} Dec 05 22:19:40 crc kubenswrapper[4747]: I1205 22:19:40.646313 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" podStartSLOduration=2.646296767 podStartE2EDuration="2.646296767s" podCreationTimestamp="2025-12-05 22:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:19:40.644061731 +0000 UTC m=+5851.111369219" watchObservedRunningTime="2025-12-05 22:19:40.646296767 +0000 UTC m=+5851.113604255" Dec 05 22:19:40 crc kubenswrapper[4747]: I1205 22:19:40.673753 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-85bb5" podStartSLOduration=2.673735877 podStartE2EDuration="2.673735877s" podCreationTimestamp="2025-12-05 22:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:19:40.669804589 +0000 UTC m=+5851.137112077" watchObservedRunningTime="2025-12-05 22:19:40.673735877 +0000 UTC m=+5851.141043355" Dec 05 22:19:42 crc kubenswrapper[4747]: I1205 22:19:42.025247 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:19:42 crc kubenswrapper[4747]: I1205 22:19:42.037898 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 22:19:42 crc kubenswrapper[4747]: I1205 22:19:42.038091 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="cd8a4816-7cdb-40db-86d6-e56911f783c8" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4cc4b23e304acdc48857ee8561556697fc952931ae71b83ce9c38e5ee82da8d0" gracePeriod=30 Dec 05 22:19:42 crc kubenswrapper[4747]: I1205 22:19:42.646899 4747 generic.go:334] "Generic (PLEG): container finished" podID="cd8a4816-7cdb-40db-86d6-e56911f783c8" containerID="4cc4b23e304acdc48857ee8561556697fc952931ae71b83ce9c38e5ee82da8d0" exitCode=0 Dec 05 22:19:42 crc kubenswrapper[4747]: I1205 22:19:42.646965 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cd8a4816-7cdb-40db-86d6-e56911f783c8","Type":"ContainerDied","Data":"4cc4b23e304acdc48857ee8561556697fc952931ae71b83ce9c38e5ee82da8d0"} Dec 05 22:19:42 crc kubenswrapper[4747]: I1205 22:19:42.647088 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7a9d3e33-541c-429a-83e9-8042cf36b499" containerName="nova-metadata-log" containerID="cri-o://f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db" gracePeriod=30 Dec 05 22:19:42 crc kubenswrapper[4747]: I1205 22:19:42.647118 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7a9d3e33-541c-429a-83e9-8042cf36b499" containerName="nova-metadata-metadata" containerID="cri-o://b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79" gracePeriod=30 Dec 05 22:19:42 crc kubenswrapper[4747]: I1205 22:19:42.849648 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:42 crc kubenswrapper[4747]: I1205 22:19:42.947998 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.014968 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.094534 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbtqx\" (UniqueName: \"kubernetes.io/projected/cd8a4816-7cdb-40db-86d6-e56911f783c8-kube-api-access-vbtqx\") pod \"cd8a4816-7cdb-40db-86d6-e56911f783c8\" (UID: \"cd8a4816-7cdb-40db-86d6-e56911f783c8\") " Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.094643 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8a4816-7cdb-40db-86d6-e56911f783c8-config-data\") pod \"cd8a4816-7cdb-40db-86d6-e56911f783c8\" (UID: \"cd8a4816-7cdb-40db-86d6-e56911f783c8\") " Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.094700 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8a4816-7cdb-40db-86d6-e56911f783c8-combined-ca-bundle\") pod \"cd8a4816-7cdb-40db-86d6-e56911f783c8\" (UID: \"cd8a4816-7cdb-40db-86d6-e56911f783c8\") " Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.102934 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd8a4816-7cdb-40db-86d6-e56911f783c8-kube-api-access-vbtqx" (OuterVolumeSpecName: "kube-api-access-vbtqx") pod "cd8a4816-7cdb-40db-86d6-e56911f783c8" (UID: "cd8a4816-7cdb-40db-86d6-e56911f783c8"). InnerVolumeSpecName "kube-api-access-vbtqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.132169 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd8a4816-7cdb-40db-86d6-e56911f783c8-config-data" (OuterVolumeSpecName: "config-data") pod "cd8a4816-7cdb-40db-86d6-e56911f783c8" (UID: "cd8a4816-7cdb-40db-86d6-e56911f783c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.162938 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd8a4816-7cdb-40db-86d6-e56911f783c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd8a4816-7cdb-40db-86d6-e56911f783c8" (UID: "cd8a4816-7cdb-40db-86d6-e56911f783c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.197218 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8a4816-7cdb-40db-86d6-e56911f783c8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.197259 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8a4816-7cdb-40db-86d6-e56911f783c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.197275 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbtqx\" (UniqueName: \"kubernetes.io/projected/cd8a4816-7cdb-40db-86d6-e56911f783c8-kube-api-access-vbtqx\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.223693 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.463370 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frr4d\" (UniqueName: \"kubernetes.io/projected/7a9d3e33-541c-429a-83e9-8042cf36b499-kube-api-access-frr4d\") pod \"7a9d3e33-541c-429a-83e9-8042cf36b499\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.463826 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a9d3e33-541c-429a-83e9-8042cf36b499-config-data\") pod \"7a9d3e33-541c-429a-83e9-8042cf36b499\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.463928 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a9d3e33-541c-429a-83e9-8042cf36b499-logs\") pod \"7a9d3e33-541c-429a-83e9-8042cf36b499\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.464195 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a9d3e33-541c-429a-83e9-8042cf36b499-logs" (OuterVolumeSpecName: "logs") pod "7a9d3e33-541c-429a-83e9-8042cf36b499" (UID: "7a9d3e33-541c-429a-83e9-8042cf36b499"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.464362 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a9d3e33-541c-429a-83e9-8042cf36b499-combined-ca-bundle\") pod \"7a9d3e33-541c-429a-83e9-8042cf36b499\" (UID: \"7a9d3e33-541c-429a-83e9-8042cf36b499\") " Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.465270 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a9d3e33-541c-429a-83e9-8042cf36b499-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.466334 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9d3e33-541c-429a-83e9-8042cf36b499-kube-api-access-frr4d" (OuterVolumeSpecName: "kube-api-access-frr4d") pod "7a9d3e33-541c-429a-83e9-8042cf36b499" (UID: "7a9d3e33-541c-429a-83e9-8042cf36b499"). InnerVolumeSpecName "kube-api-access-frr4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.488520 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a9d3e33-541c-429a-83e9-8042cf36b499-config-data" (OuterVolumeSpecName: "config-data") pod "7a9d3e33-541c-429a-83e9-8042cf36b499" (UID: "7a9d3e33-541c-429a-83e9-8042cf36b499"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.491229 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a9d3e33-541c-429a-83e9-8042cf36b499-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a9d3e33-541c-429a-83e9-8042cf36b499" (UID: "7a9d3e33-541c-429a-83e9-8042cf36b499"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.566259 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a9d3e33-541c-429a-83e9-8042cf36b499-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.566297 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a9d3e33-541c-429a-83e9-8042cf36b499-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.566309 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frr4d\" (UniqueName: \"kubernetes.io/projected/7a9d3e33-541c-429a-83e9-8042cf36b499-kube-api-access-frr4d\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.657354 4747 generic.go:334] "Generic (PLEG): container finished" podID="75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a" containerID="48128a43c59fe40eab9735127a94de1219f7001fb84a63e45f5dfbf8e66ad266" exitCode=0 Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.657417 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-85bb5" event={"ID":"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a","Type":"ContainerDied","Data":"48128a43c59fe40eab9735127a94de1219f7001fb84a63e45f5dfbf8e66ad266"} Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.659654 4747 generic.go:334] "Generic (PLEG): container finished" podID="7a9d3e33-541c-429a-83e9-8042cf36b499" containerID="b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79" exitCode=0 Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.659681 4747 generic.go:334] "Generic (PLEG): container finished" podID="7a9d3e33-541c-429a-83e9-8042cf36b499" containerID="f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db" exitCode=143 Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.659715 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a9d3e33-541c-429a-83e9-8042cf36b499","Type":"ContainerDied","Data":"b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79"} Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.659732 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a9d3e33-541c-429a-83e9-8042cf36b499","Type":"ContainerDied","Data":"f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db"} Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.659743 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a9d3e33-541c-429a-83e9-8042cf36b499","Type":"ContainerDied","Data":"7e863db875955ce99adf91371678e8e8754fef673848f64cb366185716f37c02"} Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.659757 4747 scope.go:117] "RemoveContainer" containerID="b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.659854 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.668124 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cd8a4816-7cdb-40db-86d6-e56911f783c8","Type":"ContainerDied","Data":"c6dc3dc04559532daec712a2335d267b6e6fc9e6cc4c79cbac3e5d3b79c344cb"} Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.668226 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.695718 4747 scope.go:117] "RemoveContainer" containerID="f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.706808 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.723734 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.732758 4747 scope.go:117] "RemoveContainer" containerID="b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79" Dec 05 22:19:43 crc kubenswrapper[4747]: E1205 22:19:43.738119 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79\": container with ID starting with b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79 not found: ID does not exist" containerID="b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.738176 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79"} err="failed to get container status \"b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79\": rpc error: code = NotFound desc = could not find container \"b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79\": container with ID starting with b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79 not found: ID does not exist" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.738207 4747 scope.go:117] "RemoveContainer" containerID="f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.742753 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 22:19:43 crc kubenswrapper[4747]: E1205 22:19:43.744324 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db\": container with ID starting with f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db not found: ID does not exist" containerID="f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.744371 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db"} err="failed to get container status \"f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db\": rpc error: code = NotFound desc = could not find container \"f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db\": container with ID 
starting with f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db not found: ID does not exist" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.744401 4747 scope.go:117] "RemoveContainer" containerID="b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.744714 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79"} err="failed to get container status \"b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79\": rpc error: code = NotFound desc = could not find container \"b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79\": container with ID starting with b236897cce48368cdea258db8006cfb8aff59e51aeb896020f7968f20ede3c79 not found: ID does not exist" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.744738 4747 scope.go:117] "RemoveContainer" containerID="f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.744929 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db"} err="failed to get container status \"f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db\": rpc error: code = NotFound desc = could not find container \"f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db\": container with ID starting with f1096c62d1b12c6653eaac55a2676d38a2988bf75c4b7dfc8735fc9b16fcb4db not found: ID does not exist" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.744950 4747 scope.go:117] "RemoveContainer" containerID="4cc4b23e304acdc48857ee8561556697fc952931ae71b83ce9c38e5ee82da8d0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.754648 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.763908 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:19:43 crc kubenswrapper[4747]: E1205 22:19:43.764418 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9d3e33-541c-429a-83e9-8042cf36b499" containerName="nova-metadata-metadata" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.764437 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9d3e33-541c-429a-83e9-8042cf36b499" containerName="nova-metadata-metadata" Dec 05 22:19:43 crc kubenswrapper[4747]: E1205 22:19:43.764456 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9d3e33-541c-429a-83e9-8042cf36b499" containerName="nova-metadata-log" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.764465 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9d3e33-541c-429a-83e9-8042cf36b499" containerName="nova-metadata-log" Dec 05 22:19:43 crc kubenswrapper[4747]: E1205 22:19:43.764481 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd8a4816-7cdb-40db-86d6-e56911f783c8" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.764489 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd8a4816-7cdb-40db-86d6-e56911f783c8" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.764719 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd8a4816-7cdb-40db-86d6-e56911f783c8" 
containerName="nova-cell1-novncproxy-novncproxy" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.764748 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a9d3e33-541c-429a-83e9-8042cf36b499" containerName="nova-metadata-log" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.764767 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a9d3e33-541c-429a-83e9-8042cf36b499" containerName="nova-metadata-metadata" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.766048 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.768760 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.769018 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.769787 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.769943 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.770135 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-config-data\") pod \"nova-metadata-0\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.770160 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89ef1056-80e8-45d4-9476-b4af61ad3fcc-logs\") pod \"nova-metadata-0\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.770184 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzhb9\" (UniqueName: \"kubernetes.io/projected/89ef1056-80e8-45d4-9476-b4af61ad3fcc-kube-api-access-kzhb9\") pod \"nova-metadata-0\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.775886 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.777206 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.778988 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.779141 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.779361 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.789546 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.794743 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.850857 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a9d3e33-541c-429a-83e9-8042cf36b499" path="/var/lib/kubelet/pods/7a9d3e33-541c-429a-83e9-8042cf36b499/volumes" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.851418 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd8a4816-7cdb-40db-86d6-e56911f783c8" path="/var/lib/kubelet/pods/cd8a4816-7cdb-40db-86d6-e56911f783c8/volumes" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.872082 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f738e79-5b6d-4d71-b4c0-44f166659c2d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f738e79-5b6d-4d71-b4c0-44f166659c2d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.872135 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f738e79-5b6d-4d71-b4c0-44f166659c2d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f738e79-5b6d-4d71-b4c0-44f166659c2d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.872159 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f738e79-5b6d-4d71-b4c0-44f166659c2d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f738e79-5b6d-4d71-b4c0-44f166659c2d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.872193 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.872221 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f738e79-5b6d-4d71-b4c0-44f166659c2d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f738e79-5b6d-4d71-b4c0-44f166659c2d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.872248 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-dm2dr\" (UniqueName: \"kubernetes.io/projected/4f738e79-5b6d-4d71-b4c0-44f166659c2d-kube-api-access-dm2dr\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f738e79-5b6d-4d71-b4c0-44f166659c2d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.872333 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-config-data\") pod \"nova-metadata-0\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.872353 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89ef1056-80e8-45d4-9476-b4af61ad3fcc-logs\") pod \"nova-metadata-0\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.872371 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzhb9\" (UniqueName: \"kubernetes.io/projected/89ef1056-80e8-45d4-9476-b4af61ad3fcc-kube-api-access-kzhb9\") pod \"nova-metadata-0\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.872425 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.873151 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89ef1056-80e8-45d4-9476-b4af61ad3fcc-logs\") pod \"nova-metadata-0\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.876010 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.876775 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-config-data\") pod \"nova-metadata-0\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.877414 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.897422 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzhb9\" (UniqueName: \"kubernetes.io/projected/89ef1056-80e8-45d4-9476-b4af61ad3fcc-kube-api-access-kzhb9\") pod \"nova-metadata-0\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " pod="openstack/nova-metadata-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.974786 
4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f738e79-5b6d-4d71-b4c0-44f166659c2d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f738e79-5b6d-4d71-b4c0-44f166659c2d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.974857 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f738e79-5b6d-4d71-b4c0-44f166659c2d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f738e79-5b6d-4d71-b4c0-44f166659c2d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.974887 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f738e79-5b6d-4d71-b4c0-44f166659c2d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f738e79-5b6d-4d71-b4c0-44f166659c2d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.974957 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f738e79-5b6d-4d71-b4c0-44f166659c2d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f738e79-5b6d-4d71-b4c0-44f166659c2d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.974996 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm2dr\" (UniqueName: \"kubernetes.io/projected/4f738e79-5b6d-4d71-b4c0-44f166659c2d-kube-api-access-dm2dr\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f738e79-5b6d-4d71-b4c0-44f166659c2d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.978398 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f738e79-5b6d-4d71-b4c0-44f166659c2d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f738e79-5b6d-4d71-b4c0-44f166659c2d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.979032 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f738e79-5b6d-4d71-b4c0-44f166659c2d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f738e79-5b6d-4d71-b4c0-44f166659c2d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.979228 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f738e79-5b6d-4d71-b4c0-44f166659c2d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f738e79-5b6d-4d71-b4c0-44f166659c2d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:43 crc kubenswrapper[4747]: I1205 22:19:43.979655 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f738e79-5b6d-4d71-b4c0-44f166659c2d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f738e79-5b6d-4d71-b4c0-44f166659c2d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:44 crc kubenswrapper[4747]: I1205 22:19:44.004431 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm2dr\" (UniqueName: 
\"kubernetes.io/projected/4f738e79-5b6d-4d71-b4c0-44f166659c2d-kube-api-access-dm2dr\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f738e79-5b6d-4d71-b4c0-44f166659c2d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:44 crc kubenswrapper[4747]: I1205 22:19:44.116688 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 22:19:44 crc kubenswrapper[4747]: I1205 22:19:44.129902 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:44 crc kubenswrapper[4747]: I1205 22:19:44.578210 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:19:44 crc kubenswrapper[4747]: W1205 22:19:44.587902 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89ef1056_80e8_45d4_9476_b4af61ad3fcc.slice/crio-cf675cad9910fbef201c13a410200830303b32164deb9b52bac8db347bf6ddcd WatchSource:0}: Error finding container cf675cad9910fbef201c13a410200830303b32164deb9b52bac8db347bf6ddcd: Status 404 returned error can't find the container with id cf675cad9910fbef201c13a410200830303b32164deb9b52bac8db347bf6ddcd Dec 05 22:19:44 crc kubenswrapper[4747]: I1205 22:19:44.649795 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 22:19:44 crc kubenswrapper[4747]: W1205 22:19:44.651334 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f738e79_5b6d_4d71_b4c0_44f166659c2d.slice/crio-523bfafa2cdfcc7c7af3c53e5df5a6fe5857da851adbfcd1be9913050a86c9eb WatchSource:0}: Error finding container 523bfafa2cdfcc7c7af3c53e5df5a6fe5857da851adbfcd1be9913050a86c9eb: Status 404 returned error can't find the container with id 523bfafa2cdfcc7c7af3c53e5df5a6fe5857da851adbfcd1be9913050a86c9eb Dec 05 22:19:44 crc kubenswrapper[4747]: I1205 22:19:44.680109 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4f738e79-5b6d-4d71-b4c0-44f166659c2d","Type":"ContainerStarted","Data":"523bfafa2cdfcc7c7af3c53e5df5a6fe5857da851adbfcd1be9913050a86c9eb"} Dec 05 22:19:44 crc kubenswrapper[4747]: I1205 22:19:44.696204 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"89ef1056-80e8-45d4-9476-b4af61ad3fcc","Type":"ContainerStarted","Data":"cf675cad9910fbef201c13a410200830303b32164deb9b52bac8db347bf6ddcd"} Dec 05 22:19:44 crc kubenswrapper[4747]: I1205 22:19:44.698187 4747 generic.go:334] "Generic (PLEG): container finished" podID="d8688ad0-be6e-4597-9557-5c9405b2c2a8" containerID="14af5fea711bf2cb847f9d0ce29f78277e6069d1808621cbdd0445faaed4b373" exitCode=0 Dec 05 22:19:44 crc kubenswrapper[4747]: I1205 22:19:44.698258 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k8mwz" event={"ID":"d8688ad0-be6e-4597-9557-5c9405b2c2a8","Type":"ContainerDied","Data":"14af5fea711bf2cb847f9d0ce29f78277e6069d1808621cbdd0445faaed4b373"} Dec 05 22:19:44 crc kubenswrapper[4747]: I1205 22:19:44.936780 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.097327 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-scripts\") pod \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.098199 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-combined-ca-bundle\") pod \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.098281 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8lwc\" (UniqueName: \"kubernetes.io/projected/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-kube-api-access-m8lwc\") pod \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.098447 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-config-data\") pod \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\" (UID: \"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a\") " Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.102761 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-scripts" (OuterVolumeSpecName: "scripts") pod "75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a" (UID: "75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.102908 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-kube-api-access-m8lwc" (OuterVolumeSpecName: "kube-api-access-m8lwc") pod "75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a" (UID: "75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a"). InnerVolumeSpecName "kube-api-access-m8lwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.134847 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-config-data" (OuterVolumeSpecName: "config-data") pod "75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a" (UID: "75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.135489 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a" (UID: "75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.201077 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.201132 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8lwc\" (UniqueName: \"kubernetes.io/projected/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-kube-api-access-m8lwc\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.201154 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.201172 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.717652 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"89ef1056-80e8-45d4-9476-b4af61ad3fcc","Type":"ContainerStarted","Data":"1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a"} Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.717866 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"89ef1056-80e8-45d4-9476-b4af61ad3fcc","Type":"ContainerStarted","Data":"2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233"} Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.727679 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-85bb5" event={"ID":"75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a","Type":"ContainerDied","Data":"a55181b3fabba8784e383988f399f46ffc9327e9befbb5c43f50d3c11a2b11ad"} Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.727918 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a55181b3fabba8784e383988f399f46ffc9327e9befbb5c43f50d3c11a2b11ad" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.728023 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-85bb5" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.743216 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4f738e79-5b6d-4d71-b4c0-44f166659c2d","Type":"ContainerStarted","Data":"64f663822d7d1b7fa1ee3c3e204888f258fd40b4eb67808172579bacc6c93a0a"} Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.752136 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.7521094 podStartE2EDuration="2.7521094s" podCreationTimestamp="2025-12-05 22:19:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:19:45.744195014 +0000 UTC m=+5856.211502502" watchObservedRunningTime="2025-12-05 22:19:45.7521094 +0000 UTC m=+5856.219416888" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.778394 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.778377171 podStartE2EDuration="2.778377171s" podCreationTimestamp="2025-12-05 22:19:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:19:45.767123602 +0000 UTC m=+5856.234431080" watchObservedRunningTime="2025-12-05 22:19:45.778377171 +0000 UTC m=+5856.245684659" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.803072 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 22:19:45 crc kubenswrapper[4747]: E1205 22:19:45.803712 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a" containerName="nova-cell1-conductor-db-sync" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.803728 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a" containerName="nova-cell1-conductor-db-sync" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.803896 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a" containerName="nova-cell1-conductor-db-sync" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.804520 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.807036 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.811698 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.825077 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45rtp\" (UniqueName: \"kubernetes.io/projected/2d56e61d-3374-4b9b-8062-98ece9f4cb96-kube-api-access-45rtp\") pod \"nova-cell1-conductor-0\" (UID: \"2d56e61d-3374-4b9b-8062-98ece9f4cb96\") " pod="openstack/nova-cell1-conductor-0" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.825132 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d56e61d-3374-4b9b-8062-98ece9f4cb96-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2d56e61d-3374-4b9b-8062-98ece9f4cb96\") " pod="openstack/nova-cell1-conductor-0" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.825244 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d56e61d-3374-4b9b-8062-98ece9f4cb96-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2d56e61d-3374-4b9b-8062-98ece9f4cb96\") " pod="openstack/nova-cell1-conductor-0" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.927905 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45rtp\" (UniqueName: \"kubernetes.io/projected/2d56e61d-3374-4b9b-8062-98ece9f4cb96-kube-api-access-45rtp\") pod \"nova-cell1-conductor-0\" (UID: \"2d56e61d-3374-4b9b-8062-98ece9f4cb96\") " pod="openstack/nova-cell1-conductor-0" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.927958 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d56e61d-3374-4b9b-8062-98ece9f4cb96-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2d56e61d-3374-4b9b-8062-98ece9f4cb96\") " pod="openstack/nova-cell1-conductor-0" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.927987 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d56e61d-3374-4b9b-8062-98ece9f4cb96-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2d56e61d-3374-4b9b-8062-98ece9f4cb96\") " pod="openstack/nova-cell1-conductor-0" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.934813 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d56e61d-3374-4b9b-8062-98ece9f4cb96-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2d56e61d-3374-4b9b-8062-98ece9f4cb96\") " pod="openstack/nova-cell1-conductor-0" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.936928 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d56e61d-3374-4b9b-8062-98ece9f4cb96-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2d56e61d-3374-4b9b-8062-98ece9f4cb96\") " pod="openstack/nova-cell1-conductor-0" Dec 05 22:19:45 crc kubenswrapper[4747]: I1205 22:19:45.943669 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45rtp\" (UniqueName: \"kubernetes.io/projected/2d56e61d-3374-4b9b-8062-98ece9f4cb96-kube-api-access-45rtp\") pod \"nova-cell1-conductor-0\" (UID: \"2d56e61d-3374-4b9b-8062-98ece9f4cb96\") " pod="openstack/nova-cell1-conductor-0" Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.136697 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.140032 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.335512 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-config-data\") pod \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.335657 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-combined-ca-bundle\") pod \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.335943 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-scripts\") pod \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.337102 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwtbn\" (UniqueName: \"kubernetes.io/projected/d8688ad0-be6e-4597-9557-5c9405b2c2a8-kube-api-access-fwtbn\") pod \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\" (UID: \"d8688ad0-be6e-4597-9557-5c9405b2c2a8\") " Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.342253 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-scripts" (OuterVolumeSpecName: "scripts") pod "d8688ad0-be6e-4597-9557-5c9405b2c2a8" (UID: "d8688ad0-be6e-4597-9557-5c9405b2c2a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.343018 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8688ad0-be6e-4597-9557-5c9405b2c2a8-kube-api-access-fwtbn" (OuterVolumeSpecName: "kube-api-access-fwtbn") pod "d8688ad0-be6e-4597-9557-5c9405b2c2a8" (UID: "d8688ad0-be6e-4597-9557-5c9405b2c2a8"). InnerVolumeSpecName "kube-api-access-fwtbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.381467 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-config-data" (OuterVolumeSpecName: "config-data") pod "d8688ad0-be6e-4597-9557-5c9405b2c2a8" (UID: "d8688ad0-be6e-4597-9557-5c9405b2c2a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.395759 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8688ad0-be6e-4597-9557-5c9405b2c2a8" (UID: "d8688ad0-be6e-4597-9557-5c9405b2c2a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.398730 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 22:19:46 crc kubenswrapper[4747]: W1205 22:19:46.403849 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d56e61d_3374_4b9b_8062_98ece9f4cb96.slice/crio-582837c99b3b5df5c9c0d3cda74b5facfd46862decac69e120f70096282b2107 WatchSource:0}: Error finding container 582837c99b3b5df5c9c0d3cda74b5facfd46862decac69e120f70096282b2107: Status 404 returned error can't find the container with id 582837c99b3b5df5c9c0d3cda74b5facfd46862decac69e120f70096282b2107 Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.439910 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.439941 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwtbn\" (UniqueName: \"kubernetes.io/projected/d8688ad0-be6e-4597-9557-5c9405b2c2a8-kube-api-access-fwtbn\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.439951 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.439960 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8688ad0-be6e-4597-9557-5c9405b2c2a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.750783 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2d56e61d-3374-4b9b-8062-98ece9f4cb96","Type":"ContainerStarted","Data":"dc552af861c97729dd8b2077f62a76f515c5f5c22ec0458cd9cd5b65f640c825"} Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.751281 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.751310 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2d56e61d-3374-4b9b-8062-98ece9f4cb96","Type":"ContainerStarted","Data":"582837c99b3b5df5c9c0d3cda74b5facfd46862decac69e120f70096282b2107"} Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.752889 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k8mwz" event={"ID":"d8688ad0-be6e-4597-9557-5c9405b2c2a8","Type":"ContainerDied","Data":"19a375b15b7ac3c697816aa7c30959e7d2aafaa1466cc08e3bf57551e9867a43"} Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.752970 4747 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="19a375b15b7ac3c697816aa7c30959e7d2aafaa1466cc08e3bf57551e9867a43" Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.753108 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k8mwz" Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.776169 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.776152849 podStartE2EDuration="1.776152849s" podCreationTimestamp="2025-12-05 22:19:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:19:46.775063772 +0000 UTC m=+5857.242371260" watchObservedRunningTime="2025-12-05 22:19:46.776152849 +0000 UTC m=+5857.243460337" Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.973150 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.973602 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="588f49ba-66cf-4855-a84b-341e5b7b7b78" containerName="nova-api-api" containerID="cri-o://5f15be8984645f526ff4a9244bb16a7027464c168ef16c84d57b3ff473de2fe4" gracePeriod=30 Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.973746 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="588f49ba-66cf-4855-a84b-341e5b7b7b78" containerName="nova-api-log" containerID="cri-o://56ddedcf05ab31e2e458fc56d7333e1c6f41cedd15c427f2e1724672f8c9ecca" gracePeriod=30 Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.989612 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 22:19:46 crc kubenswrapper[4747]: I1205 22:19:46.989901 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e56368a4-3747-4184-ade6-e1224146cdb2" containerName="nova-scheduler-scheduler" containerID="cri-o://5e46c6e687155dc6d0c77d64d2ce50da7424e67938411267722bd9677624b3a0" gracePeriod=30 Dec 05 22:19:47 crc kubenswrapper[4747]: I1205 22:19:47.008682 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:19:47 crc kubenswrapper[4747]: I1205 22:19:47.766555 4747 generic.go:334] "Generic (PLEG): container finished" podID="588f49ba-66cf-4855-a84b-341e5b7b7b78" containerID="5f15be8984645f526ff4a9244bb16a7027464c168ef16c84d57b3ff473de2fe4" exitCode=0 Dec 05 22:19:47 crc kubenswrapper[4747]: I1205 22:19:47.766791 4747 generic.go:334] "Generic (PLEG): container finished" podID="588f49ba-66cf-4855-a84b-341e5b7b7b78" containerID="56ddedcf05ab31e2e458fc56d7333e1c6f41cedd15c427f2e1724672f8c9ecca" exitCode=143 Dec 05 22:19:47 crc kubenswrapper[4747]: I1205 22:19:47.767369 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"588f49ba-66cf-4855-a84b-341e5b7b7b78","Type":"ContainerDied","Data":"5f15be8984645f526ff4a9244bb16a7027464c168ef16c84d57b3ff473de2fe4"} Dec 05 22:19:47 crc kubenswrapper[4747]: I1205 22:19:47.767391 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"588f49ba-66cf-4855-a84b-341e5b7b7b78","Type":"ContainerDied","Data":"56ddedcf05ab31e2e458fc56d7333e1c6f41cedd15c427f2e1724672f8c9ecca"} Dec 05 22:19:47 crc kubenswrapper[4747]: I1205 22:19:47.767514 4747 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/nova-metadata-0" podUID="89ef1056-80e8-45d4-9476-b4af61ad3fcc" containerName="nova-metadata-log" containerID="cri-o://2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233" gracePeriod=30 Dec 05 22:19:47 crc kubenswrapper[4747]: I1205 22:19:47.768042 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="89ef1056-80e8-45d4-9476-b4af61ad3fcc" containerName="nova-metadata-metadata" containerID="cri-o://1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a" gracePeriod=30 Dec 05 22:19:47 crc kubenswrapper[4747]: I1205 22:19:47.921018 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 22:19:47 crc kubenswrapper[4747]: I1205 22:19:47.991876 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588f49ba-66cf-4855-a84b-341e5b7b7b78-config-data\") pod \"588f49ba-66cf-4855-a84b-341e5b7b7b78\" (UID: \"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " Dec 05 22:19:47 crc kubenswrapper[4747]: I1205 22:19:47.992044 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588f49ba-66cf-4855-a84b-341e5b7b7b78-combined-ca-bundle\") pod \"588f49ba-66cf-4855-a84b-341e5b7b7b78\" (UID: \"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " Dec 05 22:19:47 crc kubenswrapper[4747]: I1205 22:19:47.992132 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkfp9\" (UniqueName: \"kubernetes.io/projected/588f49ba-66cf-4855-a84b-341e5b7b7b78-kube-api-access-mkfp9\") pod \"588f49ba-66cf-4855-a84b-341e5b7b7b78\" (UID: \"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " Dec 05 22:19:47 crc kubenswrapper[4747]: I1205 22:19:47.992185 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/588f49ba-66cf-4855-a84b-341e5b7b7b78-logs\") pod \"588f49ba-66cf-4855-a84b-341e5b7b7b78\" (UID: \"588f49ba-66cf-4855-a84b-341e5b7b7b78\") " Dec 05 22:19:47 crc kubenswrapper[4747]: I1205 22:19:47.992858 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/588f49ba-66cf-4855-a84b-341e5b7b7b78-logs" (OuterVolumeSpecName: "logs") pod "588f49ba-66cf-4855-a84b-341e5b7b7b78" (UID: "588f49ba-66cf-4855-a84b-341e5b7b7b78"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:19:47 crc kubenswrapper[4747]: I1205 22:19:47.993142 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/588f49ba-66cf-4855-a84b-341e5b7b7b78-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:47 crc kubenswrapper[4747]: I1205 22:19:47.997921 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/588f49ba-66cf-4855-a84b-341e5b7b7b78-kube-api-access-mkfp9" (OuterVolumeSpecName: "kube-api-access-mkfp9") pod "588f49ba-66cf-4855-a84b-341e5b7b7b78" (UID: "588f49ba-66cf-4855-a84b-341e5b7b7b78"). InnerVolumeSpecName "kube-api-access-mkfp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.040994 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588f49ba-66cf-4855-a84b-341e5b7b7b78-config-data" (OuterVolumeSpecName: "config-data") pod "588f49ba-66cf-4855-a84b-341e5b7b7b78" (UID: "588f49ba-66cf-4855-a84b-341e5b7b7b78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.055818 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588f49ba-66cf-4855-a84b-341e5b7b7b78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "588f49ba-66cf-4855-a84b-341e5b7b7b78" (UID: "588f49ba-66cf-4855-a84b-341e5b7b7b78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.097058 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588f49ba-66cf-4855-a84b-341e5b7b7b78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.097097 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkfp9\" (UniqueName: \"kubernetes.io/projected/588f49ba-66cf-4855-a84b-341e5b7b7b78-kube-api-access-mkfp9\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.097113 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588f49ba-66cf-4855-a84b-341e5b7b7b78-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.272333 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.299788 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-combined-ca-bundle\") pod \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.299882 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-config-data\") pod \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.299913 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-nova-metadata-tls-certs\") pod \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.300044 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89ef1056-80e8-45d4-9476-b4af61ad3fcc-logs\") pod \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.300080 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzhb9\" (UniqueName: \"kubernetes.io/projected/89ef1056-80e8-45d4-9476-b4af61ad3fcc-kube-api-access-kzhb9\") pod \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\" (UID: \"89ef1056-80e8-45d4-9476-b4af61ad3fcc\") " Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.300952 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ef1056-80e8-45d4-9476-b4af61ad3fcc-logs" (OuterVolumeSpecName: "logs") pod "89ef1056-80e8-45d4-9476-b4af61ad3fcc" (UID: "89ef1056-80e8-45d4-9476-b4af61ad3fcc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.304622 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ef1056-80e8-45d4-9476-b4af61ad3fcc-kube-api-access-kzhb9" (OuterVolumeSpecName: "kube-api-access-kzhb9") pod "89ef1056-80e8-45d4-9476-b4af61ad3fcc" (UID: "89ef1056-80e8-45d4-9476-b4af61ad3fcc"). InnerVolumeSpecName "kube-api-access-kzhb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.327245 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-config-data" (OuterVolumeSpecName: "config-data") pod "89ef1056-80e8-45d4-9476-b4af61ad3fcc" (UID: "89ef1056-80e8-45d4-9476-b4af61ad3fcc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.347230 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "89ef1056-80e8-45d4-9476-b4af61ad3fcc" (UID: "89ef1056-80e8-45d4-9476-b4af61ad3fcc"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.347978 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89ef1056-80e8-45d4-9476-b4af61ad3fcc" (UID: "89ef1056-80e8-45d4-9476-b4af61ad3fcc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.401989 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.402032 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.402044 4747 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ef1056-80e8-45d4-9476-b4af61ad3fcc-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.402059 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89ef1056-80e8-45d4-9476-b4af61ad3fcc-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.402071 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzhb9\" (UniqueName: \"kubernetes.io/projected/89ef1056-80e8-45d4-9476-b4af61ad3fcc-kube-api-access-kzhb9\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.449722 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.510528 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-765569f867-gzd9x"] Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.510798 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-765569f867-gzd9x" podUID="072d8f67-dbec-4764-8817-7948bb98f600" containerName="dnsmasq-dns" containerID="cri-o://ebdf0dc680dd9ce06ed465a2f78ea69d3861f395c0d634104d76f1da822c9aa7" gracePeriod=10 Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.777362 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"588f49ba-66cf-4855-a84b-341e5b7b7b78","Type":"ContainerDied","Data":"1868f13456b8bdbd7148532588bf3314cd343f35725c47dfa3a3657ae2927df6"} Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.777407 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.777436 4747 scope.go:117] "RemoveContainer" containerID="5f15be8984645f526ff4a9244bb16a7027464c168ef16c84d57b3ff473de2fe4" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.782572 4747 generic.go:334] "Generic (PLEG): container finished" podID="072d8f67-dbec-4764-8817-7948bb98f600" containerID="ebdf0dc680dd9ce06ed465a2f78ea69d3861f395c0d634104d76f1da822c9aa7" exitCode=0 Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.782621 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-765569f867-gzd9x" event={"ID":"072d8f67-dbec-4764-8817-7948bb98f600","Type":"ContainerDied","Data":"ebdf0dc680dd9ce06ed465a2f78ea69d3861f395c0d634104d76f1da822c9aa7"} Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.788657 4747 generic.go:334] "Generic (PLEG): container finished" podID="89ef1056-80e8-45d4-9476-b4af61ad3fcc" containerID="1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a" exitCode=0 Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.788712 4747 generic.go:334] "Generic (PLEG): container finished" podID="89ef1056-80e8-45d4-9476-b4af61ad3fcc" containerID="2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233" exitCode=143 Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.788734 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"89ef1056-80e8-45d4-9476-b4af61ad3fcc","Type":"ContainerDied","Data":"1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a"} Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.788783 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"89ef1056-80e8-45d4-9476-b4af61ad3fcc","Type":"ContainerDied","Data":"2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233"} Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.788794 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"89ef1056-80e8-45d4-9476-b4af61ad3fcc","Type":"ContainerDied","Data":"cf675cad9910fbef201c13a410200830303b32164deb9b52bac8db347bf6ddcd"} Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.788878 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.797802 4747 scope.go:117] "RemoveContainer" containerID="56ddedcf05ab31e2e458fc56d7333e1c6f41cedd15c427f2e1724672f8c9ecca" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.867073 4747 scope.go:117] "RemoveContainer" containerID="1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.883942 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.901734 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.910487 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.912850 4747 scope.go:117] "RemoveContainer" containerID="2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.920408 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.930669 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 22:19:48 crc kubenswrapper[4747]: E1205 22:19:48.931159 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="588f49ba-66cf-4855-a84b-341e5b7b7b78" containerName="nova-api-api" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.931183 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="588f49ba-66cf-4855-a84b-341e5b7b7b78" containerName="nova-api-api" Dec 05 22:19:48 crc kubenswrapper[4747]: E1205 22:19:48.931219 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="588f49ba-66cf-4855-a84b-341e5b7b7b78" containerName="nova-api-log" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.931228 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="588f49ba-66cf-4855-a84b-341e5b7b7b78" containerName="nova-api-log" Dec 05 22:19:48 crc kubenswrapper[4747]: E1205 22:19:48.931245 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ef1056-80e8-45d4-9476-b4af61ad3fcc" containerName="nova-metadata-log" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.931254 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ef1056-80e8-45d4-9476-b4af61ad3fcc" containerName="nova-metadata-log" Dec 05 22:19:48 crc kubenswrapper[4747]: E1205 22:19:48.931279 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8688ad0-be6e-4597-9557-5c9405b2c2a8" containerName="nova-manage" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.931288 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8688ad0-be6e-4597-9557-5c9405b2c2a8" containerName="nova-manage" Dec 05 22:19:48 crc kubenswrapper[4747]: E1205 22:19:48.931301 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ef1056-80e8-45d4-9476-b4af61ad3fcc" containerName="nova-metadata-metadata" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.931309 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ef1056-80e8-45d4-9476-b4af61ad3fcc" containerName="nova-metadata-metadata" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.931550 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="588f49ba-66cf-4855-a84b-341e5b7b7b78" containerName="nova-api-log" Dec 05 22:19:48 crc kubenswrapper[4747]: 
I1205 22:19:48.931595 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="89ef1056-80e8-45d4-9476-b4af61ad3fcc" containerName="nova-metadata-log" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.931628 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="588f49ba-66cf-4855-a84b-341e5b7b7b78" containerName="nova-api-api" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.931639 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="89ef1056-80e8-45d4-9476-b4af61ad3fcc" containerName="nova-metadata-metadata" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.931647 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8688ad0-be6e-4597-9557-5c9405b2c2a8" containerName="nova-manage" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.932914 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.936077 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.965679 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.968612 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.972960 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.973198 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.978393 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 22:19:48 crc kubenswrapper[4747]: I1205 22:19:48.987753 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.014651 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-logs\") pod \"nova-api-0\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " pod="openstack/nova-api-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.014715 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmsl5\" (UniqueName: \"kubernetes.io/projected/6ac31591-f652-49ad-9ccf-295d0aa5c514-kube-api-access-qmsl5\") pod \"nova-metadata-0\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " pod="openstack/nova-metadata-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.014741 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac31591-f652-49ad-9ccf-295d0aa5c514-logs\") pod \"nova-metadata-0\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " pod="openstack/nova-metadata-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.014773 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " pod="openstack/nova-api-0" 
Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.014819 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-config-data\") pod \"nova-metadata-0\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " pod="openstack/nova-metadata-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.014847 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btrbb\" (UniqueName: \"kubernetes.io/projected/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-kube-api-access-btrbb\") pod \"nova-api-0\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " pod="openstack/nova-api-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.014863 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-config-data\") pod \"nova-api-0\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " pod="openstack/nova-api-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.015021 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " pod="openstack/nova-metadata-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.015113 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " pod="openstack/nova-metadata-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.044562 4747 scope.go:117] "RemoveContainer" containerID="1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a" Dec 05 22:19:49 crc kubenswrapper[4747]: E1205 22:19:49.046126 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a\": container with ID starting with 1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a not found: ID does not exist" containerID="1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.046158 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a"} err="failed to get container status \"1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a\": rpc error: code = NotFound desc = could not find container \"1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a\": container with ID starting with 1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a not found: ID does not exist" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.046178 4747 scope.go:117] "RemoveContainer" containerID="2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.046772 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:19:49 crc kubenswrapper[4747]: E1205 22:19:49.047698 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233\": container with ID starting with 2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233 not found: ID does not exist" containerID="2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.047728 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233"} err="failed to get container status \"2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233\": rpc error: code = NotFound desc = could not find container \"2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233\": container with ID starting with 2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233 not found: ID does not exist" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.047751 4747 scope.go:117] "RemoveContainer" containerID="1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.048327 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a"} err="failed to get container status \"1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a\": rpc error: code = NotFound desc = could not find container \"1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a\": container with ID starting with 1e171c71f87161402f1c348b6c67fa1449891ee71ace0c63bc2272a1c003856a not found: ID does not exist" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.048393 4747 scope.go:117] "RemoveContainer" containerID="2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.048673 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233"} err="failed to get container status \"2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233\": rpc error: code = NotFound desc = could not find container \"2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233\": container with ID starting with 2228ba0e6031d0c33167b749edac72460fbe5fca308ad971ab64f5dd6c033233 not found: ID does not exist" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.116057 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-ovsdbserver-nb\") pod \"072d8f67-dbec-4764-8817-7948bb98f600\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.116448 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-dns-svc\") pod \"072d8f67-dbec-4764-8817-7948bb98f600\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.116497 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-ovsdbserver-sb\") pod \"072d8f67-dbec-4764-8817-7948bb98f600\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.116527 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-config\") pod \"072d8f67-dbec-4764-8817-7948bb98f600\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.116634 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cggcg\" (UniqueName: \"kubernetes.io/projected/072d8f67-dbec-4764-8817-7948bb98f600-kube-api-access-cggcg\") pod \"072d8f67-dbec-4764-8817-7948bb98f600\" (UID: \"072d8f67-dbec-4764-8817-7948bb98f600\") " Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.116864 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-logs\") pod \"nova-api-0\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " pod="openstack/nova-api-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.116905 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmsl5\" (UniqueName: \"kubernetes.io/projected/6ac31591-f652-49ad-9ccf-295d0aa5c514-kube-api-access-qmsl5\") pod \"nova-metadata-0\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " pod="openstack/nova-metadata-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.116930 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac31591-f652-49ad-9ccf-295d0aa5c514-logs\") pod \"nova-metadata-0\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " pod="openstack/nova-metadata-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.116960 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " pod="openstack/nova-api-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.117021 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-config-data\") pod \"nova-metadata-0\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " pod="openstack/nova-metadata-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.117054 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btrbb\" (UniqueName: \"kubernetes.io/projected/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-kube-api-access-btrbb\") pod \"nova-api-0\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " pod="openstack/nova-api-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.117075 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-config-data\") pod \"nova-api-0\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " pod="openstack/nova-api-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.117120 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " pod="openstack/nova-metadata-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.117168 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " pod="openstack/nova-metadata-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.117907 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac31591-f652-49ad-9ccf-295d0aa5c514-logs\") pod \"nova-metadata-0\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " pod="openstack/nova-metadata-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.118223 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-logs\") pod \"nova-api-0\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " pod="openstack/nova-api-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.131003 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.137622 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " pod="openstack/nova-metadata-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.137653 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-config-data\") pod \"nova-api-0\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " pod="openstack/nova-api-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.138272 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " pod="openstack/nova-metadata-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.139411 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmsl5\" (UniqueName: \"kubernetes.io/projected/6ac31591-f652-49ad-9ccf-295d0aa5c514-kube-api-access-qmsl5\") pod \"nova-metadata-0\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " pod="openstack/nova-metadata-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.154163 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btrbb\" (UniqueName: \"kubernetes.io/projected/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-kube-api-access-btrbb\") pod \"nova-api-0\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " pod="openstack/nova-api-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.153997 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-config-data\") pod \"nova-metadata-0\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " pod="openstack/nova-metadata-0" Dec 05 22:19:49 
crc kubenswrapper[4747]: I1205 22:19:49.156128 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " pod="openstack/nova-api-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.166956 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/072d8f67-dbec-4764-8817-7948bb98f600-kube-api-access-cggcg" (OuterVolumeSpecName: "kube-api-access-cggcg") pod "072d8f67-dbec-4764-8817-7948bb98f600" (UID: "072d8f67-dbec-4764-8817-7948bb98f600"). InnerVolumeSpecName "kube-api-access-cggcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.175485 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "072d8f67-dbec-4764-8817-7948bb98f600" (UID: "072d8f67-dbec-4764-8817-7948bb98f600"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.180211 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-config" (OuterVolumeSpecName: "config") pod "072d8f67-dbec-4764-8817-7948bb98f600" (UID: "072d8f67-dbec-4764-8817-7948bb98f600"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.195635 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "072d8f67-dbec-4764-8817-7948bb98f600" (UID: "072d8f67-dbec-4764-8817-7948bb98f600"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.217798 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "072d8f67-dbec-4764-8817-7948bb98f600" (UID: "072d8f67-dbec-4764-8817-7948bb98f600"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.219322 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cggcg\" (UniqueName: \"kubernetes.io/projected/072d8f67-dbec-4764-8817-7948bb98f600-kube-api-access-cggcg\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.219617 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.219752 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.219780 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.219810 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/072d8f67-dbec-4764-8817-7948bb98f600-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.338899 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.355440 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.799420 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-765569f867-gzd9x" event={"ID":"072d8f67-dbec-4764-8817-7948bb98f600","Type":"ContainerDied","Data":"becf81963b7c57ef7138be5a658257e225ecf799e3a8b22c31c074750c9f8cf2"} Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.799477 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-765569f867-gzd9x" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.799771 4747 scope.go:117] "RemoveContainer" containerID="ebdf0dc680dd9ce06ed465a2f78ea69d3861f395c0d634104d76f1da822c9aa7" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.840059 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:19:49 crc kubenswrapper[4747]: E1205 22:19:49.842660 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.851332 4747 scope.go:117] "RemoveContainer" containerID="d6f0b8fdfb1f11e78b7cb7cf26ebc2b9578d47f20d98361caf5058f554f30cf4" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.857128 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="588f49ba-66cf-4855-a84b-341e5b7b7b78" path="/var/lib/kubelet/pods/588f49ba-66cf-4855-a84b-341e5b7b7b78/volumes" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.858342 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89ef1056-80e8-45d4-9476-b4af61ad3fcc" path="/var/lib/kubelet/pods/89ef1056-80e8-45d4-9476-b4af61ad3fcc/volumes" Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.859472 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-765569f867-gzd9x"] Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.859562 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-765569f867-gzd9x"] Dec 05 22:19:49 crc kubenswrapper[4747]: W1205 22:19:49.868376 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2db2f3d4_f3b8_4d8f_b403_cb89f9d50628.slice/crio-19cbae013e198c15e16858c12e9518eb724ed0bf1b9ab56fc27ffb774af37f76 WatchSource:0}: Error finding container 19cbae013e198c15e16858c12e9518eb724ed0bf1b9ab56fc27ffb774af37f76: Status 404 returned error can't find the container with id 19cbae013e198c15e16858c12e9518eb724ed0bf1b9ab56fc27ffb774af37f76 Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.902665 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 22:19:49 crc kubenswrapper[4747]: I1205 22:19:49.944135 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:19:50 crc kubenswrapper[4747]: I1205 22:19:50.833195 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628","Type":"ContainerStarted","Data":"6bb6b16dc005150c89ec7c24bd2308b341b97255a7e4a751c38c32049a9a6c6a"} Dec 05 22:19:50 crc kubenswrapper[4747]: I1205 22:19:50.833247 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628","Type":"ContainerStarted","Data":"7431896120f7ac092d3a068b5b257fe111bd1aa5586e461077b5925ed5629677"} Dec 05 22:19:50 crc kubenswrapper[4747]: I1205 22:19:50.833261 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628","Type":"ContainerStarted","Data":"19cbae013e198c15e16858c12e9518eb724ed0bf1b9ab56fc27ffb774af37f76"} Dec 05 22:19:50 crc kubenswrapper[4747]: I1205 22:19:50.843602 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ac31591-f652-49ad-9ccf-295d0aa5c514","Type":"ContainerStarted","Data":"6283261160e84935673d5f29ecfd1cbb674913a9700cc5b475a3c525d02167d2"} Dec 05 22:19:50 crc kubenswrapper[4747]: I1205 22:19:50.843936 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ac31591-f652-49ad-9ccf-295d0aa5c514","Type":"ContainerStarted","Data":"25894d058d390de9d1575350dc224ca080d04994b0050ddd4544e25ed6103bbe"} Dec 05 22:19:50 crc kubenswrapper[4747]: I1205 22:19:50.843953 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ac31591-f652-49ad-9ccf-295d0aa5c514","Type":"ContainerStarted","Data":"5cf35e96c83b2d00111ac229c2f2d2c4d227559312de10df620b7c876fbe1a7e"} Dec 05 22:19:50 crc kubenswrapper[4747]: I1205 22:19:50.853390 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8533754 podStartE2EDuration="2.8533754s" podCreationTimestamp="2025-12-05 22:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:19:50.849085384 +0000 UTC m=+5861.316392872" watchObservedRunningTime="2025-12-05 22:19:50.8533754 +0000 UTC m=+5861.320682888" Dec 05 22:19:50 crc kubenswrapper[4747]: I1205 22:19:50.870021 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.870006232 podStartE2EDuration="2.870006232s" podCreationTimestamp="2025-12-05 22:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:19:50.864075555 +0000 UTC m=+5861.331383043" watchObservedRunningTime="2025-12-05 22:19:50.870006232 +0000 UTC m=+5861.337313720" Dec 05 22:19:51 crc kubenswrapper[4747]: I1205 22:19:51.173875 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 05 22:19:51 crc kubenswrapper[4747]: I1205 22:19:51.864605 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="072d8f67-dbec-4764-8817-7948bb98f600" path="/var/lib/kubelet/pods/072d8f67-dbec-4764-8817-7948bb98f600/volumes" Dec 05 22:19:54 crc kubenswrapper[4747]: I1205 22:19:54.131115 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:54 crc kubenswrapper[4747]: I1205 22:19:54.149749 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:54 crc kubenswrapper[4747]: I1205 22:19:54.356860 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 22:19:54 crc kubenswrapper[4747]: I1205 22:19:54.356905 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 22:19:54 crc kubenswrapper[4747]: I1205 22:19:54.913769 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.114074 4747 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell1-cell-mapping-5f8t4"] Dec 05 22:19:55 crc kubenswrapper[4747]: E1205 22:19:55.114553 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="072d8f67-dbec-4764-8817-7948bb98f600" containerName="dnsmasq-dns" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.114575 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="072d8f67-dbec-4764-8817-7948bb98f600" containerName="dnsmasq-dns" Dec 05 22:19:55 crc kubenswrapper[4747]: E1205 22:19:55.114613 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="072d8f67-dbec-4764-8817-7948bb98f600" containerName="init" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.114621 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="072d8f67-dbec-4764-8817-7948bb98f600" containerName="init" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.114836 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="072d8f67-dbec-4764-8817-7948bb98f600" containerName="dnsmasq-dns" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.115638 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.117330 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.118658 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.126318 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5f8t4"] Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.149232 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rphb\" (UniqueName: \"kubernetes.io/projected/a3b55883-d37f-4373-9dd6-bed4ff2b322f-kube-api-access-9rphb\") pod \"nova-cell1-cell-mapping-5f8t4\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.149344 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-config-data\") pod \"nova-cell1-cell-mapping-5f8t4\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.149429 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-scripts\") pod \"nova-cell1-cell-mapping-5f8t4\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.149501 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5f8t4\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.251047 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-scripts\") pod \"nova-cell1-cell-mapping-5f8t4\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.251123 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5f8t4\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.251164 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rphb\" (UniqueName: \"kubernetes.io/projected/a3b55883-d37f-4373-9dd6-bed4ff2b322f-kube-api-access-9rphb\") pod \"nova-cell1-cell-mapping-5f8t4\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.251227 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-config-data\") pod \"nova-cell1-cell-mapping-5f8t4\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.257381 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-scripts\") pod \"nova-cell1-cell-mapping-5f8t4\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.257440 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5f8t4\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.265635 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-config-data\") pod \"nova-cell1-cell-mapping-5f8t4\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.271060 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rphb\" (UniqueName: \"kubernetes.io/projected/a3b55883-d37f-4373-9dd6-bed4ff2b322f-kube-api-access-9rphb\") pod \"nova-cell1-cell-mapping-5f8t4\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.441859 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:19:55 crc kubenswrapper[4747]: W1205 22:19:55.904226 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3b55883_d37f_4373_9dd6_bed4ff2b322f.slice/crio-ed7ec86e09d031eec66f0bc923eab48f9f6c53c819740c3586b75ffd1c2057c7 WatchSource:0}: Error finding container ed7ec86e09d031eec66f0bc923eab48f9f6c53c819740c3586b75ffd1c2057c7: Status 404 returned error can't find the container with id ed7ec86e09d031eec66f0bc923eab48f9f6c53c819740c3586b75ffd1c2057c7 Dec 05 22:19:55 crc kubenswrapper[4747]: I1205 22:19:55.907794 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5f8t4"] Dec 05 22:19:56 crc kubenswrapper[4747]: I1205 22:19:56.920940 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5f8t4" event={"ID":"a3b55883-d37f-4373-9dd6-bed4ff2b322f","Type":"ContainerStarted","Data":"efed805f4af9af8371b7974366e301514b073a1718293a542a547dcee1aa9c7d"} Dec 05 22:19:56 crc kubenswrapper[4747]: I1205 22:19:56.921314 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5f8t4" event={"ID":"a3b55883-d37f-4373-9dd6-bed4ff2b322f","Type":"ContainerStarted","Data":"ed7ec86e09d031eec66f0bc923eab48f9f6c53c819740c3586b75ffd1c2057c7"} Dec 05 22:19:56 crc kubenswrapper[4747]: I1205 22:19:56.937880 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-5f8t4" podStartSLOduration=1.937862228 podStartE2EDuration="1.937862228s" podCreationTimestamp="2025-12-05 22:19:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:19:56.937544971 +0000 UTC m=+5867.404852469" watchObservedRunningTime="2025-12-05 22:19:56.937862228 +0000 UTC m=+5867.405169716" Dec 05 22:19:59 crc kubenswrapper[4747]: I1205 22:19:59.339877 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 22:19:59 crc kubenswrapper[4747]: I1205 22:19:59.340441 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 22:19:59 crc kubenswrapper[4747]: I1205 22:19:59.357144 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 22:19:59 crc kubenswrapper[4747]: I1205 22:19:59.358353 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 22:20:00 crc kubenswrapper[4747]: I1205 22:20:00.441737 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.86:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 22:20:00 crc kubenswrapper[4747]: I1205 22:20:00.442077 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6ac31591-f652-49ad-9ccf-295d0aa5c514" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.87:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 22:20:00 crc kubenswrapper[4747]: I1205 22:20:00.442627 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.86:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 22:20:00 crc kubenswrapper[4747]: I1205 22:20:00.442677 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6ac31591-f652-49ad-9ccf-295d0aa5c514" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.87:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 22:20:00 crc kubenswrapper[4747]: I1205 22:20:00.960532 4747 generic.go:334] "Generic (PLEG): container finished" podID="a3b55883-d37f-4373-9dd6-bed4ff2b322f" containerID="efed805f4af9af8371b7974366e301514b073a1718293a542a547dcee1aa9c7d" exitCode=0 Dec 05 22:20:00 crc kubenswrapper[4747]: I1205 22:20:00.960573 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5f8t4" event={"ID":"a3b55883-d37f-4373-9dd6-bed4ff2b322f","Type":"ContainerDied","Data":"efed805f4af9af8371b7974366e301514b073a1718293a542a547dcee1aa9c7d"} Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.417429 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.598148 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rphb\" (UniqueName: \"kubernetes.io/projected/a3b55883-d37f-4373-9dd6-bed4ff2b322f-kube-api-access-9rphb\") pod \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.598281 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-scripts\") pod \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.598338 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-combined-ca-bundle\") pod \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.598390 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-config-data\") pod \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\" (UID: \"a3b55883-d37f-4373-9dd6-bed4ff2b322f\") " Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.622744 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-scripts" (OuterVolumeSpecName: "scripts") pod "a3b55883-d37f-4373-9dd6-bed4ff2b322f" (UID: "a3b55883-d37f-4373-9dd6-bed4ff2b322f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.622817 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b55883-d37f-4373-9dd6-bed4ff2b322f-kube-api-access-9rphb" (OuterVolumeSpecName: "kube-api-access-9rphb") pod "a3b55883-d37f-4373-9dd6-bed4ff2b322f" (UID: "a3b55883-d37f-4373-9dd6-bed4ff2b322f"). 
InnerVolumeSpecName "kube-api-access-9rphb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.689974 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-config-data" (OuterVolumeSpecName: "config-data") pod "a3b55883-d37f-4373-9dd6-bed4ff2b322f" (UID: "a3b55883-d37f-4373-9dd6-bed4ff2b322f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.700992 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rphb\" (UniqueName: \"kubernetes.io/projected/a3b55883-d37f-4373-9dd6-bed4ff2b322f-kube-api-access-9rphb\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.701026 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.701039 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.755690 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3b55883-d37f-4373-9dd6-bed4ff2b322f" (UID: "a3b55883-d37f-4373-9dd6-bed4ff2b322f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.802445 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b55883-d37f-4373-9dd6-bed4ff2b322f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.839393 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:20:02 crc kubenswrapper[4747]: E1205 22:20:02.839759 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.984026 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5f8t4" event={"ID":"a3b55883-d37f-4373-9dd6-bed4ff2b322f","Type":"ContainerDied","Data":"ed7ec86e09d031eec66f0bc923eab48f9f6c53c819740c3586b75ffd1c2057c7"} Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.984083 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed7ec86e09d031eec66f0bc923eab48f9f6c53c819740c3586b75ffd1c2057c7" Dec 05 22:20:02 crc kubenswrapper[4747]: I1205 22:20:02.984057 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5f8t4" Dec 05 22:20:03 crc kubenswrapper[4747]: I1205 22:20:03.088545 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 22:20:03 crc kubenswrapper[4747]: I1205 22:20:03.088942 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" containerName="nova-api-api" containerID="cri-o://6bb6b16dc005150c89ec7c24bd2308b341b97255a7e4a751c38c32049a9a6c6a" gracePeriod=30 Dec 05 22:20:03 crc kubenswrapper[4747]: I1205 22:20:03.089377 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" containerName="nova-api-log" containerID="cri-o://7431896120f7ac092d3a068b5b257fe111bd1aa5586e461077b5925ed5629677" gracePeriod=30 Dec 05 22:20:03 crc kubenswrapper[4747]: I1205 22:20:03.123923 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 22:20:03 crc kubenswrapper[4747]: I1205 22:20:03.124494 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6ac31591-f652-49ad-9ccf-295d0aa5c514" containerName="nova-metadata-metadata" containerID="cri-o://6283261160e84935673d5f29ecfd1cbb674913a9700cc5b475a3c525d02167d2" gracePeriod=30 Dec 05 22:20:03 crc kubenswrapper[4747]: I1205 22:20:03.129896 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6ac31591-f652-49ad-9ccf-295d0aa5c514" containerName="nova-metadata-log" containerID="cri-o://25894d058d390de9d1575350dc224ca080d04994b0050ddd4544e25ed6103bbe" gracePeriod=30 Dec 05 22:20:03 crc kubenswrapper[4747]: I1205 22:20:03.996370 4747 generic.go:334] "Generic (PLEG): container finished" podID="6ac31591-f652-49ad-9ccf-295d0aa5c514" containerID="25894d058d390de9d1575350dc224ca080d04994b0050ddd4544e25ed6103bbe" exitCode=143 Dec 05 22:20:03 crc kubenswrapper[4747]: I1205 22:20:03.996467 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ac31591-f652-49ad-9ccf-295d0aa5c514","Type":"ContainerDied","Data":"25894d058d390de9d1575350dc224ca080d04994b0050ddd4544e25ed6103bbe"} Dec 05 22:20:04 crc kubenswrapper[4747]: I1205 22:20:03.999484 4747 generic.go:334] "Generic (PLEG): container finished" podID="2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" containerID="7431896120f7ac092d3a068b5b257fe111bd1aa5586e461077b5925ed5629677" exitCode=143 Dec 05 22:20:04 crc kubenswrapper[4747]: I1205 22:20:03.999520 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628","Type":"ContainerDied","Data":"7431896120f7ac092d3a068b5b257fe111bd1aa5586e461077b5925ed5629677"} Dec 05 22:20:06 crc kubenswrapper[4747]: I1205 22:20:06.784600 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 22:20:06 crc kubenswrapper[4747]: I1205 22:20:06.905721 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-logs\") pod \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " Dec 05 22:20:06 crc kubenswrapper[4747]: I1205 22:20:06.905806 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btrbb\" (UniqueName: \"kubernetes.io/projected/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-kube-api-access-btrbb\") pod \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " Dec 05 22:20:06 crc kubenswrapper[4747]: I1205 22:20:06.905895 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-config-data\") pod \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " Dec 05 22:20:06 crc kubenswrapper[4747]: I1205 22:20:06.906022 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-combined-ca-bundle\") pod \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\" (UID: \"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628\") " Dec 05 22:20:06 crc kubenswrapper[4747]: I1205 22:20:06.906283 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-logs" (OuterVolumeSpecName: "logs") pod "2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" (UID: "2db2f3d4-f3b8-4d8f-b403-cb89f9d50628"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:20:06 crc kubenswrapper[4747]: I1205 22:20:06.907170 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:06 crc kubenswrapper[4747]: I1205 22:20:06.910660 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-kube-api-access-btrbb" (OuterVolumeSpecName: "kube-api-access-btrbb") pod "2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" (UID: "2db2f3d4-f3b8-4d8f-b403-cb89f9d50628"). InnerVolumeSpecName "kube-api-access-btrbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:20:06 crc kubenswrapper[4747]: I1205 22:20:06.930479 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-config-data" (OuterVolumeSpecName: "config-data") pod "2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" (UID: "2db2f3d4-f3b8-4d8f-b403-cb89f9d50628"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:20:06 crc kubenswrapper[4747]: I1205 22:20:06.933170 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" (UID: "2db2f3d4-f3b8-4d8f-b403-cb89f9d50628"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:20:06 crc kubenswrapper[4747]: I1205 22:20:06.946701 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.009675 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.009796 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btrbb\" (UniqueName: \"kubernetes.io/projected/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-kube-api-access-btrbb\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.009831 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.031734 4747 generic.go:334] "Generic (PLEG): container finished" podID="6ac31591-f652-49ad-9ccf-295d0aa5c514" containerID="6283261160e84935673d5f29ecfd1cbb674913a9700cc5b475a3c525d02167d2" exitCode=0 Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.031770 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ac31591-f652-49ad-9ccf-295d0aa5c514","Type":"ContainerDied","Data":"6283261160e84935673d5f29ecfd1cbb674913a9700cc5b475a3c525d02167d2"} Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.031807 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ac31591-f652-49ad-9ccf-295d0aa5c514","Type":"ContainerDied","Data":"5cf35e96c83b2d00111ac229c2f2d2c4d227559312de10df620b7c876fbe1a7e"} Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.031824 4747 scope.go:117] "RemoveContainer" containerID="6283261160e84935673d5f29ecfd1cbb674913a9700cc5b475a3c525d02167d2" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.031847 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.034089 4747 generic.go:334] "Generic (PLEG): container finished" podID="2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" containerID="6bb6b16dc005150c89ec7c24bd2308b341b97255a7e4a751c38c32049a9a6c6a" exitCode=0 Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.034118 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628","Type":"ContainerDied","Data":"6bb6b16dc005150c89ec7c24bd2308b341b97255a7e4a751c38c32049a9a6c6a"} Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.034133 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2db2f3d4-f3b8-4d8f-b403-cb89f9d50628","Type":"ContainerDied","Data":"19cbae013e198c15e16858c12e9518eb724ed0bf1b9ab56fc27ffb774af37f76"} Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.034152 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.054404 4747 scope.go:117] "RemoveContainer" containerID="25894d058d390de9d1575350dc224ca080d04994b0050ddd4544e25ed6103bbe" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.073097 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.106683 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.117622 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 22:20:07 crc kubenswrapper[4747]: E1205 22:20:07.118056 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" containerName="nova-api-api" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.118076 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" containerName="nova-api-api" Dec 05 22:20:07 crc kubenswrapper[4747]: E1205 22:20:07.118086 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b55883-d37f-4373-9dd6-bed4ff2b322f" containerName="nova-manage" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.118094 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b55883-d37f-4373-9dd6-bed4ff2b322f" containerName="nova-manage" Dec 05 22:20:07 crc kubenswrapper[4747]: E1205 22:20:07.118106 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" containerName="nova-api-log" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.118112 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" containerName="nova-api-log" Dec 05 22:20:07 crc kubenswrapper[4747]: E1205 22:20:07.118131 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac31591-f652-49ad-9ccf-295d0aa5c514" containerName="nova-metadata-metadata" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.118137 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac31591-f652-49ad-9ccf-295d0aa5c514" containerName="nova-metadata-metadata" Dec 05 22:20:07 crc kubenswrapper[4747]: E1205 22:20:07.118150 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac31591-f652-49ad-9ccf-295d0aa5c514" containerName="nova-metadata-log" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.118156 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac31591-f652-49ad-9ccf-295d0aa5c514" containerName="nova-metadata-log" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.118320 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac31591-f652-49ad-9ccf-295d0aa5c514-logs\") pod \"6ac31591-f652-49ad-9ccf-295d0aa5c514\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.118341 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3b55883-d37f-4373-9dd6-bed4ff2b322f" containerName="nova-manage" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.118359 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" containerName="nova-api-api" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.118359 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmsl5\" 
(UniqueName: \"kubernetes.io/projected/6ac31591-f652-49ad-9ccf-295d0aa5c514-kube-api-access-qmsl5\") pod \"6ac31591-f652-49ad-9ccf-295d0aa5c514\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.118384 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac31591-f652-49ad-9ccf-295d0aa5c514" containerName="nova-metadata-log" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.118397 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" containerName="nova-api-log" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.118409 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac31591-f652-49ad-9ccf-295d0aa5c514" containerName="nova-metadata-metadata" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.118551 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-config-data\") pod \"6ac31591-f652-49ad-9ccf-295d0aa5c514\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.118622 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-nova-metadata-tls-certs\") pod \"6ac31591-f652-49ad-9ccf-295d0aa5c514\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.118675 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-combined-ca-bundle\") pod \"6ac31591-f652-49ad-9ccf-295d0aa5c514\" (UID: \"6ac31591-f652-49ad-9ccf-295d0aa5c514\") " Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.119343 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.119817 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ac31591-f652-49ad-9ccf-295d0aa5c514-logs" (OuterVolumeSpecName: "logs") pod "6ac31591-f652-49ad-9ccf-295d0aa5c514" (UID: "6ac31591-f652-49ad-9ccf-295d0aa5c514"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.127909 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.130314 4747 scope.go:117] "RemoveContainer" containerID="6283261160e84935673d5f29ecfd1cbb674913a9700cc5b475a3c525d02167d2" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.130369 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac31591-f652-49ad-9ccf-295d0aa5c514-kube-api-access-qmsl5" (OuterVolumeSpecName: "kube-api-access-qmsl5") pod "6ac31591-f652-49ad-9ccf-295d0aa5c514" (UID: "6ac31591-f652-49ad-9ccf-295d0aa5c514"). InnerVolumeSpecName "kube-api-access-qmsl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.130776 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 22:20:07 crc kubenswrapper[4747]: E1205 22:20:07.130850 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6283261160e84935673d5f29ecfd1cbb674913a9700cc5b475a3c525d02167d2\": container with ID starting with 6283261160e84935673d5f29ecfd1cbb674913a9700cc5b475a3c525d02167d2 not found: ID does not exist" containerID="6283261160e84935673d5f29ecfd1cbb674913a9700cc5b475a3c525d02167d2" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.130880 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6283261160e84935673d5f29ecfd1cbb674913a9700cc5b475a3c525d02167d2"} err="failed to get container status \"6283261160e84935673d5f29ecfd1cbb674913a9700cc5b475a3c525d02167d2\": rpc error: code = NotFound desc = could not find container \"6283261160e84935673d5f29ecfd1cbb674913a9700cc5b475a3c525d02167d2\": container with ID starting with 6283261160e84935673d5f29ecfd1cbb674913a9700cc5b475a3c525d02167d2 not found: ID does not exist" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.130903 4747 scope.go:117] "RemoveContainer" containerID="25894d058d390de9d1575350dc224ca080d04994b0050ddd4544e25ed6103bbe" Dec 05 22:20:07 crc kubenswrapper[4747]: E1205 22:20:07.133067 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25894d058d390de9d1575350dc224ca080d04994b0050ddd4544e25ed6103bbe\": container with ID starting with 25894d058d390de9d1575350dc224ca080d04994b0050ddd4544e25ed6103bbe not found: ID does not exist" containerID="25894d058d390de9d1575350dc224ca080d04994b0050ddd4544e25ed6103bbe" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.133109 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25894d058d390de9d1575350dc224ca080d04994b0050ddd4544e25ed6103bbe"} err="failed to get container status \"25894d058d390de9d1575350dc224ca080d04994b0050ddd4544e25ed6103bbe\": rpc error: code = NotFound desc = could not find container \"25894d058d390de9d1575350dc224ca080d04994b0050ddd4544e25ed6103bbe\": container with ID starting with 25894d058d390de9d1575350dc224ca080d04994b0050ddd4544e25ed6103bbe not found: ID does not exist" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.133133 4747 scope.go:117] "RemoveContainer" containerID="6bb6b16dc005150c89ec7c24bd2308b341b97255a7e4a751c38c32049a9a6c6a" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.149941 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-config-data" (OuterVolumeSpecName: "config-data") pod "6ac31591-f652-49ad-9ccf-295d0aa5c514" (UID: "6ac31591-f652-49ad-9ccf-295d0aa5c514"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.156751 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ac31591-f652-49ad-9ccf-295d0aa5c514" (UID: "6ac31591-f652-49ad-9ccf-295d0aa5c514"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.171194 4747 scope.go:117] "RemoveContainer" containerID="7431896120f7ac092d3a068b5b257fe111bd1aa5586e461077b5925ed5629677" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.187655 4747 scope.go:117] "RemoveContainer" containerID="6bb6b16dc005150c89ec7c24bd2308b341b97255a7e4a751c38c32049a9a6c6a" Dec 05 22:20:07 crc kubenswrapper[4747]: E1205 22:20:07.188086 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bb6b16dc005150c89ec7c24bd2308b341b97255a7e4a751c38c32049a9a6c6a\": container with ID starting with 6bb6b16dc005150c89ec7c24bd2308b341b97255a7e4a751c38c32049a9a6c6a not found: ID does not exist" containerID="6bb6b16dc005150c89ec7c24bd2308b341b97255a7e4a751c38c32049a9a6c6a" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.188135 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bb6b16dc005150c89ec7c24bd2308b341b97255a7e4a751c38c32049a9a6c6a"} err="failed to get container status \"6bb6b16dc005150c89ec7c24bd2308b341b97255a7e4a751c38c32049a9a6c6a\": rpc error: code = NotFound desc = could not find container \"6bb6b16dc005150c89ec7c24bd2308b341b97255a7e4a751c38c32049a9a6c6a\": container with ID starting with 6bb6b16dc005150c89ec7c24bd2308b341b97255a7e4a751c38c32049a9a6c6a not found: ID does not exist" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.188166 4747 scope.go:117] "RemoveContainer" containerID="7431896120f7ac092d3a068b5b257fe111bd1aa5586e461077b5925ed5629677" Dec 05 22:20:07 crc kubenswrapper[4747]: E1205 22:20:07.188479 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7431896120f7ac092d3a068b5b257fe111bd1aa5586e461077b5925ed5629677\": container with ID starting with 7431896120f7ac092d3a068b5b257fe111bd1aa5586e461077b5925ed5629677 not found: ID does not exist" containerID="7431896120f7ac092d3a068b5b257fe111bd1aa5586e461077b5925ed5629677" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.188514 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7431896120f7ac092d3a068b5b257fe111bd1aa5586e461077b5925ed5629677"} err="failed to get container status \"7431896120f7ac092d3a068b5b257fe111bd1aa5586e461077b5925ed5629677\": rpc error: code = NotFound desc = could not find container \"7431896120f7ac092d3a068b5b257fe111bd1aa5586e461077b5925ed5629677\": container with ID starting with 7431896120f7ac092d3a068b5b257fe111bd1aa5586e461077b5925ed5629677 not found: ID does not exist" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.189038 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6ac31591-f652-49ad-9ccf-295d0aa5c514" (UID: "6ac31591-f652-49ad-9ccf-295d0aa5c514"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.220965 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.221014 4747 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.221032 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac31591-f652-49ad-9ccf-295d0aa5c514-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.221044 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac31591-f652-49ad-9ccf-295d0aa5c514-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.221061 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmsl5\" (UniqueName: \"kubernetes.io/projected/6ac31591-f652-49ad-9ccf-295d0aa5c514-kube-api-access-qmsl5\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.323000 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e38ff9-ee98-40ad-9353-ea7d416b9389-config-data\") pod \"nova-api-0\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") " pod="openstack/nova-api-0" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.323218 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e38ff9-ee98-40ad-9353-ea7d416b9389-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") " pod="openstack/nova-api-0" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.323287 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktmjd\" (UniqueName: \"kubernetes.io/projected/52e38ff9-ee98-40ad-9353-ea7d416b9389-kube-api-access-ktmjd\") pod \"nova-api-0\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") " pod="openstack/nova-api-0" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.323341 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52e38ff9-ee98-40ad-9353-ea7d416b9389-logs\") pod \"nova-api-0\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") " pod="openstack/nova-api-0" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.426421 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52e38ff9-ee98-40ad-9353-ea7d416b9389-logs\") pod \"nova-api-0\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") " pod="openstack/nova-api-0" Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.426507 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e38ff9-ee98-40ad-9353-ea7d416b9389-config-data\") pod \"nova-api-0\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") " pod="openstack/nova-api-0" 
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.426650 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e38ff9-ee98-40ad-9353-ea7d416b9389-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") " pod="openstack/nova-api-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.426698 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktmjd\" (UniqueName: \"kubernetes.io/projected/52e38ff9-ee98-40ad-9353-ea7d416b9389-kube-api-access-ktmjd\") pod \"nova-api-0\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") " pod="openstack/nova-api-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.427253 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52e38ff9-ee98-40ad-9353-ea7d416b9389-logs\") pod \"nova-api-0\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") " pod="openstack/nova-api-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.430690 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.431885 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e38ff9-ee98-40ad-9353-ea7d416b9389-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") " pod="openstack/nova-api-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.431958 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e38ff9-ee98-40ad-9353-ea7d416b9389-config-data\") pod \"nova-api-0\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") " pod="openstack/nova-api-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.437840 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.451311 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.453544 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.454975 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktmjd\" (UniqueName: \"kubernetes.io/projected/52e38ff9-ee98-40ad-9353-ea7d416b9389-kube-api-access-ktmjd\") pod \"nova-api-0\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") " pod="openstack/nova-api-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.455567 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.456522 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.462676 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.528637 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xdsr\" (UniqueName: \"kubernetes.io/projected/69a28134-11bc-411d-b294-87782bf28560-kube-api-access-8xdsr\") pod \"nova-metadata-0\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.528989 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-config-data\") pod \"nova-metadata-0\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.529313 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69a28134-11bc-411d-b294-87782bf28560-logs\") pod \"nova-metadata-0\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.529537 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.529773 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.631953 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xdsr\" (UniqueName: \"kubernetes.io/projected/69a28134-11bc-411d-b294-87782bf28560-kube-api-access-8xdsr\") pod \"nova-metadata-0\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.632021 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-config-data\") pod \"nova-metadata-0\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.632083 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69a28134-11bc-411d-b294-87782bf28560-logs\") pod \"nova-metadata-0\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.632116 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.632209 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.633092 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69a28134-11bc-411d-b294-87782bf28560-logs\") pod \"nova-metadata-0\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.636567 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.636799 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-config-data\") pod \"nova-metadata-0\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.637551 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.677989 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xdsr\" (UniqueName: \"kubernetes.io/projected/69a28134-11bc-411d-b294-87782bf28560-kube-api-access-8xdsr\") pod \"nova-metadata-0\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.749717 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.821139 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.866399 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db2f3d4-f3b8-4d8f-b403-cb89f9d50628" path="/var/lib/kubelet/pods/2db2f3d4-f3b8-4d8f-b403-cb89f9d50628/volumes"
Dec 05 22:20:07 crc kubenswrapper[4747]: I1205 22:20:07.867066 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac31591-f652-49ad-9ccf-295d0aa5c514" path="/var/lib/kubelet/pods/6ac31591-f652-49ad-9ccf-295d0aa5c514/volumes"
Dec 05 22:20:08 crc kubenswrapper[4747]: W1205 22:20:08.277155 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52e38ff9_ee98_40ad_9353_ea7d416b9389.slice/crio-749920c791127101796ff6a103aca7a4843959e34727443095485bdb4c6132f6 WatchSource:0}: Error finding container 749920c791127101796ff6a103aca7a4843959e34727443095485bdb4c6132f6: Status 404 returned error can't find the container with id 749920c791127101796ff6a103aca7a4843959e34727443095485bdb4c6132f6
Dec 05 22:20:08 crc kubenswrapper[4747]: I1205 22:20:08.277860 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 05 22:20:08 crc kubenswrapper[4747]: I1205 22:20:08.352058 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 22:20:09 crc kubenswrapper[4747]: I1205 22:20:09.060819 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69a28134-11bc-411d-b294-87782bf28560","Type":"ContainerStarted","Data":"f77a9f91a96faba41eb76d1bf1fcdf75944f7141e569bb9e721b2ef8d9b12447"}
Dec 05 22:20:09 crc kubenswrapper[4747]: I1205 22:20:09.060869 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69a28134-11bc-411d-b294-87782bf28560","Type":"ContainerStarted","Data":"2480e7522c51b23c99de4b385425e590f4aaadfe63180019d1de74e63234aaf2"}
Dec 05 22:20:09 crc kubenswrapper[4747]: I1205 22:20:09.060885 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69a28134-11bc-411d-b294-87782bf28560","Type":"ContainerStarted","Data":"9754cd8b21a0fb1f8490cfa4c4abf3a492166e7d3c24c2131504af56a54d90c6"}
Dec 05 22:20:09 crc kubenswrapper[4747]: I1205 22:20:09.062384 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52e38ff9-ee98-40ad-9353-ea7d416b9389","Type":"ContainerStarted","Data":"3d955f43f99d6d80493d10a9d354c17bfb53159b25ec1e954f8559936f1a2684"}
Dec 05 22:20:09 crc kubenswrapper[4747]: I1205 22:20:09.062436 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52e38ff9-ee98-40ad-9353-ea7d416b9389","Type":"ContainerStarted","Data":"0c7c2e608993161279edb386d10d4b6d85bf8aee31eed3651bb7f8db8f5977f0"}
Dec 05 22:20:09 crc kubenswrapper[4747]: I1205 22:20:09.062452 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52e38ff9-ee98-40ad-9353-ea7d416b9389","Type":"ContainerStarted","Data":"749920c791127101796ff6a103aca7a4843959e34727443095485bdb4c6132f6"}
Dec 05 22:20:09 crc kubenswrapper[4747]: I1205 22:20:09.117649 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.117623823 podStartE2EDuration="2.117623823s" podCreationTimestamp="2025-12-05 22:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:20:09.078908934 +0000 UTC m=+5879.546216432" watchObservedRunningTime="2025-12-05 22:20:09.117623823 +0000 UTC m=+5879.584931301"
Dec 05 22:20:09 crc kubenswrapper[4747]: I1205 22:20:09.129662 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.129644811 podStartE2EDuration="2.129644811s" podCreationTimestamp="2025-12-05 22:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:20:09.105141404 +0000 UTC m=+5879.572448892" watchObservedRunningTime="2025-12-05 22:20:09.129644811 +0000 UTC m=+5879.596952299"
Dec 05 22:20:12 crc kubenswrapper[4747]: I1205 22:20:12.821420 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 05 22:20:12 crc kubenswrapper[4747]: I1205 22:20:12.822012 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 05 22:20:13 crc kubenswrapper[4747]: I1205 22:20:13.841091 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001"
Dec 05 22:20:13 crc kubenswrapper[4747]: E1205 22:20:13.841308 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.174977 4747 generic.go:334] "Generic (PLEG): container finished" podID="e56368a4-3747-4184-ade6-e1224146cdb2" containerID="5e46c6e687155dc6d0c77d64d2ce50da7424e67938411267722bd9677624b3a0" exitCode=137
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.175688 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e56368a4-3747-4184-ade6-e1224146cdb2","Type":"ContainerDied","Data":"5e46c6e687155dc6d0c77d64d2ce50da7424e67938411267722bd9677624b3a0"}
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.434797 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.628689 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56368a4-3747-4184-ade6-e1224146cdb2-combined-ca-bundle\") pod \"e56368a4-3747-4184-ade6-e1224146cdb2\" (UID: \"e56368a4-3747-4184-ade6-e1224146cdb2\") "
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.628768 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k5dp\" (UniqueName: \"kubernetes.io/projected/e56368a4-3747-4184-ade6-e1224146cdb2-kube-api-access-9k5dp\") pod \"e56368a4-3747-4184-ade6-e1224146cdb2\" (UID: \"e56368a4-3747-4184-ade6-e1224146cdb2\") "
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.629137 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e56368a4-3747-4184-ade6-e1224146cdb2-config-data\") pod \"e56368a4-3747-4184-ade6-e1224146cdb2\" (UID: \"e56368a4-3747-4184-ade6-e1224146cdb2\") "
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.636852 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e56368a4-3747-4184-ade6-e1224146cdb2-kube-api-access-9k5dp" (OuterVolumeSpecName: "kube-api-access-9k5dp") pod "e56368a4-3747-4184-ade6-e1224146cdb2" (UID: "e56368a4-3747-4184-ade6-e1224146cdb2"). InnerVolumeSpecName "kube-api-access-9k5dp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.663425 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e56368a4-3747-4184-ade6-e1224146cdb2-config-data" (OuterVolumeSpecName: "config-data") pod "e56368a4-3747-4184-ade6-e1224146cdb2" (UID: "e56368a4-3747-4184-ade6-e1224146cdb2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.682070 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e56368a4-3747-4184-ade6-e1224146cdb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e56368a4-3747-4184-ade6-e1224146cdb2" (UID: "e56368a4-3747-4184-ade6-e1224146cdb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.731699 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e56368a4-3747-4184-ade6-e1224146cdb2-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.731735 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56368a4-3747-4184-ade6-e1224146cdb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.731752 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k5dp\" (UniqueName: \"kubernetes.io/projected/e56368a4-3747-4184-ade6-e1224146cdb2-kube-api-access-9k5dp\") on node \"crc\" DevicePath \"\""
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.751140 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.752429 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.821760 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 05 22:20:17 crc kubenswrapper[4747]: I1205 22:20:17.821929 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.189738 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.189765 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e56368a4-3747-4184-ade6-e1224146cdb2","Type":"ContainerDied","Data":"1f1c626ee17249e40d1911edcf6fbbd249421c8b7ebd8cc12d44263edad00c1d"}
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.189835 4747 scope.go:117] "RemoveContainer" containerID="5e46c6e687155dc6d0c77d64d2ce50da7424e67938411267722bd9677624b3a0"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.235710 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.238068 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.257242 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 05 22:20:18 crc kubenswrapper[4747]: E1205 22:20:18.257794 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e56368a4-3747-4184-ade6-e1224146cdb2" containerName="nova-scheduler-scheduler"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.257807 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e56368a4-3747-4184-ade6-e1224146cdb2" containerName="nova-scheduler-scheduler"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.257982 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e56368a4-3747-4184-ade6-e1224146cdb2" containerName="nova-scheduler-scheduler"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.258686 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.268666 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.303103 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.350265 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-config-data\") pod \"nova-scheduler-0\" (UID: \"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5\") " pod="openstack/nova-scheduler-0"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.350524 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5\") " pod="openstack/nova-scheduler-0"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.350653 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk5rs\" (UniqueName: \"kubernetes.io/projected/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-kube-api-access-lk5rs\") pod \"nova-scheduler-0\" (UID: \"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5\") " pod="openstack/nova-scheduler-0"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.452557 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5\") " pod="openstack/nova-scheduler-0"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.452676 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk5rs\" (UniqueName: \"kubernetes.io/projected/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-kube-api-access-lk5rs\") pod \"nova-scheduler-0\" (UID: \"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5\") " pod="openstack/nova-scheduler-0"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.452747 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-config-data\") pod \"nova-scheduler-0\" (UID: \"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5\") " pod="openstack/nova-scheduler-0"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.456743 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-config-data\") pod \"nova-scheduler-0\" (UID: \"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5\") " pod="openstack/nova-scheduler-0"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.461986 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5\") " pod="openstack/nova-scheduler-0"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.473616 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk5rs\" (UniqueName: \"kubernetes.io/projected/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-kube-api-access-lk5rs\") pod \"nova-scheduler-0\" (UID: \"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5\") " pod="openstack/nova-scheduler-0"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.623626 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.832893 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="52e38ff9-ee98-40ad-9353-ea7d416b9389" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.89:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.882794 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="52e38ff9-ee98-40ad-9353-ea7d416b9389" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.89:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.882839 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="69a28134-11bc-411d-b294-87782bf28560" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.90:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.882975 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="69a28134-11bc-411d-b294-87782bf28560" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.90:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 05 22:20:18 crc kubenswrapper[4747]: I1205 22:20:18.963898 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 05 22:20:19 crc kubenswrapper[4747]: I1205 22:20:19.207896 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5","Type":"ContainerStarted","Data":"45d6e9122d67d75705df7d99aa588243cbc4d1db236aee879c920c49c5932d95"}
Dec 05 22:20:19 crc kubenswrapper[4747]: I1205 22:20:19.208261 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5","Type":"ContainerStarted","Data":"1d297ddff21c1d19b996116684468cb70488e6a428c231f1f3a131f1a5737b8e"}
Dec 05 22:20:19 crc kubenswrapper[4747]: I1205 22:20:19.235083 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.235064815 podStartE2EDuration="1.235064815s" podCreationTimestamp="2025-12-05 22:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:20:19.225683863 +0000 UTC m=+5889.692991351" watchObservedRunningTime="2025-12-05 22:20:19.235064815 +0000 UTC m=+5889.702372303"
Dec 05 22:20:19 crc kubenswrapper[4747]: I1205 22:20:19.856780 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e56368a4-3747-4184-ade6-e1224146cdb2" path="/var/lib/kubelet/pods/e56368a4-3747-4184-ade6-e1224146cdb2/volumes"
Dec 05 22:20:23 crc kubenswrapper[4747]: I1205 22:20:23.624048 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 05 22:20:26 crc kubenswrapper[4747]: I1205 22:20:26.840412 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001"
Dec 05 22:20:26 crc kubenswrapper[4747]: E1205 22:20:26.841301 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 22:20:27 crc kubenswrapper[4747]: I1205 22:20:27.756200 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 05 22:20:27 crc kubenswrapper[4747]: I1205 22:20:27.757293 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 05 22:20:27 crc kubenswrapper[4747]: I1205 22:20:27.762702 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 05 22:20:27 crc kubenswrapper[4747]: I1205 22:20:27.772270 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 05 22:20:27 crc kubenswrapper[4747]: I1205 22:20:27.829429 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 05 22:20:27 crc kubenswrapper[4747]: I1205 22:20:27.829730 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 05 22:20:27 crc kubenswrapper[4747]: I1205 22:20:27.834988 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 05 22:20:27 crc kubenswrapper[4747]: I1205 22:20:27.836388 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.321247 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.325724 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.503375 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"]
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.504989 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.522432 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"]
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.585901 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-dns-svc\") pod \"dnsmasq-dns-5f4b5ff85f-vnc8k\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.585989 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-ovsdbserver-sb\") pod \"dnsmasq-dns-5f4b5ff85f-vnc8k\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.586049 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqjpj\" (UniqueName: \"kubernetes.io/projected/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-kube-api-access-mqjpj\") pod \"dnsmasq-dns-5f4b5ff85f-vnc8k\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.586086 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-ovsdbserver-nb\") pod \"dnsmasq-dns-5f4b5ff85f-vnc8k\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.586102 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-config\") pod \"dnsmasq-dns-5f4b5ff85f-vnc8k\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.624268 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.647638 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.687322 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-ovsdbserver-sb\") pod \"dnsmasq-dns-5f4b5ff85f-vnc8k\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.687455 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqjpj\" (UniqueName: \"kubernetes.io/projected/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-kube-api-access-mqjpj\") pod \"dnsmasq-dns-5f4b5ff85f-vnc8k\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.687525 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-ovsdbserver-nb\") pod \"dnsmasq-dns-5f4b5ff85f-vnc8k\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.687551 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-config\") pod \"dnsmasq-dns-5f4b5ff85f-vnc8k\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.688463 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-ovsdbserver-sb\") pod \"dnsmasq-dns-5f4b5ff85f-vnc8k\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.688548 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-config\") pod \"dnsmasq-dns-5f4b5ff85f-vnc8k\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.688558 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-ovsdbserver-nb\") pod \"dnsmasq-dns-5f4b5ff85f-vnc8k\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.688771 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-dns-svc\") pod \"dnsmasq-dns-5f4b5ff85f-vnc8k\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.689361 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-dns-svc\") pod \"dnsmasq-dns-5f4b5ff85f-vnc8k\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.717781 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqjpj\" (UniqueName: \"kubernetes.io/projected/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-kube-api-access-mqjpj\") pod \"dnsmasq-dns-5f4b5ff85f-vnc8k\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:28 crc kubenswrapper[4747]: I1205 22:20:28.841206 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:29 crc kubenswrapper[4747]: I1205 22:20:29.319890 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"]
Dec 05 22:20:29 crc kubenswrapper[4747]: I1205 22:20:29.370476 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Dec 05 22:20:30 crc kubenswrapper[4747]: I1205 22:20:30.191502 4747 scope.go:117] "RemoveContainer" containerID="53828885502e82f68e133a6834df84c912a0c686e9fcffd89679f461f70198a6"
Dec 05 22:20:30 crc kubenswrapper[4747]: I1205 22:20:30.214829 4747 scope.go:117] "RemoveContainer" containerID="cc07ffae6776d676387318ac14f879962f03ddcbc52727166f5ada2d79685c67"
Dec 05 22:20:30 crc kubenswrapper[4747]: I1205 22:20:30.236374 4747 scope.go:117] "RemoveContainer" containerID="aab2f19202a8ddf069010b41e0a85b3f339b295c5aa90862c8cdb5dde4caa6e2"
Dec 05 22:20:30 crc kubenswrapper[4747]: I1205 22:20:30.267636 4747 scope.go:117] "RemoveContainer" containerID="72f01a8ac3607ccbfef0762575b1e5d91a50cf7714be2957c57483989947a0d0"
Dec 05 22:20:30 crc kubenswrapper[4747]: I1205 22:20:30.351294 4747 generic.go:334] "Generic (PLEG): container finished" podID="b514da94-b08d-4ba3-b22e-f8c5e3729bd1" containerID="75e6d73cf1bcb20b598c2bb7fa1119dba57c4758d89eb1a7c8dcf99b203b9140" exitCode=0
Dec 05 22:20:30 crc kubenswrapper[4747]: I1205 22:20:30.351383 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k" event={"ID":"b514da94-b08d-4ba3-b22e-f8c5e3729bd1","Type":"ContainerDied","Data":"75e6d73cf1bcb20b598c2bb7fa1119dba57c4758d89eb1a7c8dcf99b203b9140"}
Dec 05 22:20:30 crc kubenswrapper[4747]: I1205 22:20:30.351422 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k" event={"ID":"b514da94-b08d-4ba3-b22e-f8c5e3729bd1","Type":"ContainerStarted","Data":"5897077cfbcacc815c50b0461a683bfa43663082ca2059d460127ca878ed2007"}
Dec 05 22:20:31 crc kubenswrapper[4747]: I1205 22:20:31.372233 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k" event={"ID":"b514da94-b08d-4ba3-b22e-f8c5e3729bd1","Type":"ContainerStarted","Data":"ab5f9a33d9b202593398e59d04d27bb004433b2c9aa1584b8d9e79fbd713b26b"}
Dec 05 22:20:31 crc kubenswrapper[4747]: I1205 22:20:31.373940 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:31 crc kubenswrapper[4747]: I1205 22:20:31.450928 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k" podStartSLOduration=3.450900125 podStartE2EDuration="3.450900125s" podCreationTimestamp="2025-12-05 22:20:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:20:31.40795195 +0000 UTC m=+5901.875259478" watchObservedRunningTime="2025-12-05 22:20:31.450900125 +0000 UTC m=+5901.918207643"
Dec 05 22:20:31 crc kubenswrapper[4747]: I1205 22:20:31.459571 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 05 22:20:31 crc kubenswrapper[4747]: I1205 22:20:31.459832 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="52e38ff9-ee98-40ad-9353-ea7d416b9389" containerName="nova-api-log" containerID="cri-o://0c7c2e608993161279edb386d10d4b6d85bf8aee31eed3651bb7f8db8f5977f0" gracePeriod=30
Dec 05 22:20:31 crc kubenswrapper[4747]: I1205 22:20:31.459950 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="52e38ff9-ee98-40ad-9353-ea7d416b9389" containerName="nova-api-api" containerID="cri-o://3d955f43f99d6d80493d10a9d354c17bfb53159b25ec1e954f8559936f1a2684" gracePeriod=30
Dec 05 22:20:32 crc kubenswrapper[4747]: I1205 22:20:32.384385 4747 generic.go:334] "Generic (PLEG): container finished" podID="52e38ff9-ee98-40ad-9353-ea7d416b9389" containerID="0c7c2e608993161279edb386d10d4b6d85bf8aee31eed3651bb7f8db8f5977f0" exitCode=143
Dec 05 22:20:32 crc kubenswrapper[4747]: I1205 22:20:32.384472 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52e38ff9-ee98-40ad-9353-ea7d416b9389","Type":"ContainerDied","Data":"0c7c2e608993161279edb386d10d4b6d85bf8aee31eed3651bb7f8db8f5977f0"}
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.164503 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.219436 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktmjd\" (UniqueName: \"kubernetes.io/projected/52e38ff9-ee98-40ad-9353-ea7d416b9389-kube-api-access-ktmjd\") pod \"52e38ff9-ee98-40ad-9353-ea7d416b9389\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") "
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.219970 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e38ff9-ee98-40ad-9353-ea7d416b9389-combined-ca-bundle\") pod \"52e38ff9-ee98-40ad-9353-ea7d416b9389\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") "
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.220023 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e38ff9-ee98-40ad-9353-ea7d416b9389-config-data\") pod \"52e38ff9-ee98-40ad-9353-ea7d416b9389\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") "
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.220148 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52e38ff9-ee98-40ad-9353-ea7d416b9389-logs\") pod \"52e38ff9-ee98-40ad-9353-ea7d416b9389\" (UID: \"52e38ff9-ee98-40ad-9353-ea7d416b9389\") "
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.220826 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52e38ff9-ee98-40ad-9353-ea7d416b9389-logs" (OuterVolumeSpecName: "logs") pod "52e38ff9-ee98-40ad-9353-ea7d416b9389" (UID: "52e38ff9-ee98-40ad-9353-ea7d416b9389"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.241754 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52e38ff9-ee98-40ad-9353-ea7d416b9389-kube-api-access-ktmjd" (OuterVolumeSpecName: "kube-api-access-ktmjd") pod "52e38ff9-ee98-40ad-9353-ea7d416b9389" (UID: "52e38ff9-ee98-40ad-9353-ea7d416b9389"). InnerVolumeSpecName "kube-api-access-ktmjd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.245346 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e38ff9-ee98-40ad-9353-ea7d416b9389-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52e38ff9-ee98-40ad-9353-ea7d416b9389" (UID: "52e38ff9-ee98-40ad-9353-ea7d416b9389"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.253642 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e38ff9-ee98-40ad-9353-ea7d416b9389-config-data" (OuterVolumeSpecName: "config-data") pod "52e38ff9-ee98-40ad-9353-ea7d416b9389" (UID: "52e38ff9-ee98-40ad-9353-ea7d416b9389"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.322535 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52e38ff9-ee98-40ad-9353-ea7d416b9389-logs\") on node \"crc\" DevicePath \"\""
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.322569 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktmjd\" (UniqueName: \"kubernetes.io/projected/52e38ff9-ee98-40ad-9353-ea7d416b9389-kube-api-access-ktmjd\") on node \"crc\" DevicePath \"\""
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.322605 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e38ff9-ee98-40ad-9353-ea7d416b9389-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.322619 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e38ff9-ee98-40ad-9353-ea7d416b9389-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.421452 4747 generic.go:334] "Generic (PLEG): container finished" podID="52e38ff9-ee98-40ad-9353-ea7d416b9389" containerID="3d955f43f99d6d80493d10a9d354c17bfb53159b25ec1e954f8559936f1a2684" exitCode=0
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.421498 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52e38ff9-ee98-40ad-9353-ea7d416b9389","Type":"ContainerDied","Data":"3d955f43f99d6d80493d10a9d354c17bfb53159b25ec1e954f8559936f1a2684"}
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.421529 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52e38ff9-ee98-40ad-9353-ea7d416b9389","Type":"ContainerDied","Data":"749920c791127101796ff6a103aca7a4843959e34727443095485bdb4c6132f6"}
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.421549 4747 scope.go:117] "RemoveContainer" containerID="3d955f43f99d6d80493d10a9d354c17bfb53159b25ec1e954f8559936f1a2684"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.421705 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.461240 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.464357 4747 scope.go:117] "RemoveContainer" containerID="0c7c2e608993161279edb386d10d4b6d85bf8aee31eed3651bb7f8db8f5977f0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.475854 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.493091 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 05 22:20:35 crc kubenswrapper[4747]: E1205 22:20:35.493845 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e38ff9-ee98-40ad-9353-ea7d416b9389" containerName="nova-api-api"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.493942 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e38ff9-ee98-40ad-9353-ea7d416b9389" containerName="nova-api-api"
Dec 05 22:20:35 crc kubenswrapper[4747]: E1205 22:20:35.494036 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e38ff9-ee98-40ad-9353-ea7d416b9389" containerName="nova-api-log"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.494117 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e38ff9-ee98-40ad-9353-ea7d416b9389" containerName="nova-api-log"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.494461 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="52e38ff9-ee98-40ad-9353-ea7d416b9389" containerName="nova-api-api"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.494557 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="52e38ff9-ee98-40ad-9353-ea7d416b9389" containerName="nova-api-log"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.495952 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.499702 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.499761 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.499712 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.507998 4747 scope.go:117] "RemoveContainer" containerID="3d955f43f99d6d80493d10a9d354c17bfb53159b25ec1e954f8559936f1a2684"
Dec 05 22:20:35 crc kubenswrapper[4747]: E1205 22:20:35.508405 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d955f43f99d6d80493d10a9d354c17bfb53159b25ec1e954f8559936f1a2684\": container with ID starting with 3d955f43f99d6d80493d10a9d354c17bfb53159b25ec1e954f8559936f1a2684 not found: ID does not exist" containerID="3d955f43f99d6d80493d10a9d354c17bfb53159b25ec1e954f8559936f1a2684"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.508529 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d955f43f99d6d80493d10a9d354c17bfb53159b25ec1e954f8559936f1a2684"} err="failed to get container status \"3d955f43f99d6d80493d10a9d354c17bfb53159b25ec1e954f8559936f1a2684\": rpc error: code = NotFound desc = could not find container \"3d955f43f99d6d80493d10a9d354c17bfb53159b25ec1e954f8559936f1a2684\": container with ID starting with 3d955f43f99d6d80493d10a9d354c17bfb53159b25ec1e954f8559936f1a2684 not found: ID does not exist"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.508786 4747 scope.go:117] "RemoveContainer" containerID="0c7c2e608993161279edb386d10d4b6d85bf8aee31eed3651bb7f8db8f5977f0"
Dec 05 22:20:35 crc kubenswrapper[4747]: E1205 22:20:35.509146 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c7c2e608993161279edb386d10d4b6d85bf8aee31eed3651bb7f8db8f5977f0\": container with ID starting with 0c7c2e608993161279edb386d10d4b6d85bf8aee31eed3651bb7f8db8f5977f0 not found: ID does not exist" containerID="0c7c2e608993161279edb386d10d4b6d85bf8aee31eed3651bb7f8db8f5977f0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.509166 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c7c2e608993161279edb386d10d4b6d85bf8aee31eed3651bb7f8db8f5977f0"} err="failed to get container status \"0c7c2e608993161279edb386d10d4b6d85bf8aee31eed3651bb7f8db8f5977f0\": rpc error: code = NotFound desc = could not find container \"0c7c2e608993161279edb386d10d4b6d85bf8aee31eed3651bb7f8db8f5977f0\": container with ID starting with 0c7c2e608993161279edb386d10d4b6d85bf8aee31eed3651bb7f8db8f5977f0 not found: ID does not exist"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.513414 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.524265 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-public-tls-certs\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.524323 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.524348 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-config-data\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.524371 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c37dfb7-6f76-43d5-b699-320372d1c35b-logs\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.524432 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn5sr\" (UniqueName: \"kubernetes.io/projected/8c37dfb7-6f76-43d5-b699-320372d1c35b-kube-api-access-nn5sr\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.524467 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.625977 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-public-tls-certs\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.626079 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.626130 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-config-data\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.626182 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c37dfb7-6f76-43d5-b699-320372d1c35b-logs\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.626317 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn5sr\" (UniqueName: \"kubernetes.io/projected/8c37dfb7-6f76-43d5-b699-320372d1c35b-kube-api-access-nn5sr\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.626405 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.626826 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c37dfb7-6f76-43d5-b699-320372d1c35b-logs\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.630503 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-public-tls-certs\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.631134 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.634070 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-config-data\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.635759 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.644645 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn5sr\" (UniqueName: \"kubernetes.io/projected/8c37dfb7-6f76-43d5-b699-320372d1c35b-kube-api-access-nn5sr\") pod \"nova-api-0\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.820267 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 22:20:35 crc kubenswrapper[4747]: I1205 22:20:35.860578 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52e38ff9-ee98-40ad-9353-ea7d416b9389" path="/var/lib/kubelet/pods/52e38ff9-ee98-40ad-9353-ea7d416b9389/volumes"
Dec 05 22:20:36 crc kubenswrapper[4747]: I1205 22:20:36.287055 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 05 22:20:36 crc kubenswrapper[4747]: I1205 22:20:36.433101 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c37dfb7-6f76-43d5-b699-320372d1c35b","Type":"ContainerStarted","Data":"a8e50c1820f3b8951531ef214e397397a8108506a821ba355a12720a228dd109"}
Dec 05 22:20:37 crc kubenswrapper[4747]: I1205 22:20:37.450942 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c37dfb7-6f76-43d5-b699-320372d1c35b","Type":"ContainerStarted","Data":"e8b9709959f131e7489b8cd63e9a52cd8973922d616ca8018fa61fe10fefa266"}
Dec 05 22:20:37 crc kubenswrapper[4747]: I1205 22:20:37.451345 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c37dfb7-6f76-43d5-b699-320372d1c35b","Type":"ContainerStarted","Data":"3213ee748a4ef5c991530470015f6722cab8f4de8d1ad6487e1358fe41b171f0"}
Dec 05 22:20:37 crc kubenswrapper[4747]: I1205 22:20:37.489669 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.489640919 podStartE2EDuration="2.489640919s" podCreationTimestamp="2025-12-05 22:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:20:37.480259467 +0000 UTC m=+5907.947566975" watchObservedRunningTime="2025-12-05 22:20:37.489640919 +0000 UTC m=+5907.956948447"
Dec 05 22:20:37 crc kubenswrapper[4747]: I1205 22:20:37.842302 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001"
Dec 05 22:20:37 crc kubenswrapper[4747]: E1205 22:20:37.842549 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 22:20:38 crc kubenswrapper[4747]: I1205 22:20:38.843812 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"
Dec 05 22:20:38 crc kubenswrapper[4747]: I1205 22:20:38.911948 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869b45bd99-2zfkl"]
Dec 05 22:20:38 crc kubenswrapper[4747]: I1205 22:20:38.912270 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" podUID="f7b89dd1-f703-45f9-8a1a-47263f26a301" containerName="dnsmasq-dns" containerID="cri-o://605c7ea5265f53e9117b4d118bd0ff140e9967b65ef6a278a5f1d3195fb38c23" gracePeriod=10
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.355828 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869b45bd99-2zfkl"
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.472084 4747 generic.go:334] "Generic (PLEG): container finished" podID="f7b89dd1-f703-45f9-8a1a-47263f26a301" containerID="605c7ea5265f53e9117b4d118bd0ff140e9967b65ef6a278a5f1d3195fb38c23" exitCode=0
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.472123 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" event={"ID":"f7b89dd1-f703-45f9-8a1a-47263f26a301","Type":"ContainerDied","Data":"605c7ea5265f53e9117b4d118bd0ff140e9967b65ef6a278a5f1d3195fb38c23"}
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.472147 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869b45bd99-2zfkl" event={"ID":"f7b89dd1-f703-45f9-8a1a-47263f26a301","Type":"ContainerDied","Data":"de538b2e091e47149ccb0fff740cec6e0fb2307a8920ff4f58096f86bdaf167a"}
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.472146 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869b45bd99-2zfkl"
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.472175 4747 scope.go:117] "RemoveContainer" containerID="605c7ea5265f53e9117b4d118bd0ff140e9967b65ef6a278a5f1d3195fb38c23"
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.498067 4747 scope.go:117] "RemoveContainer" containerID="adb75134d7ee93ec78dc90415839a2b47272457c62fe2ce2a7b17601e84a5769"
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.507226 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-dns-svc\") pod \"f7b89dd1-f703-45f9-8a1a-47263f26a301\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") "
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.507304 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4tzr\" (UniqueName: \"kubernetes.io/projected/f7b89dd1-f703-45f9-8a1a-47263f26a301-kube-api-access-d4tzr\") pod \"f7b89dd1-f703-45f9-8a1a-47263f26a301\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") "
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.507455 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-ovsdbserver-sb\") pod \"f7b89dd1-f703-45f9-8a1a-47263f26a301\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") "
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.507552 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-ovsdbserver-nb\") pod \"f7b89dd1-f703-45f9-8a1a-47263f26a301\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") "
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.507576 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-config\") pod \"f7b89dd1-f703-45f9-8a1a-47263f26a301\" (UID: \"f7b89dd1-f703-45f9-8a1a-47263f26a301\") "
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.512959 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7b89dd1-f703-45f9-8a1a-47263f26a301-kube-api-access-d4tzr" (OuterVolumeSpecName: "kube-api-access-d4tzr") pod "f7b89dd1-f703-45f9-8a1a-47263f26a301" (UID: "f7b89dd1-f703-45f9-8a1a-47263f26a301"). InnerVolumeSpecName "kube-api-access-d4tzr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.525683 4747 scope.go:117] "RemoveContainer" containerID="605c7ea5265f53e9117b4d118bd0ff140e9967b65ef6a278a5f1d3195fb38c23"
Dec 05 22:20:39 crc kubenswrapper[4747]: E1205 22:20:39.531315 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"605c7ea5265f53e9117b4d118bd0ff140e9967b65ef6a278a5f1d3195fb38c23\": container with ID starting with 605c7ea5265f53e9117b4d118bd0ff140e9967b65ef6a278a5f1d3195fb38c23 not found: ID does not exist" containerID="605c7ea5265f53e9117b4d118bd0ff140e9967b65ef6a278a5f1d3195fb38c23"
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.531477 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605c7ea5265f53e9117b4d118bd0ff140e9967b65ef6a278a5f1d3195fb38c23"} err="failed to get container status \"605c7ea5265f53e9117b4d118bd0ff140e9967b65ef6a278a5f1d3195fb38c23\": rpc error: code = NotFound desc = could not find container \"605c7ea5265f53e9117b4d118bd0ff140e9967b65ef6a278a5f1d3195fb38c23\": container with ID starting with 605c7ea5265f53e9117b4d118bd0ff140e9967b65ef6a278a5f1d3195fb38c23 not found: ID does not exist"
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.531624 4747 scope.go:117] "RemoveContainer" containerID="adb75134d7ee93ec78dc90415839a2b47272457c62fe2ce2a7b17601e84a5769"
Dec 05 22:20:39 crc kubenswrapper[4747]: E1205 22:20:39.532144 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adb75134d7ee93ec78dc90415839a2b47272457c62fe2ce2a7b17601e84a5769\": container with ID starting with adb75134d7ee93ec78dc90415839a2b47272457c62fe2ce2a7b17601e84a5769 not found: ID does not exist" containerID="adb75134d7ee93ec78dc90415839a2b47272457c62fe2ce2a7b17601e84a5769"
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.532236 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adb75134d7ee93ec78dc90415839a2b47272457c62fe2ce2a7b17601e84a5769"} err="failed to get container status \"adb75134d7ee93ec78dc90415839a2b47272457c62fe2ce2a7b17601e84a5769\": rpc error: code = NotFound desc = could not find container \"adb75134d7ee93ec78dc90415839a2b47272457c62fe2ce2a7b17601e84a5769\": container with ID starting with adb75134d7ee93ec78dc90415839a2b47272457c62fe2ce2a7b17601e84a5769 not found: ID does not exist"
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.554625 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f7b89dd1-f703-45f9-8a1a-47263f26a301" (UID: "f7b89dd1-f703-45f9-8a1a-47263f26a301"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.554780 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f7b89dd1-f703-45f9-8a1a-47263f26a301" (UID: "f7b89dd1-f703-45f9-8a1a-47263f26a301"). InnerVolumeSpecName "ovsdbserver-nb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.560019 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f7b89dd1-f703-45f9-8a1a-47263f26a301" (UID: "f7b89dd1-f703-45f9-8a1a-47263f26a301"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.562042 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-config" (OuterVolumeSpecName: "config") pod "f7b89dd1-f703-45f9-8a1a-47263f26a301" (UID: "f7b89dd1-f703-45f9-8a1a-47263f26a301"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.610052 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.610090 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.610103 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.610115 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7b89dd1-f703-45f9-8a1a-47263f26a301-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.610128 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4tzr\" (UniqueName: \"kubernetes.io/projected/f7b89dd1-f703-45f9-8a1a-47263f26a301-kube-api-access-d4tzr\") on node \"crc\" DevicePath \"\"" Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.806149 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869b45bd99-2zfkl"] Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.818263 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869b45bd99-2zfkl"] Dec 05 22:20:39 crc kubenswrapper[4747]: I1205 22:20:39.853896 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7b89dd1-f703-45f9-8a1a-47263f26a301" path="/var/lib/kubelet/pods/f7b89dd1-f703-45f9-8a1a-47263f26a301/volumes" Dec 05 22:20:45 crc kubenswrapper[4747]: I1205 22:20:45.821981 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 22:20:45 crc kubenswrapper[4747]: I1205 22:20:45.824005 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 22:20:46 crc kubenswrapper[4747]: I1205 22:20:46.843755 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8c37dfb7-6f76-43d5-b699-320372d1c35b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.93:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 22:20:46 crc kubenswrapper[4747]: I1205 22:20:46.843808 
4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8c37dfb7-6f76-43d5-b699-320372d1c35b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.93:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 22:20:51 crc kubenswrapper[4747]: I1205 22:20:51.840002 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:20:51 crc kubenswrapper[4747]: E1205 22:20:51.840639 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:20:55 crc kubenswrapper[4747]: I1205 22:20:55.831425 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 22:20:55 crc kubenswrapper[4747]: I1205 22:20:55.834238 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 22:20:55 crc kubenswrapper[4747]: I1205 22:20:55.835356 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 22:20:55 crc kubenswrapper[4747]: I1205 22:20:55.835765 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 22:20:55 crc kubenswrapper[4747]: I1205 22:20:55.857913 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 22:20:55 crc kubenswrapper[4747]: I1205 22:20:55.857983 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 22:21:03 crc kubenswrapper[4747]: I1205 22:21:03.840377 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:21:03 crc kubenswrapper[4747]: E1205 22:21:03.841320 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:21:14 crc kubenswrapper[4747]: I1205 22:21:14.839797 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:21:14 crc kubenswrapper[4747]: E1205 22:21:14.840599 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.331225 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wl72w"] Dec 05 22:21:17 crc kubenswrapper[4747]: E1205 22:21:17.332168 4747 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f7b89dd1-f703-45f9-8a1a-47263f26a301" containerName="dnsmasq-dns" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.332191 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b89dd1-f703-45f9-8a1a-47263f26a301" containerName="dnsmasq-dns" Dec 05 22:21:17 crc kubenswrapper[4747]: E1205 22:21:17.332215 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b89dd1-f703-45f9-8a1a-47263f26a301" containerName="init" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.332227 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b89dd1-f703-45f9-8a1a-47263f26a301" containerName="init" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.332503 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7b89dd1-f703-45f9-8a1a-47263f26a301" containerName="dnsmasq-dns" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.333546 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.336009 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.336106 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.337501 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-fs2lb" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.350193 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wl72w"] Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.363162 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-pbwjn"] Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.365816 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.416844 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-pbwjn"] Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.529014 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-var-run\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.529063 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-var-run-ovn\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.529090 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-var-run\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.529560 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-etc-ovs\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.529716 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txbdw\" (UniqueName: \"kubernetes.io/projected/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-kube-api-access-txbdw\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.529867 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-var-log\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.529932 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-ovn-controller-tls-certs\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.530003 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc64c\" (UniqueName: \"kubernetes.io/projected/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-kube-api-access-bc64c\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.530153 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" 
(UniqueName: \"kubernetes.io/host-path/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-var-lib\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.530208 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-scripts\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.530247 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-combined-ca-bundle\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.530406 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-var-log-ovn\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.530491 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-scripts\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.631773 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-var-log\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.631825 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-ovn-controller-tls-certs\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.631861 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc64c\" (UniqueName: \"kubernetes.io/projected/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-kube-api-access-bc64c\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.631891 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-var-lib\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.631917 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-scripts\") pod \"ovn-controller-wl72w\" (UID: 
\"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.631940 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-combined-ca-bundle\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.631984 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-var-log-ovn\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.632021 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-scripts\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.632065 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-var-run\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.632085 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-var-run-ovn\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.632107 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-var-run\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.632165 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-etc-ovs\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.632200 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txbdw\" (UniqueName: \"kubernetes.io/projected/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-kube-api-access-txbdw\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.632484 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-var-lib\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.632781 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" 
(UniqueName: \"kubernetes.io/host-path/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-var-log\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.632954 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-var-run\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.634665 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-var-run\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.634741 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-var-run-ovn\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.634798 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-etc-ovs\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.632956 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-var-log-ovn\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.635179 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-scripts\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.636614 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-scripts\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.642203 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-ovn-controller-tls-certs\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.642254 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-combined-ca-bundle\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.666061 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc64c\" (UniqueName: \"kubernetes.io/projected/be15b94a-18ea-49ed-a209-4e1d6dbd6c62-kube-api-access-bc64c\") pod \"ovn-controller-ovs-pbwjn\" (UID: \"be15b94a-18ea-49ed-a209-4e1d6dbd6c62\") " pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.671533 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txbdw\" (UniqueName: \"kubernetes.io/projected/3c5bd501-1b1a-48ba-b163-a96bd80e07fa-kube-api-access-txbdw\") pod \"ovn-controller-wl72w\" (UID: \"3c5bd501-1b1a-48ba-b163-a96bd80e07fa\") " pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.698183 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wl72w" Dec 05 22:21:17 crc kubenswrapper[4747]: I1205 22:21:17.712807 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.209000 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wl72w"] Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.629493 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-pbwjn"] Dec 05 22:21:18 crc kubenswrapper[4747]: W1205 22:21:18.634198 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe15b94a_18ea_49ed_a209_4e1d6dbd6c62.slice/crio-4267bc7e99331d62581e5b18e545a78fbce6fb1254c41ee655c3559191439c40 WatchSource:0}: Error finding container 4267bc7e99331d62581e5b18e545a78fbce6fb1254c41ee655c3559191439c40: Status 404 returned error can't find the container with id 4267bc7e99331d62581e5b18e545a78fbce6fb1254c41ee655c3559191439c40 Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.889504 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8wcg5"] Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.890941 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.897255 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.906864 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8wcg5"] Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.928631 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pbwjn" event={"ID":"be15b94a-18ea-49ed-a209-4e1d6dbd6c62","Type":"ContainerStarted","Data":"4267bc7e99331d62581e5b18e545a78fbce6fb1254c41ee655c3559191439c40"} Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.934478 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wl72w" event={"ID":"3c5bd501-1b1a-48ba-b163-a96bd80e07fa","Type":"ContainerStarted","Data":"a675ad49a1c383a87e925decd1f175a189090635c755a8cabb3881c55a09bd79"} Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.934543 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wl72w" event={"ID":"3c5bd501-1b1a-48ba-b163-a96bd80e07fa","Type":"ContainerStarted","Data":"d9d6d6868d55db41c33cbe463e6008b640b365511c22956d907d3ce2aa91895d"} Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.934683 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-wl72w" Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.958102 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9a42efdc-ea18-4d70-b8f5-19336636b4f0-ovs-rundir\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.958419 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4mfd\" (UniqueName: \"kubernetes.io/projected/9a42efdc-ea18-4d70-b8f5-19336636b4f0-kube-api-access-j4mfd\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.958452 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a42efdc-ea18-4d70-b8f5-19336636b4f0-config\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.958574 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a42efdc-ea18-4d70-b8f5-19336636b4f0-combined-ca-bundle\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.958748 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9a42efdc-ea18-4d70-b8f5-19336636b4f0-ovn-rundir\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:18 crc 
kubenswrapper[4747]: I1205 22:21:18.958810 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a42efdc-ea18-4d70-b8f5-19336636b4f0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:18 crc kubenswrapper[4747]: I1205 22:21:18.968696 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wl72w" podStartSLOduration=1.968675689 podStartE2EDuration="1.968675689s" podCreationTimestamp="2025-12-05 22:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:21:18.952382765 +0000 UTC m=+5949.419690263" watchObservedRunningTime="2025-12-05 22:21:18.968675689 +0000 UTC m=+5949.435983167" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.052213 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-afce-account-create-update-psv26"] Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.060449 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9a42efdc-ea18-4d70-b8f5-19336636b4f0-ovn-rundir\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.060530 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a42efdc-ea18-4d70-b8f5-19336636b4f0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.060624 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9a42efdc-ea18-4d70-b8f5-19336636b4f0-ovs-rundir\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.060672 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4mfd\" (UniqueName: \"kubernetes.io/projected/9a42efdc-ea18-4d70-b8f5-19336636b4f0-kube-api-access-j4mfd\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.060693 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a42efdc-ea18-4d70-b8f5-19336636b4f0-config\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.060761 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a42efdc-ea18-4d70-b8f5-19336636b4f0-combined-ca-bundle\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.061554 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9a42efdc-ea18-4d70-b8f5-19336636b4f0-ovn-rundir\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.061610 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9a42efdc-ea18-4d70-b8f5-19336636b4f0-ovs-rundir\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.061623 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-gf5qt"] Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.062267 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a42efdc-ea18-4d70-b8f5-19336636b4f0-config\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.069205 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-afce-account-create-update-psv26"] Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.076466 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-gf5qt"] Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.077364 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a42efdc-ea18-4d70-b8f5-19336636b4f0-combined-ca-bundle\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.078239 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a42efdc-ea18-4d70-b8f5-19336636b4f0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.079942 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4mfd\" (UniqueName: \"kubernetes.io/projected/9a42efdc-ea18-4d70-b8f5-19336636b4f0-kube-api-access-j4mfd\") pod \"ovn-controller-metrics-8wcg5\" (UID: \"9a42efdc-ea18-4d70-b8f5-19336636b4f0\") " pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.231977 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8wcg5" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.386549 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-4gwlj"] Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.388765 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-4gwlj" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.431218 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-4gwlj"] Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.569429 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppgfz\" (UniqueName: \"kubernetes.io/projected/8f287a82-c82e-4c5f-a265-951d168c897e-kube-api-access-ppgfz\") pod \"octavia-db-create-4gwlj\" (UID: \"8f287a82-c82e-4c5f-a265-951d168c897e\") " pod="openstack/octavia-db-create-4gwlj" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.569740 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f287a82-c82e-4c5f-a265-951d168c897e-operator-scripts\") pod \"octavia-db-create-4gwlj\" (UID: \"8f287a82-c82e-4c5f-a265-951d168c897e\") " pod="openstack/octavia-db-create-4gwlj" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.674397 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppgfz\" (UniqueName: \"kubernetes.io/projected/8f287a82-c82e-4c5f-a265-951d168c897e-kube-api-access-ppgfz\") pod \"octavia-db-create-4gwlj\" (UID: \"8f287a82-c82e-4c5f-a265-951d168c897e\") " pod="openstack/octavia-db-create-4gwlj" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.674545 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f287a82-c82e-4c5f-a265-951d168c897e-operator-scripts\") pod \"octavia-db-create-4gwlj\" (UID: \"8f287a82-c82e-4c5f-a265-951d168c897e\") " pod="openstack/octavia-db-create-4gwlj" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.675324 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f287a82-c82e-4c5f-a265-951d168c897e-operator-scripts\") pod \"octavia-db-create-4gwlj\" (UID: \"8f287a82-c82e-4c5f-a265-951d168c897e\") " pod="openstack/octavia-db-create-4gwlj" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.689409 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8wcg5"] Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.706840 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppgfz\" (UniqueName: \"kubernetes.io/projected/8f287a82-c82e-4c5f-a265-951d168c897e-kube-api-access-ppgfz\") pod \"octavia-db-create-4gwlj\" (UID: \"8f287a82-c82e-4c5f-a265-951d168c897e\") " pod="openstack/octavia-db-create-4gwlj" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.741755 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-4gwlj" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.897193 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4048a8d-ce85-43ff-89f9-684d376444ad" path="/var/lib/kubelet/pods/b4048a8d-ce85-43ff-89f9-684d376444ad/volumes" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.898424 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb674708-f83c-4126-93bd-f88a3a322002" path="/var/lib/kubelet/pods/bb674708-f83c-4126-93bd-f88a3a322002/volumes" Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.943392 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8wcg5" event={"ID":"9a42efdc-ea18-4d70-b8f5-19336636b4f0","Type":"ContainerStarted","Data":"5b5aa7c1aa1ae6c9cc9c1371901d616eec38b346db3c9a3b09e21cc58d1b6d6a"} Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.944681 4747 generic.go:334] "Generic (PLEG): container finished" podID="be15b94a-18ea-49ed-a209-4e1d6dbd6c62" containerID="3a2a433e4bb100d73e187496998db1d41bbc3127e201b351352b2d63a1c47688" exitCode=0 Dec 05 22:21:19 crc kubenswrapper[4747]: I1205 22:21:19.944776 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pbwjn" event={"ID":"be15b94a-18ea-49ed-a209-4e1d6dbd6c62","Type":"ContainerDied","Data":"3a2a433e4bb100d73e187496998db1d41bbc3127e201b351352b2d63a1c47688"} Dec 05 22:21:20 crc kubenswrapper[4747]: W1205 22:21:20.269478 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f287a82_c82e_4c5f_a265_951d168c897e.slice/crio-d16ef1edf8069e4e4a5f85939d1de8203809d84ee9c916be58ca9a3aeeea97bb WatchSource:0}: Error finding container d16ef1edf8069e4e4a5f85939d1de8203809d84ee9c916be58ca9a3aeeea97bb: Status 404 returned error can't find the container with id d16ef1edf8069e4e4a5f85939d1de8203809d84ee9c916be58ca9a3aeeea97bb Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.271106 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-4gwlj"] Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.385095 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-0412-account-create-update-db9nh"] Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.390129 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-0412-account-create-update-db9nh" Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.392978 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.397804 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-0412-account-create-update-db9nh"] Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.494130 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c60cb8d4-27a8-4c76-a378-5df868826990-operator-scripts\") pod \"octavia-0412-account-create-update-db9nh\" (UID: \"c60cb8d4-27a8-4c76-a378-5df868826990\") " pod="openstack/octavia-0412-account-create-update-db9nh" Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.494285 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k9h9\" (UniqueName: \"kubernetes.io/projected/c60cb8d4-27a8-4c76-a378-5df868826990-kube-api-access-5k9h9\") pod \"octavia-0412-account-create-update-db9nh\" (UID: \"c60cb8d4-27a8-4c76-a378-5df868826990\") " pod="openstack/octavia-0412-account-create-update-db9nh" Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.596094 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c60cb8d4-27a8-4c76-a378-5df868826990-operator-scripts\") pod \"octavia-0412-account-create-update-db9nh\" (UID: \"c60cb8d4-27a8-4c76-a378-5df868826990\") " pod="openstack/octavia-0412-account-create-update-db9nh" Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.596386 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k9h9\" (UniqueName: \"kubernetes.io/projected/c60cb8d4-27a8-4c76-a378-5df868826990-kube-api-access-5k9h9\") pod \"octavia-0412-account-create-update-db9nh\" (UID: \"c60cb8d4-27a8-4c76-a378-5df868826990\") " pod="openstack/octavia-0412-account-create-update-db9nh" Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.596835 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c60cb8d4-27a8-4c76-a378-5df868826990-operator-scripts\") pod \"octavia-0412-account-create-update-db9nh\" (UID: \"c60cb8d4-27a8-4c76-a378-5df868826990\") " pod="openstack/octavia-0412-account-create-update-db9nh" Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.614840 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k9h9\" (UniqueName: \"kubernetes.io/projected/c60cb8d4-27a8-4c76-a378-5df868826990-kube-api-access-5k9h9\") pod \"octavia-0412-account-create-update-db9nh\" (UID: \"c60cb8d4-27a8-4c76-a378-5df868826990\") " pod="openstack/octavia-0412-account-create-update-db9nh" Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.719294 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-0412-account-create-update-db9nh" Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.960871 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8wcg5" event={"ID":"9a42efdc-ea18-4d70-b8f5-19336636b4f0","Type":"ContainerStarted","Data":"f3d1204527505eba8ccc4319745a5bd2cf6a28c1c7cc74116c77219f14a8994f"} Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.965198 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pbwjn" event={"ID":"be15b94a-18ea-49ed-a209-4e1d6dbd6c62","Type":"ContainerStarted","Data":"ac364d68c63b9daa2891955cab2938c50bd451a5ad2d12c1e5ac0063882f9cc5"} Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.966370 4747 generic.go:334] "Generic (PLEG): container finished" podID="8f287a82-c82e-4c5f-a265-951d168c897e" containerID="c9da9bcf9364f31691ed59c8f688fa8ef2168f318a25c952937fed5d48365a5f" exitCode=0 Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.971034 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.971125 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.971147 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pbwjn" event={"ID":"be15b94a-18ea-49ed-a209-4e1d6dbd6c62","Type":"ContainerStarted","Data":"f67384070c88c7cef9969e84bca0da73af693d8874eb4f9c8ad043c6c0aebcfe"} Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.971217 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-4gwlj" event={"ID":"8f287a82-c82e-4c5f-a265-951d168c897e","Type":"ContainerDied","Data":"c9da9bcf9364f31691ed59c8f688fa8ef2168f318a25c952937fed5d48365a5f"} Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.971236 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-4gwlj" event={"ID":"8f287a82-c82e-4c5f-a265-951d168c897e","Type":"ContainerStarted","Data":"d16ef1edf8069e4e4a5f85939d1de8203809d84ee9c916be58ca9a3aeeea97bb"} Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.989966 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8wcg5" podStartSLOduration=2.989946904 podStartE2EDuration="2.989946904s" podCreationTimestamp="2025-12-05 22:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:21:20.977641229 +0000 UTC m=+5951.444948717" watchObservedRunningTime="2025-12-05 22:21:20.989946904 +0000 UTC m=+5951.457254392" Dec 05 22:21:20 crc kubenswrapper[4747]: I1205 22:21:20.997870 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-pbwjn" podStartSLOduration=3.99785434 podStartE2EDuration="3.99785434s" podCreationTimestamp="2025-12-05 22:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:21:20.994445216 +0000 UTC m=+5951.461752704" watchObservedRunningTime="2025-12-05 22:21:20.99785434 +0000 UTC m=+5951.465161818" Dec 05 22:21:21 crc kubenswrapper[4747]: I1205 22:21:21.194425 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-0412-account-create-update-db9nh"] Dec 05 
22:21:21 crc kubenswrapper[4747]: I1205 22:21:21.982154 4747 generic.go:334] "Generic (PLEG): container finished" podID="c60cb8d4-27a8-4c76-a378-5df868826990" containerID="1fb5ccd6e72466bb36e14143f02a488b1e2483bb05445806abd50baa18259e70" exitCode=0 Dec 05 22:21:21 crc kubenswrapper[4747]: I1205 22:21:21.982352 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-0412-account-create-update-db9nh" event={"ID":"c60cb8d4-27a8-4c76-a378-5df868826990","Type":"ContainerDied","Data":"1fb5ccd6e72466bb36e14143f02a488b1e2483bb05445806abd50baa18259e70"} Dec 05 22:21:21 crc kubenswrapper[4747]: I1205 22:21:21.982735 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-0412-account-create-update-db9nh" event={"ID":"c60cb8d4-27a8-4c76-a378-5df868826990","Type":"ContainerStarted","Data":"27499c4e570fba46d74c7216b2a36ce15185ffda67126617aa468214e6608ab7"} Dec 05 22:21:22 crc kubenswrapper[4747]: I1205 22:21:22.385176 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-4gwlj" Dec 05 22:21:22 crc kubenswrapper[4747]: I1205 22:21:22.536034 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppgfz\" (UniqueName: \"kubernetes.io/projected/8f287a82-c82e-4c5f-a265-951d168c897e-kube-api-access-ppgfz\") pod \"8f287a82-c82e-4c5f-a265-951d168c897e\" (UID: \"8f287a82-c82e-4c5f-a265-951d168c897e\") " Dec 05 22:21:22 crc kubenswrapper[4747]: I1205 22:21:22.536119 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f287a82-c82e-4c5f-a265-951d168c897e-operator-scripts\") pod \"8f287a82-c82e-4c5f-a265-951d168c897e\" (UID: \"8f287a82-c82e-4c5f-a265-951d168c897e\") " Dec 05 22:21:22 crc kubenswrapper[4747]: I1205 22:21:22.537305 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f287a82-c82e-4c5f-a265-951d168c897e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f287a82-c82e-4c5f-a265-951d168c897e" (UID: "8f287a82-c82e-4c5f-a265-951d168c897e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:21:22 crc kubenswrapper[4747]: I1205 22:21:22.541255 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f287a82-c82e-4c5f-a265-951d168c897e-kube-api-access-ppgfz" (OuterVolumeSpecName: "kube-api-access-ppgfz") pod "8f287a82-c82e-4c5f-a265-951d168c897e" (UID: "8f287a82-c82e-4c5f-a265-951d168c897e"). InnerVolumeSpecName "kube-api-access-ppgfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:21:22 crc kubenswrapper[4747]: I1205 22:21:22.638532 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppgfz\" (UniqueName: \"kubernetes.io/projected/8f287a82-c82e-4c5f-a265-951d168c897e-kube-api-access-ppgfz\") on node \"crc\" DevicePath \"\"" Dec 05 22:21:22 crc kubenswrapper[4747]: I1205 22:21:22.638920 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f287a82-c82e-4c5f-a265-951d168c897e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:21:22 crc kubenswrapper[4747]: I1205 22:21:22.993975 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-4gwlj" event={"ID":"8f287a82-c82e-4c5f-a265-951d168c897e","Type":"ContainerDied","Data":"d16ef1edf8069e4e4a5f85939d1de8203809d84ee9c916be58ca9a3aeeea97bb"} Dec 05 22:21:22 crc kubenswrapper[4747]: I1205 22:21:22.994038 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d16ef1edf8069e4e4a5f85939d1de8203809d84ee9c916be58ca9a3aeeea97bb" Dec 05 22:21:22 crc kubenswrapper[4747]: I1205 22:21:22.994573 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-4gwlj" Dec 05 22:21:23 crc kubenswrapper[4747]: I1205 22:21:23.342127 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-0412-account-create-update-db9nh" Dec 05 22:21:23 crc kubenswrapper[4747]: I1205 22:21:23.457030 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c60cb8d4-27a8-4c76-a378-5df868826990-operator-scripts\") pod \"c60cb8d4-27a8-4c76-a378-5df868826990\" (UID: \"c60cb8d4-27a8-4c76-a378-5df868826990\") " Dec 05 22:21:23 crc kubenswrapper[4747]: I1205 22:21:23.457279 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k9h9\" (UniqueName: \"kubernetes.io/projected/c60cb8d4-27a8-4c76-a378-5df868826990-kube-api-access-5k9h9\") pod \"c60cb8d4-27a8-4c76-a378-5df868826990\" (UID: \"c60cb8d4-27a8-4c76-a378-5df868826990\") " Dec 05 22:21:23 crc kubenswrapper[4747]: I1205 22:21:23.457514 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c60cb8d4-27a8-4c76-a378-5df868826990-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c60cb8d4-27a8-4c76-a378-5df868826990" (UID: "c60cb8d4-27a8-4c76-a378-5df868826990"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:21:23 crc kubenswrapper[4747]: I1205 22:21:23.458194 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c60cb8d4-27a8-4c76-a378-5df868826990-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:21:23 crc kubenswrapper[4747]: I1205 22:21:23.465157 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c60cb8d4-27a8-4c76-a378-5df868826990-kube-api-access-5k9h9" (OuterVolumeSpecName: "kube-api-access-5k9h9") pod "c60cb8d4-27a8-4c76-a378-5df868826990" (UID: "c60cb8d4-27a8-4c76-a378-5df868826990"). InnerVolumeSpecName "kube-api-access-5k9h9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:21:23 crc kubenswrapper[4747]: I1205 22:21:23.560127 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k9h9\" (UniqueName: \"kubernetes.io/projected/c60cb8d4-27a8-4c76-a378-5df868826990-kube-api-access-5k9h9\") on node \"crc\" DevicePath \"\"" Dec 05 22:21:24 crc kubenswrapper[4747]: I1205 22:21:24.007704 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-0412-account-create-update-db9nh" event={"ID":"c60cb8d4-27a8-4c76-a378-5df868826990","Type":"ContainerDied","Data":"27499c4e570fba46d74c7216b2a36ce15185ffda67126617aa468214e6608ab7"} Dec 05 22:21:24 crc kubenswrapper[4747]: I1205 22:21:24.007764 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27499c4e570fba46d74c7216b2a36ce15185ffda67126617aa468214e6608ab7" Dec 05 22:21:24 crc kubenswrapper[4747]: I1205 22:21:24.007803 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-0412-account-create-update-db9nh" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.027786 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-cvk9x"] Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.038211 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-cvk9x"] Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.305342 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-wtgcp"] Dec 05 22:21:26 crc kubenswrapper[4747]: E1205 22:21:26.305828 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f287a82-c82e-4c5f-a265-951d168c897e" containerName="mariadb-database-create" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.305849 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f287a82-c82e-4c5f-a265-951d168c897e" containerName="mariadb-database-create" Dec 05 22:21:26 crc kubenswrapper[4747]: E1205 22:21:26.305895 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c60cb8d4-27a8-4c76-a378-5df868826990" containerName="mariadb-account-create-update" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.305903 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c60cb8d4-27a8-4c76-a378-5df868826990" containerName="mariadb-account-create-update" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.306152 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f287a82-c82e-4c5f-a265-951d168c897e" containerName="mariadb-database-create" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.306183 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c60cb8d4-27a8-4c76-a378-5df868826990" containerName="mariadb-account-create-update" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.306973 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-wtgcp" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.315523 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-wtgcp"] Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.501976 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlmt6\" (UniqueName: \"kubernetes.io/projected/c0380903-b6cc-4d0a-8119-ad6579a98860-kube-api-access-mlmt6\") pod \"octavia-persistence-db-create-wtgcp\" (UID: \"c0380903-b6cc-4d0a-8119-ad6579a98860\") " pod="openstack/octavia-persistence-db-create-wtgcp" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.502212 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0380903-b6cc-4d0a-8119-ad6579a98860-operator-scripts\") pod \"octavia-persistence-db-create-wtgcp\" (UID: \"c0380903-b6cc-4d0a-8119-ad6579a98860\") " pod="openstack/octavia-persistence-db-create-wtgcp" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.604228 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlmt6\" (UniqueName: \"kubernetes.io/projected/c0380903-b6cc-4d0a-8119-ad6579a98860-kube-api-access-mlmt6\") pod \"octavia-persistence-db-create-wtgcp\" (UID: \"c0380903-b6cc-4d0a-8119-ad6579a98860\") " pod="openstack/octavia-persistence-db-create-wtgcp" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.604745 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0380903-b6cc-4d0a-8119-ad6579a98860-operator-scripts\") pod \"octavia-persistence-db-create-wtgcp\" (UID: \"c0380903-b6cc-4d0a-8119-ad6579a98860\") " pod="openstack/octavia-persistence-db-create-wtgcp" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.605705 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0380903-b6cc-4d0a-8119-ad6579a98860-operator-scripts\") pod \"octavia-persistence-db-create-wtgcp\" (UID: \"c0380903-b6cc-4d0a-8119-ad6579a98860\") " pod="openstack/octavia-persistence-db-create-wtgcp" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.624243 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlmt6\" (UniqueName: \"kubernetes.io/projected/c0380903-b6cc-4d0a-8119-ad6579a98860-kube-api-access-mlmt6\") pod \"octavia-persistence-db-create-wtgcp\" (UID: \"c0380903-b6cc-4d0a-8119-ad6579a98860\") " pod="openstack/octavia-persistence-db-create-wtgcp" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.627281 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-wtgcp" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.839628 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:21:26 crc kubenswrapper[4747]: E1205 22:21:26.840151 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.956023 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-29e1-account-create-update-dvvqp"] Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.960562 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-29e1-account-create-update-dvvqp" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.967700 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Dec 05 22:21:26 crc kubenswrapper[4747]: I1205 22:21:26.993513 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-29e1-account-create-update-dvvqp"] Dec 05 22:21:27 crc kubenswrapper[4747]: I1205 22:21:27.115334 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efeb221c-61d1-4bbf-8ca2-fc3580444bf6-operator-scripts\") pod \"octavia-29e1-account-create-update-dvvqp\" (UID: \"efeb221c-61d1-4bbf-8ca2-fc3580444bf6\") " pod="openstack/octavia-29e1-account-create-update-dvvqp" Dec 05 22:21:27 crc kubenswrapper[4747]: I1205 22:21:27.115429 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnpwm\" (UniqueName: \"kubernetes.io/projected/efeb221c-61d1-4bbf-8ca2-fc3580444bf6-kube-api-access-wnpwm\") pod \"octavia-29e1-account-create-update-dvvqp\" (UID: \"efeb221c-61d1-4bbf-8ca2-fc3580444bf6\") " pod="openstack/octavia-29e1-account-create-update-dvvqp" Dec 05 22:21:27 crc kubenswrapper[4747]: I1205 22:21:27.150769 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-wtgcp"] Dec 05 22:21:27 crc kubenswrapper[4747]: I1205 22:21:27.217603 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efeb221c-61d1-4bbf-8ca2-fc3580444bf6-operator-scripts\") pod \"octavia-29e1-account-create-update-dvvqp\" (UID: \"efeb221c-61d1-4bbf-8ca2-fc3580444bf6\") " pod="openstack/octavia-29e1-account-create-update-dvvqp" Dec 05 22:21:27 crc kubenswrapper[4747]: I1205 22:21:27.217712 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnpwm\" (UniqueName: \"kubernetes.io/projected/efeb221c-61d1-4bbf-8ca2-fc3580444bf6-kube-api-access-wnpwm\") pod \"octavia-29e1-account-create-update-dvvqp\" (UID: \"efeb221c-61d1-4bbf-8ca2-fc3580444bf6\") " pod="openstack/octavia-29e1-account-create-update-dvvqp" Dec 05 22:21:27 crc kubenswrapper[4747]: I1205 22:21:27.218510 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/efeb221c-61d1-4bbf-8ca2-fc3580444bf6-operator-scripts\") pod \"octavia-29e1-account-create-update-dvvqp\" (UID: \"efeb221c-61d1-4bbf-8ca2-fc3580444bf6\") " pod="openstack/octavia-29e1-account-create-update-dvvqp" Dec 05 22:21:27 crc kubenswrapper[4747]: I1205 22:21:27.244848 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnpwm\" (UniqueName: \"kubernetes.io/projected/efeb221c-61d1-4bbf-8ca2-fc3580444bf6-kube-api-access-wnpwm\") pod \"octavia-29e1-account-create-update-dvvqp\" (UID: \"efeb221c-61d1-4bbf-8ca2-fc3580444bf6\") " pod="openstack/octavia-29e1-account-create-update-dvvqp" Dec 05 22:21:27 crc kubenswrapper[4747]: I1205 22:21:27.289166 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-29e1-account-create-update-dvvqp" Dec 05 22:21:27 crc kubenswrapper[4747]: I1205 22:21:27.760224 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-29e1-account-create-update-dvvqp"] Dec 05 22:21:27 crc kubenswrapper[4747]: W1205 22:21:27.772934 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefeb221c_61d1_4bbf_8ca2_fc3580444bf6.slice/crio-c7fa4cd00ef12da69fcaa719df3c3656193cba015637eede7469ca7065dcae99 WatchSource:0}: Error finding container c7fa4cd00ef12da69fcaa719df3c3656193cba015637eede7469ca7065dcae99: Status 404 returned error can't find the container with id c7fa4cd00ef12da69fcaa719df3c3656193cba015637eede7469ca7065dcae99 Dec 05 22:21:27 crc kubenswrapper[4747]: I1205 22:21:27.855290 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62add4b5-2b83-4753-a553-fd36ec012e1b" path="/var/lib/kubelet/pods/62add4b5-2b83-4753-a553-fd36ec012e1b/volumes" Dec 05 22:21:28 crc kubenswrapper[4747]: I1205 22:21:28.050139 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-29e1-account-create-update-dvvqp" event={"ID":"efeb221c-61d1-4bbf-8ca2-fc3580444bf6","Type":"ContainerStarted","Data":"6029c0b7e282df250bbe4edadc6756c1a9bcd960b80f34d693eb406ac19c0104"} Dec 05 22:21:28 crc kubenswrapper[4747]: I1205 22:21:28.050199 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-29e1-account-create-update-dvvqp" event={"ID":"efeb221c-61d1-4bbf-8ca2-fc3580444bf6","Type":"ContainerStarted","Data":"c7fa4cd00ef12da69fcaa719df3c3656193cba015637eede7469ca7065dcae99"} Dec 05 22:21:28 crc kubenswrapper[4747]: I1205 22:21:28.052330 4747 generic.go:334] "Generic (PLEG): container finished" podID="c0380903-b6cc-4d0a-8119-ad6579a98860" containerID="b506ff932fd84664bbed4a96b32926250aa79910c577ee60b756651632f1377d" exitCode=0 Dec 05 22:21:28 crc kubenswrapper[4747]: I1205 22:21:28.052369 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-wtgcp" event={"ID":"c0380903-b6cc-4d0a-8119-ad6579a98860","Type":"ContainerDied","Data":"b506ff932fd84664bbed4a96b32926250aa79910c577ee60b756651632f1377d"} Dec 05 22:21:28 crc kubenswrapper[4747]: I1205 22:21:28.052391 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-wtgcp" event={"ID":"c0380903-b6cc-4d0a-8119-ad6579a98860","Type":"ContainerStarted","Data":"3dcd1964d67c3398a90fd2b71c58264faa5cc14ea04ccc0e62beb2ae4b4ece23"} Dec 05 22:21:28 crc kubenswrapper[4747]: I1205 22:21:28.090992 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/octavia-29e1-account-create-update-dvvqp" podStartSLOduration=2.090969897 podStartE2EDuration="2.090969897s" podCreationTimestamp="2025-12-05 22:21:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:21:28.075183066 +0000 UTC m=+5958.542490554" watchObservedRunningTime="2025-12-05 22:21:28.090969897 +0000 UTC m=+5958.558277385" Dec 05 22:21:29 crc kubenswrapper[4747]: I1205 22:21:29.065935 4747 generic.go:334] "Generic (PLEG): container finished" podID="efeb221c-61d1-4bbf-8ca2-fc3580444bf6" containerID="6029c0b7e282df250bbe4edadc6756c1a9bcd960b80f34d693eb406ac19c0104" exitCode=0 Dec 05 22:21:29 crc kubenswrapper[4747]: I1205 22:21:29.066956 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-29e1-account-create-update-dvvqp" event={"ID":"efeb221c-61d1-4bbf-8ca2-fc3580444bf6","Type":"ContainerDied","Data":"6029c0b7e282df250bbe4edadc6756c1a9bcd960b80f34d693eb406ac19c0104"} Dec 05 22:21:29 crc kubenswrapper[4747]: I1205 22:21:29.482570 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-wtgcp" Dec 05 22:21:29 crc kubenswrapper[4747]: I1205 22:21:29.661767 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlmt6\" (UniqueName: \"kubernetes.io/projected/c0380903-b6cc-4d0a-8119-ad6579a98860-kube-api-access-mlmt6\") pod \"c0380903-b6cc-4d0a-8119-ad6579a98860\" (UID: \"c0380903-b6cc-4d0a-8119-ad6579a98860\") " Dec 05 22:21:29 crc kubenswrapper[4747]: I1205 22:21:29.662045 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0380903-b6cc-4d0a-8119-ad6579a98860-operator-scripts\") pod \"c0380903-b6cc-4d0a-8119-ad6579a98860\" (UID: \"c0380903-b6cc-4d0a-8119-ad6579a98860\") " Dec 05 22:21:29 crc kubenswrapper[4747]: I1205 22:21:29.663150 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0380903-b6cc-4d0a-8119-ad6579a98860-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0380903-b6cc-4d0a-8119-ad6579a98860" (UID: "c0380903-b6cc-4d0a-8119-ad6579a98860"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:21:29 crc kubenswrapper[4747]: I1205 22:21:29.667189 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0380903-b6cc-4d0a-8119-ad6579a98860-kube-api-access-mlmt6" (OuterVolumeSpecName: "kube-api-access-mlmt6") pod "c0380903-b6cc-4d0a-8119-ad6579a98860" (UID: "c0380903-b6cc-4d0a-8119-ad6579a98860"). InnerVolumeSpecName "kube-api-access-mlmt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:21:29 crc kubenswrapper[4747]: I1205 22:21:29.765691 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0380903-b6cc-4d0a-8119-ad6579a98860-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:21:29 crc kubenswrapper[4747]: I1205 22:21:29.765741 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlmt6\" (UniqueName: \"kubernetes.io/projected/c0380903-b6cc-4d0a-8119-ad6579a98860-kube-api-access-mlmt6\") on node \"crc\" DevicePath \"\"" Dec 05 22:21:30 crc kubenswrapper[4747]: I1205 22:21:30.107358 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-wtgcp" event={"ID":"c0380903-b6cc-4d0a-8119-ad6579a98860","Type":"ContainerDied","Data":"3dcd1964d67c3398a90fd2b71c58264faa5cc14ea04ccc0e62beb2ae4b4ece23"} Dec 05 22:21:30 crc kubenswrapper[4747]: I1205 22:21:30.107416 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dcd1964d67c3398a90fd2b71c58264faa5cc14ea04ccc0e62beb2ae4b4ece23" Dec 05 22:21:30 crc kubenswrapper[4747]: I1205 22:21:30.107385 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-wtgcp" Dec 05 22:21:30 crc kubenswrapper[4747]: I1205 22:21:30.486062 4747 scope.go:117] "RemoveContainer" containerID="b1ee109fe7e53a7c9f6c64dbce3e2b385eb261b5d4f6fa8cdfc47ad2cd962a9d" Dec 05 22:21:30 crc kubenswrapper[4747]: I1205 22:21:30.522281 4747 scope.go:117] "RemoveContainer" containerID="b43b3b09f682d9f3e640ddac0018992e99a7eb131932256e30dc3fbc82390016" Dec 05 22:21:30 crc kubenswrapper[4747]: I1205 22:21:30.540760 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-29e1-account-create-update-dvvqp" Dec 05 22:21:30 crc kubenswrapper[4747]: I1205 22:21:30.558273 4747 scope.go:117] "RemoveContainer" containerID="9081f3b6fe644f529bb78a3002403a76a3bdf2a37d7e71bf6a6a5ccd05808621" Dec 05 22:21:30 crc kubenswrapper[4747]: I1205 22:21:30.682832 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efeb221c-61d1-4bbf-8ca2-fc3580444bf6-operator-scripts\") pod \"efeb221c-61d1-4bbf-8ca2-fc3580444bf6\" (UID: \"efeb221c-61d1-4bbf-8ca2-fc3580444bf6\") " Dec 05 22:21:30 crc kubenswrapper[4747]: I1205 22:21:30.682964 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnpwm\" (UniqueName: \"kubernetes.io/projected/efeb221c-61d1-4bbf-8ca2-fc3580444bf6-kube-api-access-wnpwm\") pod \"efeb221c-61d1-4bbf-8ca2-fc3580444bf6\" (UID: \"efeb221c-61d1-4bbf-8ca2-fc3580444bf6\") " Dec 05 22:21:30 crc kubenswrapper[4747]: I1205 22:21:30.683567 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efeb221c-61d1-4bbf-8ca2-fc3580444bf6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efeb221c-61d1-4bbf-8ca2-fc3580444bf6" (UID: "efeb221c-61d1-4bbf-8ca2-fc3580444bf6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:21:30 crc kubenswrapper[4747]: I1205 22:21:30.686855 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efeb221c-61d1-4bbf-8ca2-fc3580444bf6-kube-api-access-wnpwm" (OuterVolumeSpecName: "kube-api-access-wnpwm") pod "efeb221c-61d1-4bbf-8ca2-fc3580444bf6" (UID: "efeb221c-61d1-4bbf-8ca2-fc3580444bf6"). InnerVolumeSpecName "kube-api-access-wnpwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:21:30 crc kubenswrapper[4747]: I1205 22:21:30.786232 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efeb221c-61d1-4bbf-8ca2-fc3580444bf6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:21:30 crc kubenswrapper[4747]: I1205 22:21:30.786295 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnpwm\" (UniqueName: \"kubernetes.io/projected/efeb221c-61d1-4bbf-8ca2-fc3580444bf6-kube-api-access-wnpwm\") on node \"crc\" DevicePath \"\"" Dec 05 22:21:31 crc kubenswrapper[4747]: I1205 22:21:31.124748 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-29e1-account-create-update-dvvqp" event={"ID":"efeb221c-61d1-4bbf-8ca2-fc3580444bf6","Type":"ContainerDied","Data":"c7fa4cd00ef12da69fcaa719df3c3656193cba015637eede7469ca7065dcae99"} Dec 05 22:21:31 crc kubenswrapper[4747]: I1205 22:21:31.125779 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7fa4cd00ef12da69fcaa719df3c3656193cba015637eede7469ca7065dcae99" Dec 05 22:21:31 crc kubenswrapper[4747]: I1205 22:21:31.124865 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-29e1-account-create-update-dvvqp" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.478008 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-8db457646-qz7wr"] Dec 05 22:21:32 crc kubenswrapper[4747]: E1205 22:21:32.479786 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efeb221c-61d1-4bbf-8ca2-fc3580444bf6" containerName="mariadb-account-create-update" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.479888 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="efeb221c-61d1-4bbf-8ca2-fc3580444bf6" containerName="mariadb-account-create-update" Dec 05 22:21:32 crc kubenswrapper[4747]: E1205 22:21:32.479973 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0380903-b6cc-4d0a-8119-ad6579a98860" containerName="mariadb-database-create" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.480055 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0380903-b6cc-4d0a-8119-ad6579a98860" containerName="mariadb-database-create" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.480798 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="efeb221c-61d1-4bbf-8ca2-fc3580444bf6" containerName="mariadb-account-create-update" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.480938 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0380903-b6cc-4d0a-8119-ad6579a98860" containerName="mariadb-database-create" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.483063 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.495772 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.498886 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-8db457646-qz7wr"] Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.502926 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-clcbh" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.503097 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.503192 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-ovndbs" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.640931 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-ovndb-tls-certs\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.641059 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/99385733-cf00-4ba0-8a0c-327c53de1723-config-data-merged\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.641131 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/99385733-cf00-4ba0-8a0c-327c53de1723-octavia-run\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.641277 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-scripts\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.641315 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-config-data\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.641506 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-combined-ca-bundle\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.743090 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-scripts\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.743356 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-config-data\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.743417 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-combined-ca-bundle\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.743707 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-ovndb-tls-certs\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.744112 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/99385733-cf00-4ba0-8a0c-327c53de1723-config-data-merged\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.744152 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/99385733-cf00-4ba0-8a0c-327c53de1723-octavia-run\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.745522 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/99385733-cf00-4ba0-8a0c-327c53de1723-octavia-run\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.746932 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/99385733-cf00-4ba0-8a0c-327c53de1723-config-data-merged\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.751404 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-scripts\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.752386 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-ovndb-tls-certs\") pod \"octavia-api-8db457646-qz7wr\" (UID: 
\"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.752813 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-config-data\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.756616 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-combined-ca-bundle\") pod \"octavia-api-8db457646-qz7wr\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:32 crc kubenswrapper[4747]: I1205 22:21:32.806702 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:33 crc kubenswrapper[4747]: I1205 22:21:33.300233 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-8db457646-qz7wr"] Dec 05 22:21:33 crc kubenswrapper[4747]: I1205 22:21:33.313178 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 22:21:34 crc kubenswrapper[4747]: I1205 22:21:34.152981 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-8db457646-qz7wr" event={"ID":"99385733-cf00-4ba0-8a0c-327c53de1723","Type":"ContainerStarted","Data":"e0891bd3103382398bed5eb2ce7c0e4b8087088db5e24c1986b078939cda2df5"} Dec 05 22:21:38 crc kubenswrapper[4747]: I1205 22:21:38.841063 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:21:38 crc kubenswrapper[4747]: E1205 22:21:38.842370 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:21:39 crc kubenswrapper[4747]: I1205 22:21:39.033436 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-czqjs"] Dec 05 22:21:39 crc kubenswrapper[4747]: I1205 22:21:39.043149 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-czqjs"] Dec 05 22:21:39 crc kubenswrapper[4747]: I1205 22:21:39.849951 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5638839-25e0-4009-9cca-59dea6eb0612" path="/var/lib/kubelet/pods/b5638839-25e0-4009-9cca-59dea6eb0612/volumes" Dec 05 22:21:43 crc kubenswrapper[4747]: I1205 22:21:43.252660 4747 generic.go:334] "Generic (PLEG): container finished" podID="99385733-cf00-4ba0-8a0c-327c53de1723" containerID="aa7000b7b5fb725a80131b42e953ec244fa42cb1e69e564022c506bc77d9fe96" exitCode=0 Dec 05 22:21:43 crc kubenswrapper[4747]: I1205 22:21:43.252879 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-8db457646-qz7wr" event={"ID":"99385733-cf00-4ba0-8a0c-327c53de1723","Type":"ContainerDied","Data":"aa7000b7b5fb725a80131b42e953ec244fa42cb1e69e564022c506bc77d9fe96"} Dec 05 22:21:44 crc kubenswrapper[4747]: I1205 22:21:44.266022 4747 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-8db457646-qz7wr" event={"ID":"99385733-cf00-4ba0-8a0c-327c53de1723","Type":"ContainerStarted","Data":"870fca14377606443db74bd3cdfd26f5ed2b438558c766552e52c89fcc405c73"} Dec 05 22:21:44 crc kubenswrapper[4747]: I1205 22:21:44.266770 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-8db457646-qz7wr" event={"ID":"99385733-cf00-4ba0-8a0c-327c53de1723","Type":"ContainerStarted","Data":"e4506d2e850cea67af73887e7610a7be078a6d6693bcddb08e96f8098b27ec7a"} Dec 05 22:21:44 crc kubenswrapper[4747]: I1205 22:21:44.266793 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:44 crc kubenswrapper[4747]: I1205 22:21:44.266808 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:21:44 crc kubenswrapper[4747]: I1205 22:21:44.304898 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-8db457646-qz7wr" podStartSLOduration=3.521866825 podStartE2EDuration="12.304877314s" podCreationTimestamp="2025-12-05 22:21:32 +0000 UTC" firstStartedPulling="2025-12-05 22:21:33.312897228 +0000 UTC m=+5963.780204716" lastFinishedPulling="2025-12-05 22:21:42.095907717 +0000 UTC m=+5972.563215205" observedRunningTime="2025-12-05 22:21:44.29179219 +0000 UTC m=+5974.759099708" watchObservedRunningTime="2025-12-05 22:21:44.304877314 +0000 UTC m=+5974.772184812" Dec 05 22:21:52 crc kubenswrapper[4747]: I1205 22:21:52.755468 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-wl72w" Dec 05 22:21:52 crc kubenswrapper[4747]: I1205 22:21:52.785673 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:52 crc kubenswrapper[4747]: I1205 22:21:52.811411 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-pbwjn" Dec 05 22:21:52 crc kubenswrapper[4747]: I1205 22:21:52.840049 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:21:52 crc kubenswrapper[4747]: E1205 22:21:52.840472 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.003139 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wl72w-config-656px"] Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.004444 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.011930 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wl72w-config-656px"] Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.015379 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.089876 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-run-ovn\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.089947 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-log-ovn\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.089990 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/48ea75ac-1c17-4334-a06e-af29a7b7a173-additional-scripts\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.090010 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjr5v\" (UniqueName: \"kubernetes.io/projected/48ea75ac-1c17-4334-a06e-af29a7b7a173-kube-api-access-kjr5v\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.090034 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-run\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.090096 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ea75ac-1c17-4334-a06e-af29a7b7a173-scripts\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.191689 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-run-ovn\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.191768 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-log-ovn\") 
pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.191803 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjr5v\" (UniqueName: \"kubernetes.io/projected/48ea75ac-1c17-4334-a06e-af29a7b7a173-kube-api-access-kjr5v\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.191821 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/48ea75ac-1c17-4334-a06e-af29a7b7a173-additional-scripts\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.191842 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-run\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.192145 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-run-ovn\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.192278 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ea75ac-1c17-4334-a06e-af29a7b7a173-scripts\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.192269 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-run\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.192522 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-log-ovn\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.193808 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/48ea75ac-1c17-4334-a06e-af29a7b7a173-additional-scripts\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.194372 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ea75ac-1c17-4334-a06e-af29a7b7a173-scripts\") pod \"ovn-controller-wl72w-config-656px\" 
(UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.218450 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjr5v\" (UniqueName: \"kubernetes.io/projected/48ea75ac-1c17-4334-a06e-af29a7b7a173-kube-api-access-kjr5v\") pod \"ovn-controller-wl72w-config-656px\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.339489 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:53 crc kubenswrapper[4747]: I1205 22:21:53.822848 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wl72w-config-656px"] Dec 05 22:21:53 crc kubenswrapper[4747]: W1205 22:21:53.826027 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48ea75ac_1c17_4334_a06e_af29a7b7a173.slice/crio-68c635c935e8c99c55604c90c1ab2d8311c877f4eeb777257f62797468017b3a WatchSource:0}: Error finding container 68c635c935e8c99c55604c90c1ab2d8311c877f4eeb777257f62797468017b3a: Status 404 returned error can't find the container with id 68c635c935e8c99c55604c90c1ab2d8311c877f4eeb777257f62797468017b3a Dec 05 22:21:54 crc kubenswrapper[4747]: I1205 22:21:54.409012 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wl72w-config-656px" event={"ID":"48ea75ac-1c17-4334-a06e-af29a7b7a173","Type":"ContainerStarted","Data":"fef82e6b857efccab3057e2381e13c9cccc7bf0dd1b7846f13cf564007c84274"} Dec 05 22:21:54 crc kubenswrapper[4747]: I1205 22:21:54.409304 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wl72w-config-656px" event={"ID":"48ea75ac-1c17-4334-a06e-af29a7b7a173","Type":"ContainerStarted","Data":"68c635c935e8c99c55604c90c1ab2d8311c877f4eeb777257f62797468017b3a"} Dec 05 22:21:54 crc kubenswrapper[4747]: I1205 22:21:54.437046 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wl72w-config-656px" podStartSLOduration=2.437024542 podStartE2EDuration="2.437024542s" podCreationTimestamp="2025-12-05 22:21:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:21:54.431519925 +0000 UTC m=+5984.898827423" watchObservedRunningTime="2025-12-05 22:21:54.437024542 +0000 UTC m=+5984.904332040" Dec 05 22:21:55 crc kubenswrapper[4747]: I1205 22:21:55.426331 4747 generic.go:334] "Generic (PLEG): container finished" podID="48ea75ac-1c17-4334-a06e-af29a7b7a173" containerID="fef82e6b857efccab3057e2381e13c9cccc7bf0dd1b7846f13cf564007c84274" exitCode=0 Dec 05 22:21:55 crc kubenswrapper[4747]: I1205 22:21:55.426420 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wl72w-config-656px" event={"ID":"48ea75ac-1c17-4334-a06e-af29a7b7a173","Type":"ContainerDied","Data":"fef82e6b857efccab3057e2381e13c9cccc7bf0dd1b7846f13cf564007c84274"} Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.850524 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.971814 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-log-ovn\") pod \"48ea75ac-1c17-4334-a06e-af29a7b7a173\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.971991 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-run-ovn\") pod \"48ea75ac-1c17-4334-a06e-af29a7b7a173\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.971995 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "48ea75ac-1c17-4334-a06e-af29a7b7a173" (UID: "48ea75ac-1c17-4334-a06e-af29a7b7a173"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.972107 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "48ea75ac-1c17-4334-a06e-af29a7b7a173" (UID: "48ea75ac-1c17-4334-a06e-af29a7b7a173"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.972117 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/48ea75ac-1c17-4334-a06e-af29a7b7a173-additional-scripts\") pod \"48ea75ac-1c17-4334-a06e-af29a7b7a173\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.972206 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ea75ac-1c17-4334-a06e-af29a7b7a173-scripts\") pod \"48ea75ac-1c17-4334-a06e-af29a7b7a173\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.972239 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-run\") pod \"48ea75ac-1c17-4334-a06e-af29a7b7a173\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.972266 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjr5v\" (UniqueName: \"kubernetes.io/projected/48ea75ac-1c17-4334-a06e-af29a7b7a173-kube-api-access-kjr5v\") pod \"48ea75ac-1c17-4334-a06e-af29a7b7a173\" (UID: \"48ea75ac-1c17-4334-a06e-af29a7b7a173\") " Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.972335 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-run" (OuterVolumeSpecName: "var-run") pod "48ea75ac-1c17-4334-a06e-af29a7b7a173" (UID: "48ea75ac-1c17-4334-a06e-af29a7b7a173"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.972850 4747 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.972876 4747 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.972888 4747 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/48ea75ac-1c17-4334-a06e-af29a7b7a173-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.973460 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48ea75ac-1c17-4334-a06e-af29a7b7a173-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "48ea75ac-1c17-4334-a06e-af29a7b7a173" (UID: "48ea75ac-1c17-4334-a06e-af29a7b7a173"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.973851 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48ea75ac-1c17-4334-a06e-af29a7b7a173-scripts" (OuterVolumeSpecName: "scripts") pod "48ea75ac-1c17-4334-a06e-af29a7b7a173" (UID: "48ea75ac-1c17-4334-a06e-af29a7b7a173"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:21:56 crc kubenswrapper[4747]: I1205 22:21:56.979456 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48ea75ac-1c17-4334-a06e-af29a7b7a173-kube-api-access-kjr5v" (OuterVolumeSpecName: "kube-api-access-kjr5v") pod "48ea75ac-1c17-4334-a06e-af29a7b7a173" (UID: "48ea75ac-1c17-4334-a06e-af29a7b7a173"). InnerVolumeSpecName "kube-api-access-kjr5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.075056 4747 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/48ea75ac-1c17-4334-a06e-af29a7b7a173-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.075099 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ea75ac-1c17-4334-a06e-af29a7b7a173-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.075114 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjr5v\" (UniqueName: \"kubernetes.io/projected/48ea75ac-1c17-4334-a06e-af29a7b7a173-kube-api-access-kjr5v\") on node \"crc\" DevicePath \"\"" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.450314 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wl72w-config-656px" event={"ID":"48ea75ac-1c17-4334-a06e-af29a7b7a173","Type":"ContainerDied","Data":"68c635c935e8c99c55604c90c1ab2d8311c877f4eeb777257f62797468017b3a"} Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.450357 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68c635c935e8c99c55604c90c1ab2d8311c877f4eeb777257f62797468017b3a" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.450420 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wl72w-config-656px" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.543891 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wl72w-config-656px"] Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.552333 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wl72w-config-656px"] Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.659369 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wl72w-config-jrvrw"] Dec 05 22:21:57 crc kubenswrapper[4747]: E1205 22:21:57.659892 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ea75ac-1c17-4334-a06e-af29a7b7a173" containerName="ovn-config" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.659914 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ea75ac-1c17-4334-a06e-af29a7b7a173" containerName="ovn-config" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.660140 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ea75ac-1c17-4334-a06e-af29a7b7a173" containerName="ovn-config" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.660930 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.663357 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.671677 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wl72w-config-jrvrw"] Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.688267 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-run-ovn\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.688347 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9xkv\" (UniqueName: \"kubernetes.io/projected/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-kube-api-access-t9xkv\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.688416 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-log-ovn\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.688492 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-scripts\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.688620 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-additional-scripts\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.688663 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-run\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.791425 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9xkv\" (UniqueName: \"kubernetes.io/projected/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-kube-api-access-t9xkv\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.791916 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-log-ovn\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.791975 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-scripts\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.792043 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-additional-scripts\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.792075 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-run\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.792210 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-run-ovn\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.792299 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-log-ovn\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.792303 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-run\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.792419 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-run-ovn\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.793152 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-additional-scripts\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.794040 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-scripts\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.808972 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9xkv\" (UniqueName: \"kubernetes.io/projected/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-kube-api-access-t9xkv\") pod \"ovn-controller-wl72w-config-jrvrw\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.852192 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48ea75ac-1c17-4334-a06e-af29a7b7a173" path="/var/lib/kubelet/pods/48ea75ac-1c17-4334-a06e-af29a7b7a173/volumes" Dec 05 22:21:57 crc kubenswrapper[4747]: I1205 22:21:57.983744 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:21:58 crc kubenswrapper[4747]: I1205 22:21:58.497383 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wl72w-config-jrvrw"] Dec 05 22:21:59 crc kubenswrapper[4747]: I1205 22:21:59.474098 4747 generic.go:334] "Generic (PLEG): container finished" podID="74e0aadd-7e9a-4b65-ba49-0fd9e3855bec" containerID="eed5babd274f0abea116e74a4abe9e7044743d3049a79e92554731a28de88584" exitCode=0 Dec 05 22:21:59 crc kubenswrapper[4747]: I1205 22:21:59.474150 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wl72w-config-jrvrw" event={"ID":"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec","Type":"ContainerDied","Data":"eed5babd274f0abea116e74a4abe9e7044743d3049a79e92554731a28de88584"} Dec 05 22:21:59 crc kubenswrapper[4747]: I1205 22:21:59.474185 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wl72w-config-jrvrw" event={"ID":"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec","Type":"ContainerStarted","Data":"5ccf11aef8f77fa84352aa86ee9e05013404d3ead3708de136012baea8932233"} Dec 05 22:22:00 crc kubenswrapper[4747]: I1205 22:22:00.895516 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:22:00 crc kubenswrapper[4747]: I1205 22:22:00.973354 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-pmwf5"] Dec 05 22:22:00 crc kubenswrapper[4747]: E1205 22:22:00.973933 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e0aadd-7e9a-4b65-ba49-0fd9e3855bec" containerName="ovn-config" Dec 05 22:22:00 crc kubenswrapper[4747]: I1205 22:22:00.973952 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e0aadd-7e9a-4b65-ba49-0fd9e3855bec" containerName="ovn-config" Dec 05 22:22:00 crc kubenswrapper[4747]: I1205 22:22:00.974125 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e0aadd-7e9a-4b65-ba49-0fd9e3855bec" containerName="ovn-config" Dec 05 22:22:00 crc kubenswrapper[4747]: I1205 22:22:00.975092 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:00 crc kubenswrapper[4747]: I1205 22:22:00.977031 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Dec 05 22:22:00 crc kubenswrapper[4747]: I1205 22:22:00.979819 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Dec 05 22:22:00 crc kubenswrapper[4747]: I1205 22:22:00.979861 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Dec 05 22:22:00 crc kubenswrapper[4747]: I1205 22:22:00.998275 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-pmwf5"] Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.072935 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-scripts\") pod \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.072983 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9xkv\" (UniqueName: \"kubernetes.io/projected/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-kube-api-access-t9xkv\") pod \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.073051 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-log-ovn\") pod \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.073131 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-run-ovn\") pod \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.073211 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-additional-scripts\") pod \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.073256 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-run\") pod \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\" (UID: \"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec\") " Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.073265 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "74e0aadd-7e9a-4b65-ba49-0fd9e3855bec" (UID: "74e0aadd-7e9a-4b65-ba49-0fd9e3855bec"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.073273 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "74e0aadd-7e9a-4b65-ba49-0fd9e3855bec" (UID: "74e0aadd-7e9a-4b65-ba49-0fd9e3855bec"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.073389 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-run" (OuterVolumeSpecName: "var-run") pod "74e0aadd-7e9a-4b65-ba49-0fd9e3855bec" (UID: "74e0aadd-7e9a-4b65-ba49-0fd9e3855bec"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.073714 4747 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.073732 4747 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.073742 4747 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.074008 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "74e0aadd-7e9a-4b65-ba49-0fd9e3855bec" (UID: "74e0aadd-7e9a-4b65-ba49-0fd9e3855bec"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.074572 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-scripts" (OuterVolumeSpecName: "scripts") pod "74e0aadd-7e9a-4b65-ba49-0fd9e3855bec" (UID: "74e0aadd-7e9a-4b65-ba49-0fd9e3855bec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.079868 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-kube-api-access-t9xkv" (OuterVolumeSpecName: "kube-api-access-t9xkv") pod "74e0aadd-7e9a-4b65-ba49-0fd9e3855bec" (UID: "74e0aadd-7e9a-4b65-ba49-0fd9e3855bec"). InnerVolumeSpecName "kube-api-access-t9xkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.175211 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/61ad6ca6-026d-4ef6-9a8d-c414dd904933-config-data-merged\") pod \"octavia-rsyslog-pmwf5\" (UID: \"61ad6ca6-026d-4ef6-9a8d-c414dd904933\") " pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.175265 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/61ad6ca6-026d-4ef6-9a8d-c414dd904933-hm-ports\") pod \"octavia-rsyslog-pmwf5\" (UID: \"61ad6ca6-026d-4ef6-9a8d-c414dd904933\") " pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.175320 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61ad6ca6-026d-4ef6-9a8d-c414dd904933-config-data\") pod \"octavia-rsyslog-pmwf5\" (UID: \"61ad6ca6-026d-4ef6-9a8d-c414dd904933\") " pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.175427 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61ad6ca6-026d-4ef6-9a8d-c414dd904933-scripts\") pod \"octavia-rsyslog-pmwf5\" (UID: \"61ad6ca6-026d-4ef6-9a8d-c414dd904933\") " pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.175538 4747 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.175559 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.175572 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9xkv\" (UniqueName: \"kubernetes.io/projected/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec-kube-api-access-t9xkv\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.276809 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/61ad6ca6-026d-4ef6-9a8d-c414dd904933-hm-ports\") pod \"octavia-rsyslog-pmwf5\" (UID: \"61ad6ca6-026d-4ef6-9a8d-c414dd904933\") " pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.276892 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61ad6ca6-026d-4ef6-9a8d-c414dd904933-config-data\") pod \"octavia-rsyslog-pmwf5\" (UID: \"61ad6ca6-026d-4ef6-9a8d-c414dd904933\") " pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.276981 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61ad6ca6-026d-4ef6-9a8d-c414dd904933-scripts\") pod \"octavia-rsyslog-pmwf5\" (UID: \"61ad6ca6-026d-4ef6-9a8d-c414dd904933\") " pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 
22:22:01.277050 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/61ad6ca6-026d-4ef6-9a8d-c414dd904933-config-data-merged\") pod \"octavia-rsyslog-pmwf5\" (UID: \"61ad6ca6-026d-4ef6-9a8d-c414dd904933\") " pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.277726 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/61ad6ca6-026d-4ef6-9a8d-c414dd904933-config-data-merged\") pod \"octavia-rsyslog-pmwf5\" (UID: \"61ad6ca6-026d-4ef6-9a8d-c414dd904933\") " pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.278325 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/61ad6ca6-026d-4ef6-9a8d-c414dd904933-hm-ports\") pod \"octavia-rsyslog-pmwf5\" (UID: \"61ad6ca6-026d-4ef6-9a8d-c414dd904933\") " pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.281724 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61ad6ca6-026d-4ef6-9a8d-c414dd904933-scripts\") pod \"octavia-rsyslog-pmwf5\" (UID: \"61ad6ca6-026d-4ef6-9a8d-c414dd904933\") " pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.282291 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61ad6ca6-026d-4ef6-9a8d-c414dd904933-config-data\") pod \"octavia-rsyslog-pmwf5\" (UID: \"61ad6ca6-026d-4ef6-9a8d-c414dd904933\") " pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.300028 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.515063 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wl72w-config-jrvrw" event={"ID":"74e0aadd-7e9a-4b65-ba49-0fd9e3855bec","Type":"ContainerDied","Data":"5ccf11aef8f77fa84352aa86ee9e05013404d3ead3708de136012baea8932233"} Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.515361 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ccf11aef8f77fa84352aa86ee9e05013404d3ead3708de136012baea8932233" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.515379 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wl72w-config-jrvrw" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.721730 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-szm75"] Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.723649 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-szm75" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.725895 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.734489 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-szm75"] Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.870792 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-pmwf5"] Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.894172 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e68593e1-23ab-43ab-94b8-d39510aaa81f-httpd-config\") pod \"octavia-image-upload-56c9f55b99-szm75\" (UID: \"e68593e1-23ab-43ab-94b8-d39510aaa81f\") " pod="openstack/octavia-image-upload-56c9f55b99-szm75" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.894546 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/e68593e1-23ab-43ab-94b8-d39510aaa81f-amphora-image\") pod \"octavia-image-upload-56c9f55b99-szm75\" (UID: \"e68593e1-23ab-43ab-94b8-d39510aaa81f\") " pod="openstack/octavia-image-upload-56c9f55b99-szm75" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.968015 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wl72w-config-jrvrw"] Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.982753 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wl72w-config-jrvrw"] Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.996412 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e68593e1-23ab-43ab-94b8-d39510aaa81f-httpd-config\") pod \"octavia-image-upload-56c9f55b99-szm75\" (UID: \"e68593e1-23ab-43ab-94b8-d39510aaa81f\") " pod="openstack/octavia-image-upload-56c9f55b99-szm75" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.996532 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/e68593e1-23ab-43ab-94b8-d39510aaa81f-amphora-image\") pod \"octavia-image-upload-56c9f55b99-szm75\" (UID: \"e68593e1-23ab-43ab-94b8-d39510aaa81f\") " pod="openstack/octavia-image-upload-56c9f55b99-szm75" Dec 05 22:22:01 crc kubenswrapper[4747]: I1205 22:22:01.996997 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/e68593e1-23ab-43ab-94b8-d39510aaa81f-amphora-image\") pod \"octavia-image-upload-56c9f55b99-szm75\" (UID: \"e68593e1-23ab-43ab-94b8-d39510aaa81f\") " pod="openstack/octavia-image-upload-56c9f55b99-szm75" Dec 05 22:22:02 crc kubenswrapper[4747]: I1205 22:22:02.004012 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e68593e1-23ab-43ab-94b8-d39510aaa81f-httpd-config\") pod \"octavia-image-upload-56c9f55b99-szm75\" (UID: \"e68593e1-23ab-43ab-94b8-d39510aaa81f\") " pod="openstack/octavia-image-upload-56c9f55b99-szm75" Dec 05 22:22:02 crc kubenswrapper[4747]: I1205 22:22:02.078710 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-szm75" Dec 05 22:22:02 crc kubenswrapper[4747]: I1205 22:22:02.524618 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-pmwf5" event={"ID":"61ad6ca6-026d-4ef6-9a8d-c414dd904933","Type":"ContainerStarted","Data":"2978c31980e463699f409150082ab929e2c31e4ad8a216b27bdba294fbcf93e3"} Dec 05 22:22:02 crc kubenswrapper[4747]: I1205 22:22:02.624008 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-szm75"] Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.396189 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-62hzt"] Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.398246 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.404012 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.417944 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-62hzt"] Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.533872 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-scripts\") pod \"octavia-db-sync-62hzt\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.534005 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-config-data\") pod \"octavia-db-sync-62hzt\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.534029 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-combined-ca-bundle\") pod \"octavia-db-sync-62hzt\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.534228 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dff68c28-2112-446e-a942-d101afc19a5d-config-data-merged\") pod \"octavia-db-sync-62hzt\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.540056 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-szm75" event={"ID":"e68593e1-23ab-43ab-94b8-d39510aaa81f","Type":"ContainerStarted","Data":"c9e4c99efa2bcccf0355ab9bfaacf100f7ae0639aa193319fc7073746957758e"} Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.636299 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-scripts\") pod \"octavia-db-sync-62hzt\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.636361 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-config-data\") pod \"octavia-db-sync-62hzt\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.636419 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-combined-ca-bundle\") pod \"octavia-db-sync-62hzt\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.636557 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dff68c28-2112-446e-a942-d101afc19a5d-config-data-merged\") pod \"octavia-db-sync-62hzt\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.637244 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dff68c28-2112-446e-a942-d101afc19a5d-config-data-merged\") pod \"octavia-db-sync-62hzt\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.643276 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-scripts\") pod \"octavia-db-sync-62hzt\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.648287 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-config-data\") pod \"octavia-db-sync-62hzt\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.669357 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-combined-ca-bundle\") pod \"octavia-db-sync-62hzt\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.728730 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:03 crc kubenswrapper[4747]: I1205 22:22:03.876555 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e0aadd-7e9a-4b65-ba49-0fd9e3855bec" path="/var/lib/kubelet/pods/74e0aadd-7e9a-4b65-ba49-0fd9e3855bec/volumes" Dec 05 22:22:04 crc kubenswrapper[4747]: I1205 22:22:04.347829 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-62hzt"] Dec 05 22:22:04 crc kubenswrapper[4747]: I1205 22:22:04.555368 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-pmwf5" event={"ID":"61ad6ca6-026d-4ef6-9a8d-c414dd904933","Type":"ContainerStarted","Data":"569645dbb9db7a6225ab5561661acdbcd30e4e7515acba9052be1fb3472294db"} Dec 05 22:22:04 crc kubenswrapper[4747]: I1205 22:22:04.840076 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:22:04 crc kubenswrapper[4747]: E1205 22:22:04.840355 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:22:05 crc kubenswrapper[4747]: I1205 22:22:05.567206 4747 generic.go:334] "Generic (PLEG): container finished" podID="dff68c28-2112-446e-a942-d101afc19a5d" containerID="60e2eef4e7bd86021778eaf92f04fb73bc28126d400b83e391c83ec83fa7ba69" exitCode=0 Dec 05 22:22:05 crc kubenswrapper[4747]: I1205 22:22:05.567279 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-62hzt" event={"ID":"dff68c28-2112-446e-a942-d101afc19a5d","Type":"ContainerDied","Data":"60e2eef4e7bd86021778eaf92f04fb73bc28126d400b83e391c83ec83fa7ba69"} Dec 05 22:22:05 crc kubenswrapper[4747]: I1205 22:22:05.567738 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-62hzt" event={"ID":"dff68c28-2112-446e-a942-d101afc19a5d","Type":"ContainerStarted","Data":"97f399a89d6dded8fd6dd1e97a53c689013bbeaace78faff3a6b3f58bf30610e"} Dec 05 22:22:06 crc kubenswrapper[4747]: I1205 22:22:06.599293 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-62hzt" event={"ID":"dff68c28-2112-446e-a942-d101afc19a5d","Type":"ContainerStarted","Data":"6b85761d555398543baa07a7bb86ed1f1ce161d1a478a8087bda0dc20e15c6f3"} Dec 05 22:22:06 crc kubenswrapper[4747]: I1205 22:22:06.603501 4747 generic.go:334] "Generic (PLEG): container finished" podID="61ad6ca6-026d-4ef6-9a8d-c414dd904933" containerID="569645dbb9db7a6225ab5561661acdbcd30e4e7515acba9052be1fb3472294db" exitCode=0 Dec 05 22:22:06 crc kubenswrapper[4747]: I1205 22:22:06.603545 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-pmwf5" event={"ID":"61ad6ca6-026d-4ef6-9a8d-c414dd904933","Type":"ContainerDied","Data":"569645dbb9db7a6225ab5561661acdbcd30e4e7515acba9052be1fb3472294db"} Dec 05 22:22:06 crc kubenswrapper[4747]: I1205 22:22:06.631482 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-62hzt" podStartSLOduration=3.631462689 podStartE2EDuration="3.631462689s" podCreationTimestamp="2025-12-05 22:22:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:22:06.616922029 +0000 UTC m=+5997.084229517" watchObservedRunningTime="2025-12-05 22:22:06.631462689 +0000 UTC m=+5997.098770177" Dec 05 22:22:07 crc kubenswrapper[4747]: I1205 22:22:07.288538 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:22:07 crc kubenswrapper[4747]: I1205 22:22:07.382104 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:22:08 crc kubenswrapper[4747]: I1205 22:22:08.620548 4747 generic.go:334] "Generic (PLEG): container finished" podID="dff68c28-2112-446e-a942-d101afc19a5d" containerID="6b85761d555398543baa07a7bb86ed1f1ce161d1a478a8087bda0dc20e15c6f3" exitCode=0 Dec 05 22:22:08 crc kubenswrapper[4747]: I1205 22:22:08.620612 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-62hzt" event={"ID":"dff68c28-2112-446e-a942-d101afc19a5d","Type":"ContainerDied","Data":"6b85761d555398543baa07a7bb86ed1f1ce161d1a478a8087bda0dc20e15c6f3"} Dec 05 22:22:08 crc kubenswrapper[4747]: I1205 22:22:08.625057 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-pmwf5" event={"ID":"61ad6ca6-026d-4ef6-9a8d-c414dd904933","Type":"ContainerStarted","Data":"aa5a02f4e1f515bf2e98b270f08fe5babc2ffe6b85c5bedba6d151c851a9bc9e"} Dec 05 22:22:08 crc kubenswrapper[4747]: I1205 22:22:08.625554 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:08 crc kubenswrapper[4747]: I1205 22:22:08.660130 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-pmwf5" podStartSLOduration=2.5513032669999998 podStartE2EDuration="8.660115618s" podCreationTimestamp="2025-12-05 22:22:00 +0000 UTC" firstStartedPulling="2025-12-05 22:22:01.880459001 +0000 UTC m=+5992.347766489" lastFinishedPulling="2025-12-05 22:22:07.989271352 +0000 UTC m=+5998.456578840" observedRunningTime="2025-12-05 22:22:08.655088084 +0000 UTC m=+5999.122395572" watchObservedRunningTime="2025-12-05 22:22:08.660115618 +0000 UTC m=+5999.127423106" Dec 05 22:22:14 crc kubenswrapper[4747]: I1205 22:22:14.828855 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:14 crc kubenswrapper[4747]: I1205 22:22:14.980083 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-config-data\") pod \"dff68c28-2112-446e-a942-d101afc19a5d\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " Dec 05 22:22:14 crc kubenswrapper[4747]: I1205 22:22:14.980170 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dff68c28-2112-446e-a942-d101afc19a5d-config-data-merged\") pod \"dff68c28-2112-446e-a942-d101afc19a5d\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " Dec 05 22:22:14 crc kubenswrapper[4747]: I1205 22:22:14.980200 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-scripts\") pod \"dff68c28-2112-446e-a942-d101afc19a5d\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " Dec 05 22:22:14 crc kubenswrapper[4747]: I1205 22:22:14.980366 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-combined-ca-bundle\") pod \"dff68c28-2112-446e-a942-d101afc19a5d\" (UID: \"dff68c28-2112-446e-a942-d101afc19a5d\") " Dec 05 22:22:14 crc kubenswrapper[4747]: I1205 22:22:14.985670 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-config-data" (OuterVolumeSpecName: "config-data") pod "dff68c28-2112-446e-a942-d101afc19a5d" (UID: "dff68c28-2112-446e-a942-d101afc19a5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:22:14 crc kubenswrapper[4747]: I1205 22:22:14.985936 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-scripts" (OuterVolumeSpecName: "scripts") pod "dff68c28-2112-446e-a942-d101afc19a5d" (UID: "dff68c28-2112-446e-a942-d101afc19a5d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:22:15 crc kubenswrapper[4747]: I1205 22:22:15.005569 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff68c28-2112-446e-a942-d101afc19a5d-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "dff68c28-2112-446e-a942-d101afc19a5d" (UID: "dff68c28-2112-446e-a942-d101afc19a5d"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:22:15 crc kubenswrapper[4747]: I1205 22:22:15.020949 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dff68c28-2112-446e-a942-d101afc19a5d" (UID: "dff68c28-2112-446e-a942-d101afc19a5d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:22:15 crc kubenswrapper[4747]: I1205 22:22:15.082802 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:15 crc kubenswrapper[4747]: I1205 22:22:15.082857 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dff68c28-2112-446e-a942-d101afc19a5d-config-data-merged\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:15 crc kubenswrapper[4747]: I1205 22:22:15.082871 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:15 crc kubenswrapper[4747]: I1205 22:22:15.082885 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff68c28-2112-446e-a942-d101afc19a5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:15 crc kubenswrapper[4747]: I1205 22:22:15.698966 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-62hzt" event={"ID":"dff68c28-2112-446e-a942-d101afc19a5d","Type":"ContainerDied","Data":"97f399a89d6dded8fd6dd1e97a53c689013bbeaace78faff3a6b3f58bf30610e"} Dec 05 22:22:15 crc kubenswrapper[4747]: I1205 22:22:15.699437 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97f399a89d6dded8fd6dd1e97a53c689013bbeaace78faff3a6b3f58bf30610e" Dec 05 22:22:15 crc kubenswrapper[4747]: I1205 22:22:15.699022 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-62hzt" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.355464 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-pmwf5" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.501899 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-67b9f9b8db-jj6ds"] Dec 05 22:22:16 crc kubenswrapper[4747]: E1205 22:22:16.502480 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff68c28-2112-446e-a942-d101afc19a5d" containerName="init" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.502509 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff68c28-2112-446e-a942-d101afc19a5d" containerName="init" Dec 05 22:22:16 crc kubenswrapper[4747]: E1205 22:22:16.502565 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff68c28-2112-446e-a942-d101afc19a5d" containerName="octavia-db-sync" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.502582 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff68c28-2112-446e-a942-d101afc19a5d" containerName="octavia-db-sync" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.502909 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff68c28-2112-446e-a942-d101afc19a5d" containerName="octavia-db-sync" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.505107 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.508022 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-public-svc" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.508265 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-internal-svc" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.538232 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-67b9f9b8db-jj6ds"] Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.631548 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-scripts\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.631675 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-ovndb-tls-certs\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.631712 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-public-tls-certs\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.631752 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-config-data\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.631901 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-combined-ca-bundle\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.631986 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-internal-tls-certs\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.632143 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/116ee497-2944-4847-89ae-c4c14828dfa2-config-data-merged\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.632231 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/116ee497-2944-4847-89ae-c4c14828dfa2-octavia-run\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.710822 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-szm75" event={"ID":"e68593e1-23ab-43ab-94b8-d39510aaa81f","Type":"ContainerStarted","Data":"7c6115c9a9392aa7be38f6750431e1245bda2cabf850f433bb5ad1404f43a3a9"} Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.734173 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/116ee497-2944-4847-89ae-c4c14828dfa2-config-data-merged\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.734293 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/116ee497-2944-4847-89ae-c4c14828dfa2-octavia-run\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.734497 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-scripts\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.734687 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-ovndb-tls-certs\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.734798 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-public-tls-certs\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.734948 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-config-data\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.735022 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-combined-ca-bundle\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.735093 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-internal-tls-certs\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.738494 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/116ee497-2944-4847-89ae-c4c14828dfa2-octavia-run\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.739113 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/116ee497-2944-4847-89ae-c4c14828dfa2-config-data-merged\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.743431 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-config-data\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.745942 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-scripts\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.746087 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-ovndb-tls-certs\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.747014 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-combined-ca-bundle\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.747550 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-internal-tls-certs\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.755452 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ee497-2944-4847-89ae-c4c14828dfa2-public-tls-certs\") pod \"octavia-api-67b9f9b8db-jj6ds\" (UID: \"116ee497-2944-4847-89ae-c4c14828dfa2\") " pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:16 crc kubenswrapper[4747]: I1205 22:22:16.825095 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:17 crc kubenswrapper[4747]: W1205 22:22:17.326287 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod116ee497_2944_4847_89ae_c4c14828dfa2.slice/crio-098fbffd954b41cbabdd8e2e7be94e353c3ad650d84d875b8d159e263fd8dd4c WatchSource:0}: Error finding container 098fbffd954b41cbabdd8e2e7be94e353c3ad650d84d875b8d159e263fd8dd4c: Status 404 returned error can't find the container with id 098fbffd954b41cbabdd8e2e7be94e353c3ad650d84d875b8d159e263fd8dd4c Dec 05 22:22:17 crc kubenswrapper[4747]: I1205 22:22:17.334801 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-67b9f9b8db-jj6ds"] Dec 05 22:22:17 crc kubenswrapper[4747]: I1205 22:22:17.727506 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-67b9f9b8db-jj6ds" event={"ID":"116ee497-2944-4847-89ae-c4c14828dfa2","Type":"ContainerStarted","Data":"098fbffd954b41cbabdd8e2e7be94e353c3ad650d84d875b8d159e263fd8dd4c"} Dec 05 22:22:17 crc kubenswrapper[4747]: I1205 22:22:17.839929 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:22:19 crc kubenswrapper[4747]: I1205 22:22:19.752569 4747 generic.go:334] "Generic (PLEG): container finished" podID="e68593e1-23ab-43ab-94b8-d39510aaa81f" containerID="7c6115c9a9392aa7be38f6750431e1245bda2cabf850f433bb5ad1404f43a3a9" exitCode=0 Dec 05 22:22:19 crc kubenswrapper[4747]: I1205 22:22:19.752722 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-szm75" event={"ID":"e68593e1-23ab-43ab-94b8-d39510aaa81f","Type":"ContainerDied","Data":"7c6115c9a9392aa7be38f6750431e1245bda2cabf850f433bb5ad1404f43a3a9"} Dec 05 22:22:20 crc kubenswrapper[4747]: I1205 22:22:20.764917 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-szm75" event={"ID":"e68593e1-23ab-43ab-94b8-d39510aaa81f","Type":"ContainerStarted","Data":"d392e4ab49c72ab25dbf41d31b17ed5c03f944b0ad45b3f2127fe6fa3dbc9c69"} Dec 05 22:22:20 crc kubenswrapper[4747]: I1205 22:22:20.791321 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"0e963d14d45c9a8e403bf5219ea245778f9eb613a6ce7acf2a23ab111935a921"} Dec 05 22:22:20 crc kubenswrapper[4747]: I1205 22:22:20.791769 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-56c9f55b99-szm75" podStartSLOduration=6.115988287 podStartE2EDuration="19.79175402s" podCreationTimestamp="2025-12-05 22:22:01 +0000 UTC" firstStartedPulling="2025-12-05 22:22:02.636103378 +0000 UTC m=+5993.103410866" lastFinishedPulling="2025-12-05 22:22:16.311869081 +0000 UTC m=+6006.779176599" observedRunningTime="2025-12-05 22:22:20.77967409 +0000 UTC m=+6011.246981598" watchObservedRunningTime="2025-12-05 22:22:20.79175402 +0000 UTC m=+6011.259061508" Dec 05 22:22:20 crc kubenswrapper[4747]: I1205 22:22:20.806069 4747 generic.go:334] "Generic (PLEG): container finished" podID="116ee497-2944-4847-89ae-c4c14828dfa2" containerID="c60575b7968b8d7471b768ea16a135f76a7e7571254e3fa60b80be8e37407a7b" exitCode=0 Dec 05 22:22:20 crc kubenswrapper[4747]: I1205 22:22:20.806115 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-api-67b9f9b8db-jj6ds" event={"ID":"116ee497-2944-4847-89ae-c4c14828dfa2","Type":"ContainerDied","Data":"c60575b7968b8d7471b768ea16a135f76a7e7571254e3fa60b80be8e37407a7b"} Dec 05 22:22:21 crc kubenswrapper[4747]: I1205 22:22:21.822146 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-67b9f9b8db-jj6ds" event={"ID":"116ee497-2944-4847-89ae-c4c14828dfa2","Type":"ContainerStarted","Data":"8242c169418ea7d762d47031b34d947da79a227768ffe0cef79bcd2e22017b49"} Dec 05 22:22:21 crc kubenswrapper[4747]: I1205 22:22:21.822975 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-67b9f9b8db-jj6ds" event={"ID":"116ee497-2944-4847-89ae-c4c14828dfa2","Type":"ContainerStarted","Data":"90fd619bc38facc335d350b91a0d851dc45f43dd17afc15e93407bc5ef1dd181"} Dec 05 22:22:21 crc kubenswrapper[4747]: I1205 22:22:21.825190 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:21 crc kubenswrapper[4747]: I1205 22:22:21.825267 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:30 crc kubenswrapper[4747]: I1205 22:22:30.682032 4747 scope.go:117] "RemoveContainer" containerID="cb063a1338d2fd5004661f21a5dabdc316bb0bc7d295752f985634260faefd47" Dec 05 22:22:36 crc kubenswrapper[4747]: I1205 22:22:36.600382 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:36 crc kubenswrapper[4747]: I1205 22:22:36.641500 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-67b9f9b8db-jj6ds" podStartSLOduration=20.641478092 podStartE2EDuration="20.641478092s" podCreationTimestamp="2025-12-05 22:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:22:21.851856014 +0000 UTC m=+6012.319163502" watchObservedRunningTime="2025-12-05 22:22:36.641478092 +0000 UTC m=+6027.108785590" Dec 05 22:22:36 crc kubenswrapper[4747]: I1205 22:22:36.675966 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-67b9f9b8db-jj6ds" Dec 05 22:22:36 crc kubenswrapper[4747]: I1205 22:22:36.766807 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-8db457646-qz7wr"] Dec 05 22:22:36 crc kubenswrapper[4747]: I1205 22:22:36.767032 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-8db457646-qz7wr" podUID="99385733-cf00-4ba0-8a0c-327c53de1723" containerName="octavia-api" containerID="cri-o://e4506d2e850cea67af73887e7610a7be078a6d6693bcddb08e96f8098b27ec7a" gracePeriod=30 Dec 05 22:22:36 crc kubenswrapper[4747]: I1205 22:22:36.767474 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-8db457646-qz7wr" podUID="99385733-cf00-4ba0-8a0c-327c53de1723" containerName="octavia-api-provider-agent" containerID="cri-o://870fca14377606443db74bd3cdfd26f5ed2b438558c766552e52c89fcc405c73" gracePeriod=30 Dec 05 22:22:38 crc kubenswrapper[4747]: I1205 22:22:38.010868 4747 generic.go:334] "Generic (PLEG): container finished" podID="99385733-cf00-4ba0-8a0c-327c53de1723" containerID="870fca14377606443db74bd3cdfd26f5ed2b438558c766552e52c89fcc405c73" exitCode=0 Dec 05 22:22:38 crc kubenswrapper[4747]: I1205 22:22:38.011146 4747 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/octavia-api-8db457646-qz7wr" event={"ID":"99385733-cf00-4ba0-8a0c-327c53de1723","Type":"ContainerDied","Data":"870fca14377606443db74bd3cdfd26f5ed2b438558c766552e52c89fcc405c73"} Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.031916 4747 generic.go:334] "Generic (PLEG): container finished" podID="99385733-cf00-4ba0-8a0c-327c53de1723" containerID="e4506d2e850cea67af73887e7610a7be078a6d6693bcddb08e96f8098b27ec7a" exitCode=0 Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.032094 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-8db457646-qz7wr" event={"ID":"99385733-cf00-4ba0-8a0c-327c53de1723","Type":"ContainerDied","Data":"e4506d2e850cea67af73887e7610a7be078a6d6693bcddb08e96f8098b27ec7a"} Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.376395 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.399284 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-config-data\") pod \"99385733-cf00-4ba0-8a0c-327c53de1723\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.399357 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/99385733-cf00-4ba0-8a0c-327c53de1723-config-data-merged\") pod \"99385733-cf00-4ba0-8a0c-327c53de1723\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.399380 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-ovndb-tls-certs\") pod \"99385733-cf00-4ba0-8a0c-327c53de1723\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.399467 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-combined-ca-bundle\") pod \"99385733-cf00-4ba0-8a0c-327c53de1723\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.399500 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/99385733-cf00-4ba0-8a0c-327c53de1723-octavia-run\") pod \"99385733-cf00-4ba0-8a0c-327c53de1723\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.399622 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-scripts\") pod \"99385733-cf00-4ba0-8a0c-327c53de1723\" (UID: \"99385733-cf00-4ba0-8a0c-327c53de1723\") " Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.412486 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99385733-cf00-4ba0-8a0c-327c53de1723-octavia-run" (OuterVolumeSpecName: "octavia-run") pod "99385733-cf00-4ba0-8a0c-327c53de1723" (UID: "99385733-cf00-4ba0-8a0c-327c53de1723"). InnerVolumeSpecName "octavia-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.459768 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-scripts" (OuterVolumeSpecName: "scripts") pod "99385733-cf00-4ba0-8a0c-327c53de1723" (UID: "99385733-cf00-4ba0-8a0c-327c53de1723"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.459805 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-config-data" (OuterVolumeSpecName: "config-data") pod "99385733-cf00-4ba0-8a0c-327c53de1723" (UID: "99385733-cf00-4ba0-8a0c-327c53de1723"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.501082 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.501120 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.501134 4747 reconciler_common.go:293] "Volume detached for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/99385733-cf00-4ba0-8a0c-327c53de1723-octavia-run\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.509432 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99385733-cf00-4ba0-8a0c-327c53de1723-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "99385733-cf00-4ba0-8a0c-327c53de1723" (UID: "99385733-cf00-4ba0-8a0c-327c53de1723"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.524758 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99385733-cf00-4ba0-8a0c-327c53de1723" (UID: "99385733-cf00-4ba0-8a0c-327c53de1723"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.602417 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.602446 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/99385733-cf00-4ba0-8a0c-327c53de1723-config-data-merged\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.602849 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "99385733-cf00-4ba0-8a0c-327c53de1723" (UID: "99385733-cf00-4ba0-8a0c-327c53de1723"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:22:40 crc kubenswrapper[4747]: I1205 22:22:40.703650 4747 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99385733-cf00-4ba0-8a0c-327c53de1723-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:41 crc kubenswrapper[4747]: I1205 22:22:41.051149 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-8db457646-qz7wr" event={"ID":"99385733-cf00-4ba0-8a0c-327c53de1723","Type":"ContainerDied","Data":"e0891bd3103382398bed5eb2ce7c0e4b8087088db5e24c1986b078939cda2df5"} Dec 05 22:22:41 crc kubenswrapper[4747]: I1205 22:22:41.051649 4747 scope.go:117] "RemoveContainer" containerID="870fca14377606443db74bd3cdfd26f5ed2b438558c766552e52c89fcc405c73" Dec 05 22:22:41 crc kubenswrapper[4747]: I1205 22:22:41.051361 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-8db457646-qz7wr" Dec 05 22:22:41 crc kubenswrapper[4747]: I1205 22:22:41.109239 4747 scope.go:117] "RemoveContainer" containerID="e4506d2e850cea67af73887e7610a7be078a6d6693bcddb08e96f8098b27ec7a" Dec 05 22:22:41 crc kubenswrapper[4747]: I1205 22:22:41.114094 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-8db457646-qz7wr"] Dec 05 22:22:41 crc kubenswrapper[4747]: I1205 22:22:41.122244 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-api-8db457646-qz7wr"] Dec 05 22:22:41 crc kubenswrapper[4747]: I1205 22:22:41.134035 4747 scope.go:117] "RemoveContainer" containerID="aa7000b7b5fb725a80131b42e953ec244fa42cb1e69e564022c506bc77d9fe96" Dec 05 22:22:41 crc kubenswrapper[4747]: I1205 22:22:41.851798 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99385733-cf00-4ba0-8a0c-327c53de1723" path="/var/lib/kubelet/pods/99385733-cf00-4ba0-8a0c-327c53de1723/volumes" Dec 05 22:22:47 crc kubenswrapper[4747]: I1205 22:22:47.876776 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-szm75"] Dec 05 22:22:47 crc kubenswrapper[4747]: I1205 22:22:47.877320 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-56c9f55b99-szm75" podUID="e68593e1-23ab-43ab-94b8-d39510aaa81f" containerName="octavia-amphora-httpd" containerID="cri-o://d392e4ab49c72ab25dbf41d31b17ed5c03f944b0ad45b3f2127fe6fa3dbc9c69" gracePeriod=30 Dec 05 22:22:48 crc kubenswrapper[4747]: I1205 22:22:48.134577 4747 generic.go:334] "Generic (PLEG): container finished" podID="e68593e1-23ab-43ab-94b8-d39510aaa81f" containerID="d392e4ab49c72ab25dbf41d31b17ed5c03f944b0ad45b3f2127fe6fa3dbc9c69" exitCode=0 Dec 05 22:22:48 crc kubenswrapper[4747]: I1205 22:22:48.134622 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-szm75" event={"ID":"e68593e1-23ab-43ab-94b8-d39510aaa81f","Type":"ContainerDied","Data":"d392e4ab49c72ab25dbf41d31b17ed5c03f944b0ad45b3f2127fe6fa3dbc9c69"} Dec 05 22:22:48 crc kubenswrapper[4747]: I1205 22:22:48.385441 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-szm75" Dec 05 22:22:48 crc kubenswrapper[4747]: I1205 22:22:48.569413 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/e68593e1-23ab-43ab-94b8-d39510aaa81f-amphora-image\") pod \"e68593e1-23ab-43ab-94b8-d39510aaa81f\" (UID: \"e68593e1-23ab-43ab-94b8-d39510aaa81f\") " Dec 05 22:22:48 crc kubenswrapper[4747]: I1205 22:22:48.569698 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e68593e1-23ab-43ab-94b8-d39510aaa81f-httpd-config\") pod \"e68593e1-23ab-43ab-94b8-d39510aaa81f\" (UID: \"e68593e1-23ab-43ab-94b8-d39510aaa81f\") " Dec 05 22:22:48 crc kubenswrapper[4747]: I1205 22:22:48.595604 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e68593e1-23ab-43ab-94b8-d39510aaa81f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e68593e1-23ab-43ab-94b8-d39510aaa81f" (UID: "e68593e1-23ab-43ab-94b8-d39510aaa81f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:22:48 crc kubenswrapper[4747]: I1205 22:22:48.626101 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e68593e1-23ab-43ab-94b8-d39510aaa81f-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "e68593e1-23ab-43ab-94b8-d39510aaa81f" (UID: "e68593e1-23ab-43ab-94b8-d39510aaa81f"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:22:48 crc kubenswrapper[4747]: I1205 22:22:48.673837 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e68593e1-23ab-43ab-94b8-d39510aaa81f-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:48 crc kubenswrapper[4747]: I1205 22:22:48.673870 4747 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/e68593e1-23ab-43ab-94b8-d39510aaa81f-amphora-image\") on node \"crc\" DevicePath \"\"" Dec 05 22:22:49 crc kubenswrapper[4747]: I1205 22:22:49.147442 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-szm75" event={"ID":"e68593e1-23ab-43ab-94b8-d39510aaa81f","Type":"ContainerDied","Data":"c9e4c99efa2bcccf0355ab9bfaacf100f7ae0639aa193319fc7073746957758e"} Dec 05 22:22:49 crc kubenswrapper[4747]: I1205 22:22:49.147844 4747 scope.go:117] "RemoveContainer" containerID="d392e4ab49c72ab25dbf41d31b17ed5c03f944b0ad45b3f2127fe6fa3dbc9c69" Dec 05 22:22:49 crc kubenswrapper[4747]: I1205 22:22:49.147677 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-szm75" Dec 05 22:22:49 crc kubenswrapper[4747]: I1205 22:22:49.174908 4747 scope.go:117] "RemoveContainer" containerID="7c6115c9a9392aa7be38f6750431e1245bda2cabf850f433bb5ad1404f43a3a9" Dec 05 22:22:49 crc kubenswrapper[4747]: I1205 22:22:49.178105 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-szm75"] Dec 05 22:22:49 crc kubenswrapper[4747]: I1205 22:22:49.187951 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-szm75"] Dec 05 22:22:49 crc kubenswrapper[4747]: I1205 22:22:49.856450 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e68593e1-23ab-43ab-94b8-d39510aaa81f" path="/var/lib/kubelet/pods/e68593e1-23ab-43ab-94b8-d39510aaa81f/volumes" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.288387 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-4rzpg"] Dec 05 22:22:56 crc kubenswrapper[4747]: E1205 22:22:56.290015 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99385733-cf00-4ba0-8a0c-327c53de1723" containerName="octavia-api" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.290048 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="99385733-cf00-4ba0-8a0c-327c53de1723" containerName="octavia-api" Dec 05 22:22:56 crc kubenswrapper[4747]: E1205 22:22:56.290085 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99385733-cf00-4ba0-8a0c-327c53de1723" containerName="octavia-api-provider-agent" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.290103 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="99385733-cf00-4ba0-8a0c-327c53de1723" containerName="octavia-api-provider-agent" Dec 05 22:22:56 crc kubenswrapper[4747]: E1205 22:22:56.290152 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99385733-cf00-4ba0-8a0c-327c53de1723" containerName="init" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.290170 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="99385733-cf00-4ba0-8a0c-327c53de1723" containerName="init" Dec 05 22:22:56 crc kubenswrapper[4747]: E1205 22:22:56.290225 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e68593e1-23ab-43ab-94b8-d39510aaa81f" containerName="octavia-amphora-httpd" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.290264 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e68593e1-23ab-43ab-94b8-d39510aaa81f" containerName="octavia-amphora-httpd" Dec 05 22:22:56 crc kubenswrapper[4747]: E1205 22:22:56.290303 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e68593e1-23ab-43ab-94b8-d39510aaa81f" containerName="init" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.290318 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e68593e1-23ab-43ab-94b8-d39510aaa81f" containerName="init" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.290845 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e68593e1-23ab-43ab-94b8-d39510aaa81f" containerName="octavia-amphora-httpd" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.290898 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="99385733-cf00-4ba0-8a0c-327c53de1723" containerName="octavia-api-provider-agent" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.290949 4747 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="99385733-cf00-4ba0-8a0c-327c53de1723" containerName="octavia-api" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.293427 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-4rzpg" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.299799 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-4rzpg"] Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.300118 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.333315 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d5ed0900-293f-40e2-b0ba-54ba92a0d04c-amphora-image\") pod \"octavia-image-upload-56c9f55b99-4rzpg\" (UID: \"d5ed0900-293f-40e2-b0ba-54ba92a0d04c\") " pod="openstack/octavia-image-upload-56c9f55b99-4rzpg" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.333952 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d5ed0900-293f-40e2-b0ba-54ba92a0d04c-httpd-config\") pod \"octavia-image-upload-56c9f55b99-4rzpg\" (UID: \"d5ed0900-293f-40e2-b0ba-54ba92a0d04c\") " pod="openstack/octavia-image-upload-56c9f55b99-4rzpg" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.435392 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d5ed0900-293f-40e2-b0ba-54ba92a0d04c-amphora-image\") pod \"octavia-image-upload-56c9f55b99-4rzpg\" (UID: \"d5ed0900-293f-40e2-b0ba-54ba92a0d04c\") " pod="openstack/octavia-image-upload-56c9f55b99-4rzpg" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.435470 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d5ed0900-293f-40e2-b0ba-54ba92a0d04c-httpd-config\") pod \"octavia-image-upload-56c9f55b99-4rzpg\" (UID: \"d5ed0900-293f-40e2-b0ba-54ba92a0d04c\") " pod="openstack/octavia-image-upload-56c9f55b99-4rzpg" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.436154 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d5ed0900-293f-40e2-b0ba-54ba92a0d04c-amphora-image\") pod \"octavia-image-upload-56c9f55b99-4rzpg\" (UID: \"d5ed0900-293f-40e2-b0ba-54ba92a0d04c\") " pod="openstack/octavia-image-upload-56c9f55b99-4rzpg" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.440864 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d5ed0900-293f-40e2-b0ba-54ba92a0d04c-httpd-config\") pod \"octavia-image-upload-56c9f55b99-4rzpg\" (UID: \"d5ed0900-293f-40e2-b0ba-54ba92a0d04c\") " pod="openstack/octavia-image-upload-56c9f55b99-4rzpg" Dec 05 22:22:56 crc kubenswrapper[4747]: I1205 22:22:56.630148 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-4rzpg" Dec 05 22:22:57 crc kubenswrapper[4747]: I1205 22:22:57.142032 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-4rzpg"] Dec 05 22:22:57 crc kubenswrapper[4747]: I1205 22:22:57.243122 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-4rzpg" event={"ID":"d5ed0900-293f-40e2-b0ba-54ba92a0d04c","Type":"ContainerStarted","Data":"1117e2671d91ff9a9fa69bfd841f89be3e2121b3e80ed230961ffce65b4a6f2f"} Dec 05 22:22:58 crc kubenswrapper[4747]: I1205 22:22:58.254522 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-4rzpg" event={"ID":"d5ed0900-293f-40e2-b0ba-54ba92a0d04c","Type":"ContainerStarted","Data":"dc88c4a29001af7f86ee04a2a16fd77b99ac4bfaf2cca62c618e197928f3a538"} Dec 05 22:22:59 crc kubenswrapper[4747]: I1205 22:22:59.267093 4747 generic.go:334] "Generic (PLEG): container finished" podID="d5ed0900-293f-40e2-b0ba-54ba92a0d04c" containerID="dc88c4a29001af7f86ee04a2a16fd77b99ac4bfaf2cca62c618e197928f3a538" exitCode=0 Dec 05 22:22:59 crc kubenswrapper[4747]: I1205 22:22:59.267341 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-4rzpg" event={"ID":"d5ed0900-293f-40e2-b0ba-54ba92a0d04c","Type":"ContainerDied","Data":"dc88c4a29001af7f86ee04a2a16fd77b99ac4bfaf2cca62c618e197928f3a538"} Dec 05 22:23:00 crc kubenswrapper[4747]: I1205 22:23:00.282535 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-4rzpg" event={"ID":"d5ed0900-293f-40e2-b0ba-54ba92a0d04c","Type":"ContainerStarted","Data":"538c28578fea87e564e17c153e9242d03c2b3681641dce4b89dce9d2ef32a5a8"} Dec 05 22:23:00 crc kubenswrapper[4747]: I1205 22:23:00.319694 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-56c9f55b99-4rzpg" podStartSLOduration=3.8739309779999997 podStartE2EDuration="4.319672075s" podCreationTimestamp="2025-12-05 22:22:56 +0000 UTC" firstStartedPulling="2025-12-05 22:22:57.143716462 +0000 UTC m=+6047.611023950" lastFinishedPulling="2025-12-05 22:22:57.589457519 +0000 UTC m=+6048.056765047" observedRunningTime="2025-12-05 22:23:00.307122284 +0000 UTC m=+6050.774429792" watchObservedRunningTime="2025-12-05 22:23:00.319672075 +0000 UTC m=+6050.786979583" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.254717 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-j547l"] Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.257538 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.262241 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.262695 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.263571 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.272763 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-j547l"] Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.354550 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa65be4-2b09-4c31-97fc-a220541d4009-combined-ca-bundle\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.354651 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/7aa65be4-2b09-4c31-97fc-a220541d4009-amphora-certs\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.354706 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7aa65be4-2b09-4c31-97fc-a220541d4009-hm-ports\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.354761 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa65be4-2b09-4c31-97fc-a220541d4009-scripts\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.354784 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7aa65be4-2b09-4c31-97fc-a220541d4009-config-data-merged\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.354823 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa65be4-2b09-4c31-97fc-a220541d4009-config-data\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.457727 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa65be4-2b09-4c31-97fc-a220541d4009-combined-ca-bundle\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 
22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.457844 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/7aa65be4-2b09-4c31-97fc-a220541d4009-amphora-certs\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.457940 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7aa65be4-2b09-4c31-97fc-a220541d4009-hm-ports\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.458077 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa65be4-2b09-4c31-97fc-a220541d4009-scripts\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.458152 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7aa65be4-2b09-4c31-97fc-a220541d4009-config-data-merged\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.458260 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa65be4-2b09-4c31-97fc-a220541d4009-config-data\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.460348 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7aa65be4-2b09-4c31-97fc-a220541d4009-config-data-merged\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.460993 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7aa65be4-2b09-4c31-97fc-a220541d4009-hm-ports\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.465946 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa65be4-2b09-4c31-97fc-a220541d4009-config-data\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.466073 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/7aa65be4-2b09-4c31-97fc-a220541d4009-amphora-certs\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.466339 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7aa65be4-2b09-4c31-97fc-a220541d4009-combined-ca-bundle\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.479463 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa65be4-2b09-4c31-97fc-a220541d4009-scripts\") pod \"octavia-healthmanager-j547l\" (UID: \"7aa65be4-2b09-4c31-97fc-a220541d4009\") " pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:06 crc kubenswrapper[4747]: I1205 22:23:06.588177 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:07 crc kubenswrapper[4747]: I1205 22:23:07.156193 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-j547l"] Dec 05 22:23:07 crc kubenswrapper[4747]: W1205 22:23:07.158090 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7aa65be4_2b09_4c31_97fc_a220541d4009.slice/crio-b42830ba5ac41f0a38f3c854c8117c0b6ab0f2a25ce01afeca3e221117288851 WatchSource:0}: Error finding container b42830ba5ac41f0a38f3c854c8117c0b6ab0f2a25ce01afeca3e221117288851: Status 404 returned error can't find the container with id b42830ba5ac41f0a38f3c854c8117c0b6ab0f2a25ce01afeca3e221117288851 Dec 05 22:23:07 crc kubenswrapper[4747]: I1205 22:23:07.395376 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-j547l" event={"ID":"7aa65be4-2b09-4c31-97fc-a220541d4009","Type":"ContainerStarted","Data":"b42830ba5ac41f0a38f3c854c8117c0b6ab0f2a25ce01afeca3e221117288851"} Dec 05 22:23:07 crc kubenswrapper[4747]: I1205 22:23:07.963667 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-4wx4f"] Dec 05 22:23:07 crc kubenswrapper[4747]: I1205 22:23:07.965824 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:07 crc kubenswrapper[4747]: I1205 22:23:07.968211 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Dec 05 22:23:07 crc kubenswrapper[4747]: I1205 22:23:07.968380 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Dec 05 22:23:07 crc kubenswrapper[4747]: I1205 22:23:07.974513 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-4wx4f"] Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.002307 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/20f32969-4fa1-4d35-84b3-becaa9c774ea-amphora-certs\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.002381 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/20f32969-4fa1-4d35-84b3-becaa9c774ea-hm-ports\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.002422 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f32969-4fa1-4d35-84b3-becaa9c774ea-combined-ca-bundle\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.002466 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/20f32969-4fa1-4d35-84b3-becaa9c774ea-config-data-merged\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.002573 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20f32969-4fa1-4d35-84b3-becaa9c774ea-config-data\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.002638 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20f32969-4fa1-4d35-84b3-becaa9c774ea-scripts\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.104259 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20f32969-4fa1-4d35-84b3-becaa9c774ea-config-data\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.104364 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/20f32969-4fa1-4d35-84b3-becaa9c774ea-scripts\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.104523 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/20f32969-4fa1-4d35-84b3-becaa9c774ea-amphora-certs\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.105168 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/20f32969-4fa1-4d35-84b3-becaa9c774ea-hm-ports\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.105247 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f32969-4fa1-4d35-84b3-becaa9c774ea-combined-ca-bundle\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.105315 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/20f32969-4fa1-4d35-84b3-becaa9c774ea-config-data-merged\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.106362 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/20f32969-4fa1-4d35-84b3-becaa9c774ea-config-data-merged\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.106675 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/20f32969-4fa1-4d35-84b3-becaa9c774ea-hm-ports\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.110760 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/20f32969-4fa1-4d35-84b3-becaa9c774ea-amphora-certs\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.112042 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20f32969-4fa1-4d35-84b3-becaa9c774ea-config-data\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.113614 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f32969-4fa1-4d35-84b3-becaa9c774ea-combined-ca-bundle\") pod \"octavia-housekeeping-4wx4f\" (UID: 
\"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.113768 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20f32969-4fa1-4d35-84b3-becaa9c774ea-scripts\") pod \"octavia-housekeeping-4wx4f\" (UID: \"20f32969-4fa1-4d35-84b3-becaa9c774ea\") " pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.282692 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.408457 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-j547l" event={"ID":"7aa65be4-2b09-4c31-97fc-a220541d4009","Type":"ContainerStarted","Data":"16f47a536b51ba0db6e27b9f5fd84fc40199fa1eee621e9a261f18a58fd06eaf"} Dec 05 22:23:08 crc kubenswrapper[4747]: I1205 22:23:08.920078 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-4wx4f"] Dec 05 22:23:08 crc kubenswrapper[4747]: W1205 22:23:08.922820 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20f32969_4fa1_4d35_84b3_becaa9c774ea.slice/crio-ffb10736fefbfa5e192b87b1288daa404ae1bc3130460e65408cd007d3091511 WatchSource:0}: Error finding container ffb10736fefbfa5e192b87b1288daa404ae1bc3130460e65408cd007d3091511: Status 404 returned error can't find the container with id ffb10736fefbfa5e192b87b1288daa404ae1bc3130460e65408cd007d3091511 Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.317790 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-d7nzb"] Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.319452 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.328922 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-amphora-certs\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.328985 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-scripts\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.329035 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-config-data\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.329096 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-combined-ca-bundle\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.329147 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-hm-ports\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.329266 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-config-data-merged\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.330291 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.330606 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.350599 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-d7nzb"] Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.419083 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-4wx4f" event={"ID":"20f32969-4fa1-4d35-84b3-becaa9c774ea","Type":"ContainerStarted","Data":"ffb10736fefbfa5e192b87b1288daa404ae1bc3130460e65408cd007d3091511"} Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.429881 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-config-data\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" 
Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.429932 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-combined-ca-bundle\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.429977 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-hm-ports\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.430076 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-config-data-merged\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.430110 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-amphora-certs\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.430136 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-scripts\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.431533 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-config-data-merged\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.432486 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-hm-ports\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.436499 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-amphora-certs\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.436779 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-scripts\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.441781 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-combined-ca-bundle\") pod 
\"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.442251 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb28b4bc-7d39-4d5f-af3e-d855503b94f3-config-data\") pod \"octavia-worker-d7nzb\" (UID: \"fb28b4bc-7d39-4d5f-af3e-d855503b94f3\") " pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:09 crc kubenswrapper[4747]: I1205 22:23:09.647419 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:10 crc kubenswrapper[4747]: I1205 22:23:10.215515 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-d7nzb"] Dec 05 22:23:10 crc kubenswrapper[4747]: I1205 22:23:10.427235 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-d7nzb" event={"ID":"fb28b4bc-7d39-4d5f-af3e-d855503b94f3","Type":"ContainerStarted","Data":"9f85ad273727894d2930f448d87742c6f354716cf3e3505c4177d82d4688524b"} Dec 05 22:23:11 crc kubenswrapper[4747]: I1205 22:23:11.446931 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kbckn"] Dec 05 22:23:11 crc kubenswrapper[4747]: I1205 22:23:11.451275 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:11 crc kubenswrapper[4747]: I1205 22:23:11.461218 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kbckn"] Dec 05 22:23:11 crc kubenswrapper[4747]: I1205 22:23:11.492970 4747 generic.go:334] "Generic (PLEG): container finished" podID="7aa65be4-2b09-4c31-97fc-a220541d4009" containerID="16f47a536b51ba0db6e27b9f5fd84fc40199fa1eee621e9a261f18a58fd06eaf" exitCode=0 Dec 05 22:23:11 crc kubenswrapper[4747]: I1205 22:23:11.493059 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-j547l" event={"ID":"7aa65be4-2b09-4c31-97fc-a220541d4009","Type":"ContainerDied","Data":"16f47a536b51ba0db6e27b9f5fd84fc40199fa1eee621e9a261f18a58fd06eaf"} Dec 05 22:23:11 crc kubenswrapper[4747]: I1205 22:23:11.571132 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsmbs\" (UniqueName: \"kubernetes.io/projected/6cdaa0bb-8e27-4894-9cda-fde30f72c183-kube-api-access-fsmbs\") pod \"community-operators-kbckn\" (UID: \"6cdaa0bb-8e27-4894-9cda-fde30f72c183\") " pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:11 crc kubenswrapper[4747]: I1205 22:23:11.571475 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cdaa0bb-8e27-4894-9cda-fde30f72c183-utilities\") pod \"community-operators-kbckn\" (UID: \"6cdaa0bb-8e27-4894-9cda-fde30f72c183\") " pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:11 crc kubenswrapper[4747]: I1205 22:23:11.571597 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cdaa0bb-8e27-4894-9cda-fde30f72c183-catalog-content\") pod \"community-operators-kbckn\" (UID: \"6cdaa0bb-8e27-4894-9cda-fde30f72c183\") " pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:11 crc kubenswrapper[4747]: I1205 22:23:11.672965 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cdaa0bb-8e27-4894-9cda-fde30f72c183-catalog-content\") pod \"community-operators-kbckn\" (UID: \"6cdaa0bb-8e27-4894-9cda-fde30f72c183\") " pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:11 crc kubenswrapper[4747]: I1205 22:23:11.673147 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsmbs\" (UniqueName: \"kubernetes.io/projected/6cdaa0bb-8e27-4894-9cda-fde30f72c183-kube-api-access-fsmbs\") pod \"community-operators-kbckn\" (UID: \"6cdaa0bb-8e27-4894-9cda-fde30f72c183\") " pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:11 crc kubenswrapper[4747]: I1205 22:23:11.673194 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cdaa0bb-8e27-4894-9cda-fde30f72c183-utilities\") pod \"community-operators-kbckn\" (UID: \"6cdaa0bb-8e27-4894-9cda-fde30f72c183\") " pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:11 crc kubenswrapper[4747]: I1205 22:23:11.674708 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cdaa0bb-8e27-4894-9cda-fde30f72c183-catalog-content\") pod \"community-operators-kbckn\" (UID: \"6cdaa0bb-8e27-4894-9cda-fde30f72c183\") " pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:11 crc kubenswrapper[4747]: I1205 22:23:11.675003 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cdaa0bb-8e27-4894-9cda-fde30f72c183-utilities\") pod \"community-operators-kbckn\" (UID: \"6cdaa0bb-8e27-4894-9cda-fde30f72c183\") " pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:11 crc kubenswrapper[4747]: I1205 22:23:11.693369 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsmbs\" (UniqueName: \"kubernetes.io/projected/6cdaa0bb-8e27-4894-9cda-fde30f72c183-kube-api-access-fsmbs\") pod \"community-operators-kbckn\" (UID: \"6cdaa0bb-8e27-4894-9cda-fde30f72c183\") " pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:11 crc kubenswrapper[4747]: I1205 22:23:11.777356 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:13 crc kubenswrapper[4747]: I1205 22:23:13.622232 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kbckn"] Dec 05 22:23:13 crc kubenswrapper[4747]: W1205 22:23:13.626284 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cdaa0bb_8e27_4894_9cda_fde30f72c183.slice/crio-0fdb7bb117883446932b681638b36627283dea26013abcfe148f5eaf2a1c0d13 WatchSource:0}: Error finding container 0fdb7bb117883446932b681638b36627283dea26013abcfe148f5eaf2a1c0d13: Status 404 returned error can't find the container with id 0fdb7bb117883446932b681638b36627283dea26013abcfe148f5eaf2a1c0d13 Dec 05 22:23:14 crc kubenswrapper[4747]: I1205 22:23:14.537911 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-j547l" event={"ID":"7aa65be4-2b09-4c31-97fc-a220541d4009","Type":"ContainerStarted","Data":"7279f9f8caf8312ec1f8677a4e87d2e57648985e3a6b4a1d862cb55fab0fcb7c"} Dec 05 22:23:14 crc kubenswrapper[4747]: I1205 22:23:14.538835 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:14 crc kubenswrapper[4747]: I1205 22:23:14.542699 4747 generic.go:334] "Generic (PLEG): container finished" podID="20f32969-4fa1-4d35-84b3-becaa9c774ea" containerID="1841a5bb3a76e0e0fc3f14f288e339ba5b72f66c3eb53190180b916eb7f9c80d" exitCode=0 Dec 05 22:23:14 crc kubenswrapper[4747]: I1205 22:23:14.542776 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-4wx4f" event={"ID":"20f32969-4fa1-4d35-84b3-becaa9c774ea","Type":"ContainerDied","Data":"1841a5bb3a76e0e0fc3f14f288e339ba5b72f66c3eb53190180b916eb7f9c80d"} Dec 05 22:23:14 crc kubenswrapper[4747]: I1205 22:23:14.546096 4747 generic.go:334] "Generic (PLEG): container finished" podID="fb28b4bc-7d39-4d5f-af3e-d855503b94f3" containerID="5ea69792c8b2b4037168913bdd28143da51664710256f4f38315894042bfbca9" exitCode=0 Dec 05 22:23:14 crc kubenswrapper[4747]: I1205 22:23:14.546173 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-d7nzb" event={"ID":"fb28b4bc-7d39-4d5f-af3e-d855503b94f3","Type":"ContainerDied","Data":"5ea69792c8b2b4037168913bdd28143da51664710256f4f38315894042bfbca9"} Dec 05 22:23:14 crc kubenswrapper[4747]: I1205 22:23:14.552450 4747 generic.go:334] "Generic (PLEG): container finished" podID="6cdaa0bb-8e27-4894-9cda-fde30f72c183" containerID="261d1507c5ab2a91f0aaf8e36b5d2a1959d52f643c8142ea7bcadd0a33b16678" exitCode=0 Dec 05 22:23:14 crc kubenswrapper[4747]: I1205 22:23:14.552510 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbckn" event={"ID":"6cdaa0bb-8e27-4894-9cda-fde30f72c183","Type":"ContainerDied","Data":"261d1507c5ab2a91f0aaf8e36b5d2a1959d52f643c8142ea7bcadd0a33b16678"} Dec 05 22:23:14 crc kubenswrapper[4747]: I1205 22:23:14.552548 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbckn" event={"ID":"6cdaa0bb-8e27-4894-9cda-fde30f72c183","Type":"ContainerStarted","Data":"0fdb7bb117883446932b681638b36627283dea26013abcfe148f5eaf2a1c0d13"} Dec 05 22:23:14 crc kubenswrapper[4747]: I1205 22:23:14.563565 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-j547l" podStartSLOduration=8.563546027 podStartE2EDuration="8.563546027s" 
podCreationTimestamp="2025-12-05 22:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:23:14.556381119 +0000 UTC m=+6065.023688607" watchObservedRunningTime="2025-12-05 22:23:14.563546027 +0000 UTC m=+6065.030853515" Dec 05 22:23:15 crc kubenswrapper[4747]: I1205 22:23:15.563766 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-d7nzb" event={"ID":"fb28b4bc-7d39-4d5f-af3e-d855503b94f3","Type":"ContainerStarted","Data":"c44c4bcf420ed1d5e4abf73dd21ce9039bf0041eea35c2f001b54a2a8eb33131"} Dec 05 22:23:15 crc kubenswrapper[4747]: I1205 22:23:15.564308 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:15 crc kubenswrapper[4747]: I1205 22:23:15.567217 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbckn" event={"ID":"6cdaa0bb-8e27-4894-9cda-fde30f72c183","Type":"ContainerStarted","Data":"d84f8ccaa10506e117cd049445998ab0339a525eabfacab083bbebf5d2d489da"} Dec 05 22:23:15 crc kubenswrapper[4747]: I1205 22:23:15.569369 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-4wx4f" event={"ID":"20f32969-4fa1-4d35-84b3-becaa9c774ea","Type":"ContainerStarted","Data":"1181812f74ad9427ea461cb11e43d3e9bfdb5ab753cba17abc4ae449480d284d"} Dec 05 22:23:15 crc kubenswrapper[4747]: I1205 22:23:15.583132 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-d7nzb" podStartSLOduration=3.689371626 podStartE2EDuration="6.583111876s" podCreationTimestamp="2025-12-05 22:23:09 +0000 UTC" firstStartedPulling="2025-12-05 22:23:10.214578641 +0000 UTC m=+6060.681886129" lastFinishedPulling="2025-12-05 22:23:13.108318891 +0000 UTC m=+6063.575626379" observedRunningTime="2025-12-05 22:23:15.580142303 +0000 UTC m=+6066.047449821" watchObservedRunningTime="2025-12-05 22:23:15.583111876 +0000 UTC m=+6066.050419364" Dec 05 22:23:15 crc kubenswrapper[4747]: I1205 22:23:15.604132 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-4wx4f" podStartSLOduration=4.420778947 podStartE2EDuration="8.604116417s" podCreationTimestamp="2025-12-05 22:23:07 +0000 UTC" firstStartedPulling="2025-12-05 22:23:08.924791376 +0000 UTC m=+6059.392098874" lastFinishedPulling="2025-12-05 22:23:13.108128856 +0000 UTC m=+6063.575436344" observedRunningTime="2025-12-05 22:23:15.598029856 +0000 UTC m=+6066.065337344" watchObservedRunningTime="2025-12-05 22:23:15.604116417 +0000 UTC m=+6066.071423905" Dec 05 22:23:16 crc kubenswrapper[4747]: I1205 22:23:16.579462 4747 generic.go:334] "Generic (PLEG): container finished" podID="6cdaa0bb-8e27-4894-9cda-fde30f72c183" containerID="d84f8ccaa10506e117cd049445998ab0339a525eabfacab083bbebf5d2d489da" exitCode=0 Dec 05 22:23:16 crc kubenswrapper[4747]: I1205 22:23:16.580437 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbckn" event={"ID":"6cdaa0bb-8e27-4894-9cda-fde30f72c183","Type":"ContainerDied","Data":"d84f8ccaa10506e117cd049445998ab0339a525eabfacab083bbebf5d2d489da"} Dec 05 22:23:16 crc kubenswrapper[4747]: I1205 22:23:16.580904 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:17 crc kubenswrapper[4747]: I1205 22:23:17.594129 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-kbckn" event={"ID":"6cdaa0bb-8e27-4894-9cda-fde30f72c183","Type":"ContainerStarted","Data":"d166e10776cafeade8d873a6b047101f411ba26e74342f3466c85d31c8517eb0"} Dec 05 22:23:17 crc kubenswrapper[4747]: I1205 22:23:17.621617 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kbckn" podStartSLOduration=4.147215672 podStartE2EDuration="6.621598258s" podCreationTimestamp="2025-12-05 22:23:11 +0000 UTC" firstStartedPulling="2025-12-05 22:23:14.55599157 +0000 UTC m=+6065.023299078" lastFinishedPulling="2025-12-05 22:23:17.030374166 +0000 UTC m=+6067.497681664" observedRunningTime="2025-12-05 22:23:17.611319893 +0000 UTC m=+6068.078627391" watchObservedRunningTime="2025-12-05 22:23:17.621598258 +0000 UTC m=+6068.088905746" Dec 05 22:23:21 crc kubenswrapper[4747]: I1205 22:23:21.647837 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-j547l" Dec 05 22:23:21 crc kubenswrapper[4747]: I1205 22:23:21.777514 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:21 crc kubenswrapper[4747]: I1205 22:23:21.779077 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:21 crc kubenswrapper[4747]: I1205 22:23:21.853781 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:22 crc kubenswrapper[4747]: I1205 22:23:22.726657 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:23 crc kubenswrapper[4747]: I1205 22:23:23.323443 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-4wx4f" Dec 05 22:23:24 crc kubenswrapper[4747]: I1205 22:23:24.683274 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-d7nzb" Dec 05 22:23:26 crc kubenswrapper[4747]: I1205 22:23:26.822175 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kbckn"] Dec 05 22:23:26 crc kubenswrapper[4747]: I1205 22:23:26.822557 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kbckn" podUID="6cdaa0bb-8e27-4894-9cda-fde30f72c183" containerName="registry-server" containerID="cri-o://d166e10776cafeade8d873a6b047101f411ba26e74342f3466c85d31c8517eb0" gracePeriod=2 Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.438884 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.531702 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cdaa0bb-8e27-4894-9cda-fde30f72c183-catalog-content\") pod \"6cdaa0bb-8e27-4894-9cda-fde30f72c183\" (UID: \"6cdaa0bb-8e27-4894-9cda-fde30f72c183\") " Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.531928 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cdaa0bb-8e27-4894-9cda-fde30f72c183-utilities\") pod \"6cdaa0bb-8e27-4894-9cda-fde30f72c183\" (UID: \"6cdaa0bb-8e27-4894-9cda-fde30f72c183\") " Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.532059 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsmbs\" (UniqueName: \"kubernetes.io/projected/6cdaa0bb-8e27-4894-9cda-fde30f72c183-kube-api-access-fsmbs\") pod \"6cdaa0bb-8e27-4894-9cda-fde30f72c183\" (UID: \"6cdaa0bb-8e27-4894-9cda-fde30f72c183\") " Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.532827 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cdaa0bb-8e27-4894-9cda-fde30f72c183-utilities" (OuterVolumeSpecName: "utilities") pod "6cdaa0bb-8e27-4894-9cda-fde30f72c183" (UID: "6cdaa0bb-8e27-4894-9cda-fde30f72c183"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.540483 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cdaa0bb-8e27-4894-9cda-fde30f72c183-kube-api-access-fsmbs" (OuterVolumeSpecName: "kube-api-access-fsmbs") pod "6cdaa0bb-8e27-4894-9cda-fde30f72c183" (UID: "6cdaa0bb-8e27-4894-9cda-fde30f72c183"). InnerVolumeSpecName "kube-api-access-fsmbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.585663 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cdaa0bb-8e27-4894-9cda-fde30f72c183-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6cdaa0bb-8e27-4894-9cda-fde30f72c183" (UID: "6cdaa0bb-8e27-4894-9cda-fde30f72c183"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.635033 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsmbs\" (UniqueName: \"kubernetes.io/projected/6cdaa0bb-8e27-4894-9cda-fde30f72c183-kube-api-access-fsmbs\") on node \"crc\" DevicePath \"\"" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.635112 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cdaa0bb-8e27-4894-9cda-fde30f72c183-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.635140 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cdaa0bb-8e27-4894-9cda-fde30f72c183-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.702435 4747 generic.go:334] "Generic (PLEG): container finished" podID="6cdaa0bb-8e27-4894-9cda-fde30f72c183" containerID="d166e10776cafeade8d873a6b047101f411ba26e74342f3466c85d31c8517eb0" exitCode=0 Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.702519 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbckn" event={"ID":"6cdaa0bb-8e27-4894-9cda-fde30f72c183","Type":"ContainerDied","Data":"d166e10776cafeade8d873a6b047101f411ba26e74342f3466c85d31c8517eb0"} Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.702545 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kbckn" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.702565 4747 scope.go:117] "RemoveContainer" containerID="d166e10776cafeade8d873a6b047101f411ba26e74342f3466c85d31c8517eb0" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.702554 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbckn" event={"ID":"6cdaa0bb-8e27-4894-9cda-fde30f72c183","Type":"ContainerDied","Data":"0fdb7bb117883446932b681638b36627283dea26013abcfe148f5eaf2a1c0d13"} Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.742916 4747 scope.go:117] "RemoveContainer" containerID="d84f8ccaa10506e117cd049445998ab0339a525eabfacab083bbebf5d2d489da" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.756508 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kbckn"] Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.770873 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kbckn"] Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.779721 4747 scope.go:117] "RemoveContainer" containerID="261d1507c5ab2a91f0aaf8e36b5d2a1959d52f643c8142ea7bcadd0a33b16678" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.834963 4747 scope.go:117] "RemoveContainer" containerID="d166e10776cafeade8d873a6b047101f411ba26e74342f3466c85d31c8517eb0" Dec 05 22:23:27 crc kubenswrapper[4747]: E1205 22:23:27.835497 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d166e10776cafeade8d873a6b047101f411ba26e74342f3466c85d31c8517eb0\": container with ID starting with d166e10776cafeade8d873a6b047101f411ba26e74342f3466c85d31c8517eb0 not found: ID does not exist" containerID="d166e10776cafeade8d873a6b047101f411ba26e74342f3466c85d31c8517eb0" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.835554 
4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d166e10776cafeade8d873a6b047101f411ba26e74342f3466c85d31c8517eb0"} err="failed to get container status \"d166e10776cafeade8d873a6b047101f411ba26e74342f3466c85d31c8517eb0\": rpc error: code = NotFound desc = could not find container \"d166e10776cafeade8d873a6b047101f411ba26e74342f3466c85d31c8517eb0\": container with ID starting with d166e10776cafeade8d873a6b047101f411ba26e74342f3466c85d31c8517eb0 not found: ID does not exist" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.835741 4747 scope.go:117] "RemoveContainer" containerID="d84f8ccaa10506e117cd049445998ab0339a525eabfacab083bbebf5d2d489da" Dec 05 22:23:27 crc kubenswrapper[4747]: E1205 22:23:27.836250 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d84f8ccaa10506e117cd049445998ab0339a525eabfacab083bbebf5d2d489da\": container with ID starting with d84f8ccaa10506e117cd049445998ab0339a525eabfacab083bbebf5d2d489da not found: ID does not exist" containerID="d84f8ccaa10506e117cd049445998ab0339a525eabfacab083bbebf5d2d489da" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.836297 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d84f8ccaa10506e117cd049445998ab0339a525eabfacab083bbebf5d2d489da"} err="failed to get container status \"d84f8ccaa10506e117cd049445998ab0339a525eabfacab083bbebf5d2d489da\": rpc error: code = NotFound desc = could not find container \"d84f8ccaa10506e117cd049445998ab0339a525eabfacab083bbebf5d2d489da\": container with ID starting with d84f8ccaa10506e117cd049445998ab0339a525eabfacab083bbebf5d2d489da not found: ID does not exist" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.836329 4747 scope.go:117] "RemoveContainer" containerID="261d1507c5ab2a91f0aaf8e36b5d2a1959d52f643c8142ea7bcadd0a33b16678" Dec 05 22:23:27 crc kubenswrapper[4747]: E1205 22:23:27.836708 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"261d1507c5ab2a91f0aaf8e36b5d2a1959d52f643c8142ea7bcadd0a33b16678\": container with ID starting with 261d1507c5ab2a91f0aaf8e36b5d2a1959d52f643c8142ea7bcadd0a33b16678 not found: ID does not exist" containerID="261d1507c5ab2a91f0aaf8e36b5d2a1959d52f643c8142ea7bcadd0a33b16678" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.836742 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"261d1507c5ab2a91f0aaf8e36b5d2a1959d52f643c8142ea7bcadd0a33b16678"} err="failed to get container status \"261d1507c5ab2a91f0aaf8e36b5d2a1959d52f643c8142ea7bcadd0a33b16678\": rpc error: code = NotFound desc = could not find container \"261d1507c5ab2a91f0aaf8e36b5d2a1959d52f643c8142ea7bcadd0a33b16678\": container with ID starting with 261d1507c5ab2a91f0aaf8e36b5d2a1959d52f643c8142ea7bcadd0a33b16678 not found: ID does not exist" Dec 05 22:23:27 crc kubenswrapper[4747]: I1205 22:23:27.854224 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cdaa0bb-8e27-4894-9cda-fde30f72c183" path="/var/lib/kubelet/pods/6cdaa0bb-8e27-4894-9cda-fde30f72c183/volumes" Dec 05 22:23:54 crc kubenswrapper[4747]: I1205 22:23:54.066274 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6t6lp"] Dec 05 22:23:54 crc kubenswrapper[4747]: I1205 22:23:54.080982 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-db-create-6t6lp"] Dec 05 22:23:55 crc kubenswrapper[4747]: I1205 22:23:55.051030 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-f0b7-account-create-update-pjq57"] Dec 05 22:23:55 crc kubenswrapper[4747]: I1205 22:23:55.064296 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-f0b7-account-create-update-pjq57"] Dec 05 22:23:55 crc kubenswrapper[4747]: I1205 22:23:55.865745 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="498693e6-74d0-4f8c-bfe7-4c1b1d4167f7" path="/var/lib/kubelet/pods/498693e6-74d0-4f8c-bfe7-4c1b1d4167f7/volumes" Dec 05 22:23:55 crc kubenswrapper[4747]: I1205 22:23:55.869227 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1c15daa-ad48-4db6-bf6d-99de93a266f6" path="/var/lib/kubelet/pods/a1c15daa-ad48-4db6-bf6d-99de93a266f6/volumes" Dec 05 22:24:02 crc kubenswrapper[4747]: I1205 22:24:02.035213 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-6crtc"] Dec 05 22:24:02 crc kubenswrapper[4747]: I1205 22:24:02.049425 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-6crtc"] Dec 05 22:24:03 crc kubenswrapper[4747]: I1205 22:24:03.862366 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e875d4-1d9d-4245-90fc-900eae487b29" path="/var/lib/kubelet/pods/81e875d4-1d9d-4245-90fc-900eae487b29/volumes" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.641113 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-77cf976bc9-b9lfc"] Dec 05 22:24:18 crc kubenswrapper[4747]: E1205 22:24:18.643361 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cdaa0bb-8e27-4894-9cda-fde30f72c183" containerName="extract-utilities" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.643386 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cdaa0bb-8e27-4894-9cda-fde30f72c183" containerName="extract-utilities" Dec 05 22:24:18 crc kubenswrapper[4747]: E1205 22:24:18.643473 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cdaa0bb-8e27-4894-9cda-fde30f72c183" containerName="extract-content" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.643483 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cdaa0bb-8e27-4894-9cda-fde30f72c183" containerName="extract-content" Dec 05 22:24:18 crc kubenswrapper[4747]: E1205 22:24:18.643562 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cdaa0bb-8e27-4894-9cda-fde30f72c183" containerName="registry-server" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.643571 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cdaa0bb-8e27-4894-9cda-fde30f72c183" containerName="registry-server" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.644338 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cdaa0bb-8e27-4894-9cda-fde30f72c183" containerName="registry-server" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.645822 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.650442 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.650683 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.651176 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.651228 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-rzr69" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.666895 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77cf976bc9-b9lfc"] Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.725644 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.725858 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2371f26a-47e9-40d3-ae77-cd7a4810fee3" containerName="glance-log" containerID="cri-o://f66a336b0a6ac3bad02b0a3fca4beb05022ad4585d93c67c49d4e5b2bd0df5e4" gracePeriod=30 Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.726248 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2371f26a-47e9-40d3-ae77-cd7a4810fee3" containerName="glance-httpd" containerID="cri-o://773c06d7432bd5bb33c2966d73800183d2665781d36ab2f70880ea1c644c220b" gracePeriod=30 Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.761762 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-scripts\") pod \"horizon-77cf976bc9-b9lfc\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.761961 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2t6n\" (UniqueName: \"kubernetes.io/projected/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-kube-api-access-w2t6n\") pod \"horizon-77cf976bc9-b9lfc\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.762141 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-logs\") pod \"horizon-77cf976bc9-b9lfc\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.762223 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-horizon-secret-key\") pod \"horizon-77cf976bc9-b9lfc\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.762273 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-config-data\") pod \"horizon-77cf976bc9-b9lfc\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.784922 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5c69d469cc-bwxm6"] Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.786455 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.805722 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c69d469cc-bwxm6"] Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.840809 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.840991 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" containerName="glance-log" containerID="cri-o://a995c30578be67c81ce2c11d12d51e2655a163a8034402e517cca58f37735866" gracePeriod=30 Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.841351 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" containerName="glance-httpd" containerID="cri-o://5f2f6e50caf6b0c63bfc6deb165a46f6301ee7209445d4f62bad5c2da65fa3bb" gracePeriod=30 Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.864185 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-scripts\") pod \"horizon-77cf976bc9-b9lfc\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.864466 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2t6n\" (UniqueName: \"kubernetes.io/projected/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-kube-api-access-w2t6n\") pod \"horizon-77cf976bc9-b9lfc\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.864498 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b1624e91-42d7-4cf9-812f-231dab1d161e-config-data\") pod \"horizon-5c69d469cc-bwxm6\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.864520 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1624e91-42d7-4cf9-812f-231dab1d161e-scripts\") pod \"horizon-5c69d469cc-bwxm6\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.864557 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-logs\") pod \"horizon-77cf976bc9-b9lfc\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.864614 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-horizon-secret-key\") pod \"horizon-77cf976bc9-b9lfc\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.864631 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b1624e91-42d7-4cf9-812f-231dab1d161e-horizon-secret-key\") pod \"horizon-5c69d469cc-bwxm6\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.864657 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-config-data\") pod \"horizon-77cf976bc9-b9lfc\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.864717 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d9gh\" (UniqueName: \"kubernetes.io/projected/b1624e91-42d7-4cf9-812f-231dab1d161e-kube-api-access-7d9gh\") pod \"horizon-5c69d469cc-bwxm6\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.865051 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1624e91-42d7-4cf9-812f-231dab1d161e-logs\") pod \"horizon-5c69d469cc-bwxm6\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.865184 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-logs\") pod \"horizon-77cf976bc9-b9lfc\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.865668 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-scripts\") pod \"horizon-77cf976bc9-b9lfc\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.865952 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-config-data\") pod \"horizon-77cf976bc9-b9lfc\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.871078 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-horizon-secret-key\") pod \"horizon-77cf976bc9-b9lfc\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.885857 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2t6n\" (UniqueName: 
\"kubernetes.io/projected/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-kube-api-access-w2t6n\") pod \"horizon-77cf976bc9-b9lfc\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.967020 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b1624e91-42d7-4cf9-812f-231dab1d161e-config-data\") pod \"horizon-5c69d469cc-bwxm6\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.967062 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1624e91-42d7-4cf9-812f-231dab1d161e-scripts\") pod \"horizon-5c69d469cc-bwxm6\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.967117 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b1624e91-42d7-4cf9-812f-231dab1d161e-horizon-secret-key\") pod \"horizon-5c69d469cc-bwxm6\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.967197 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d9gh\" (UniqueName: \"kubernetes.io/projected/b1624e91-42d7-4cf9-812f-231dab1d161e-kube-api-access-7d9gh\") pod \"horizon-5c69d469cc-bwxm6\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.967240 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1624e91-42d7-4cf9-812f-231dab1d161e-logs\") pod \"horizon-5c69d469cc-bwxm6\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.968368 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1624e91-42d7-4cf9-812f-231dab1d161e-logs\") pod \"horizon-5c69d469cc-bwxm6\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.968744 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b1624e91-42d7-4cf9-812f-231dab1d161e-config-data\") pod \"horizon-5c69d469cc-bwxm6\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.968832 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1624e91-42d7-4cf9-812f-231dab1d161e-scripts\") pod \"horizon-5c69d469cc-bwxm6\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.972010 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b1624e91-42d7-4cf9-812f-231dab1d161e-horizon-secret-key\") pod \"horizon-5c69d469cc-bwxm6\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " pod="openstack/horizon-5c69d469cc-bwxm6" 
Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.982540 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d9gh\" (UniqueName: \"kubernetes.io/projected/b1624e91-42d7-4cf9-812f-231dab1d161e-kube-api-access-7d9gh\") pod \"horizon-5c69d469cc-bwxm6\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:18 crc kubenswrapper[4747]: I1205 22:24:18.983714 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:19 crc kubenswrapper[4747]: I1205 22:24:19.100916 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:19 crc kubenswrapper[4747]: I1205 22:24:19.264330 4747 generic.go:334] "Generic (PLEG): container finished" podID="54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" containerID="a995c30578be67c81ce2c11d12d51e2655a163a8034402e517cca58f37735866" exitCode=143 Dec 05 22:24:19 crc kubenswrapper[4747]: I1205 22:24:19.264389 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd","Type":"ContainerDied","Data":"a995c30578be67c81ce2c11d12d51e2655a163a8034402e517cca58f37735866"} Dec 05 22:24:19 crc kubenswrapper[4747]: I1205 22:24:19.266624 4747 generic.go:334] "Generic (PLEG): container finished" podID="2371f26a-47e9-40d3-ae77-cd7a4810fee3" containerID="f66a336b0a6ac3bad02b0a3fca4beb05022ad4585d93c67c49d4e5b2bd0df5e4" exitCode=143 Dec 05 22:24:19 crc kubenswrapper[4747]: I1205 22:24:19.266702 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2371f26a-47e9-40d3-ae77-cd7a4810fee3","Type":"ContainerDied","Data":"f66a336b0a6ac3bad02b0a3fca4beb05022ad4585d93c67c49d4e5b2bd0df5e4"} Dec 05 22:24:19 crc kubenswrapper[4747]: I1205 22:24:19.435694 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77cf976bc9-b9lfc"] Dec 05 22:24:19 crc kubenswrapper[4747]: W1205 22:24:19.580405 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1624e91_42d7_4cf9_812f_231dab1d161e.slice/crio-0d10bc99f10d381e0a1dc37f27c2908c57203f429105ec0b2498ea1335a92d05 WatchSource:0}: Error finding container 0d10bc99f10d381e0a1dc37f27c2908c57203f429105ec0b2498ea1335a92d05: Status 404 returned error can't find the container with id 0d10bc99f10d381e0a1dc37f27c2908c57203f429105ec0b2498ea1335a92d05 Dec 05 22:24:19 crc kubenswrapper[4747]: I1205 22:24:19.594915 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c69d469cc-bwxm6"] Dec 05 22:24:20 crc kubenswrapper[4747]: I1205 22:24:20.281115 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c69d469cc-bwxm6" event={"ID":"b1624e91-42d7-4cf9-812f-231dab1d161e","Type":"ContainerStarted","Data":"0d10bc99f10d381e0a1dc37f27c2908c57203f429105ec0b2498ea1335a92d05"} Dec 05 22:24:20 crc kubenswrapper[4747]: I1205 22:24:20.285681 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77cf976bc9-b9lfc" event={"ID":"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76","Type":"ContainerStarted","Data":"76a088c318fc474f282abbce29f71e84232fc09a1669bc03b188133356d76fc7"} Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.081378 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c69d469cc-bwxm6"] Dec 05 22:24:21 crc 
kubenswrapper[4747]: I1205 22:24:21.106128 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-65fbbfd684-8946q"] Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.108778 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.113852 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.120984 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65fbbfd684-8946q"] Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.229004 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-horizon-tls-certs\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.229047 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5252ace2-c979-4b87-80c6-efefb677ac17-config-data\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.229092 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-264wt\" (UniqueName: \"kubernetes.io/projected/5252ace2-c979-4b87-80c6-efefb677ac17-kube-api-access-264wt\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.229122 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-horizon-secret-key\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.229151 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-combined-ca-bundle\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.229266 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5252ace2-c979-4b87-80c6-efefb677ac17-logs\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.229301 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5252ace2-c979-4b87-80c6-efefb677ac17-scripts\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.241550 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-77cf976bc9-b9lfc"] Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.251090 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d6bfb9dd6-f9xzd"] Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.252890 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.263508 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d6bfb9dd6-f9xzd"] Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.331300 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-combined-ca-bundle\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.331355 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-logs\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.331381 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-horizon-tls-certs\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.331485 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-horizon-secret-key\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.331534 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5252ace2-c979-4b87-80c6-efefb677ac17-logs\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.331562 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-config-data\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.331614 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5252ace2-c979-4b87-80c6-efefb677ac17-scripts\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.331683 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-horizon-tls-certs\") pod \"horizon-65fbbfd684-8946q\" (UID: 
\"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.331711 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5252ace2-c979-4b87-80c6-efefb677ac17-config-data\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.331737 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkkdn\" (UniqueName: \"kubernetes.io/projected/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-kube-api-access-xkkdn\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.331779 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-264wt\" (UniqueName: \"kubernetes.io/projected/5252ace2-c979-4b87-80c6-efefb677ac17-kube-api-access-264wt\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.331802 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-combined-ca-bundle\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.331838 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-horizon-secret-key\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.331859 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-scripts\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.332469 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5252ace2-c979-4b87-80c6-efefb677ac17-logs\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.332628 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5252ace2-c979-4b87-80c6-efefb677ac17-scripts\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.333190 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5252ace2-c979-4b87-80c6-efefb677ac17-config-data\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc 
kubenswrapper[4747]: I1205 22:24:21.340292 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-horizon-tls-certs\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.340335 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-horizon-secret-key\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.340342 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-combined-ca-bundle\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.348384 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-264wt\" (UniqueName: \"kubernetes.io/projected/5252ace2-c979-4b87-80c6-efefb677ac17-kube-api-access-264wt\") pod \"horizon-65fbbfd684-8946q\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.434093 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-logs\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.434146 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-horizon-tls-certs\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.434256 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-horizon-secret-key\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.434316 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-config-data\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.434390 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkkdn\" (UniqueName: \"kubernetes.io/projected/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-kube-api-access-xkkdn\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.434433 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-combined-ca-bundle\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.434476 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-scripts\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.434514 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-logs\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.435960 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-scripts\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.435994 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-config-data\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.444156 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-combined-ca-bundle\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.446091 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-horizon-tls-certs\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.457101 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-horizon-secret-key\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.462099 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkkdn\" (UniqueName: \"kubernetes.io/projected/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-kube-api-access-xkkdn\") pod \"horizon-6d6bfb9dd6-f9xzd\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") " pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.475559 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.580105 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:21 crc kubenswrapper[4747]: I1205 22:24:21.986374 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65fbbfd684-8946q"] Dec 05 22:24:22 crc kubenswrapper[4747]: W1205 22:24:22.061110 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5252ace2_c979_4b87_80c6_efefb677ac17.slice/crio-f7c13c3865471e86c215252b966c6784f4e46bb9ff64f1a358dde7ba06508202 WatchSource:0}: Error finding container f7c13c3865471e86c215252b966c6784f4e46bb9ff64f1a358dde7ba06508202: Status 404 returned error can't find the container with id f7c13c3865471e86c215252b966c6784f4e46bb9ff64f1a358dde7ba06508202 Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.079111 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d6bfb9dd6-f9xzd"] Dec 05 22:24:22 crc kubenswrapper[4747]: W1205 22:24:22.116893 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48607b00_5383_46cb_9a0e_50f1b1b2cb6c.slice/crio-f254cf4b67254eadf16eebc84673b58722dfe28443cc5d394160ce6939c6ed13 WatchSource:0}: Error finding container f254cf4b67254eadf16eebc84673b58722dfe28443cc5d394160ce6939c6ed13: Status 404 returned error can't find the container with id f254cf4b67254eadf16eebc84673b58722dfe28443cc5d394160ce6939c6ed13 Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.303548 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d6bfb9dd6-f9xzd" event={"ID":"48607b00-5383-46cb-9a0e-50f1b1b2cb6c","Type":"ContainerStarted","Data":"f254cf4b67254eadf16eebc84673b58722dfe28443cc5d394160ce6939c6ed13"} Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.307486 4747 generic.go:334] "Generic (PLEG): container finished" podID="54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" containerID="5f2f6e50caf6b0c63bfc6deb165a46f6301ee7209445d4f62bad5c2da65fa3bb" exitCode=0 Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.307571 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd","Type":"ContainerDied","Data":"5f2f6e50caf6b0c63bfc6deb165a46f6301ee7209445d4f62bad5c2da65fa3bb"} Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.313793 4747 generic.go:334] "Generic (PLEG): container finished" podID="2371f26a-47e9-40d3-ae77-cd7a4810fee3" containerID="773c06d7432bd5bb33c2966d73800183d2665781d36ab2f70880ea1c644c220b" exitCode=0 Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.313863 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2371f26a-47e9-40d3-ae77-cd7a4810fee3","Type":"ContainerDied","Data":"773c06d7432bd5bb33c2966d73800183d2665781d36ab2f70880ea1c644c220b"} Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.315786 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65fbbfd684-8946q" event={"ID":"5252ace2-c979-4b87-80c6-efefb677ac17","Type":"ContainerStarted","Data":"f7c13c3865471e86c215252b966c6784f4e46bb9ff64f1a358dde7ba06508202"} Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.559847 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.671043 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn9jf\" (UniqueName: \"kubernetes.io/projected/2371f26a-47e9-40d3-ae77-cd7a4810fee3-kube-api-access-qn9jf\") pod \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.671094 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-combined-ca-bundle\") pod \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.671128 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-public-tls-certs\") pod \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.671219 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-scripts\") pod \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.671273 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2371f26a-47e9-40d3-ae77-cd7a4810fee3-httpd-run\") pod \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.671434 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2371f26a-47e9-40d3-ae77-cd7a4810fee3-logs\") pod \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.671455 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-config-data\") pod \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\" (UID: \"2371f26a-47e9-40d3-ae77-cd7a4810fee3\") " Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.672168 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2371f26a-47e9-40d3-ae77-cd7a4810fee3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2371f26a-47e9-40d3-ae77-cd7a4810fee3" (UID: "2371f26a-47e9-40d3-ae77-cd7a4810fee3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.672377 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2371f26a-47e9-40d3-ae77-cd7a4810fee3-logs" (OuterVolumeSpecName: "logs") pod "2371f26a-47e9-40d3-ae77-cd7a4810fee3" (UID: "2371f26a-47e9-40d3-ae77-cd7a4810fee3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.677068 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-scripts" (OuterVolumeSpecName: "scripts") pod "2371f26a-47e9-40d3-ae77-cd7a4810fee3" (UID: "2371f26a-47e9-40d3-ae77-cd7a4810fee3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.686868 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2371f26a-47e9-40d3-ae77-cd7a4810fee3-kube-api-access-qn9jf" (OuterVolumeSpecName: "kube-api-access-qn9jf") pod "2371f26a-47e9-40d3-ae77-cd7a4810fee3" (UID: "2371f26a-47e9-40d3-ae77-cd7a4810fee3"). InnerVolumeSpecName "kube-api-access-qn9jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.710986 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2371f26a-47e9-40d3-ae77-cd7a4810fee3" (UID: "2371f26a-47e9-40d3-ae77-cd7a4810fee3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.734492 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2371f26a-47e9-40d3-ae77-cd7a4810fee3" (UID: "2371f26a-47e9-40d3-ae77-cd7a4810fee3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.756777 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-config-data" (OuterVolumeSpecName: "config-data") pod "2371f26a-47e9-40d3-ae77-cd7a4810fee3" (UID: "2371f26a-47e9-40d3-ae77-cd7a4810fee3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.775247 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.775295 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2371f26a-47e9-40d3-ae77-cd7a4810fee3-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.775305 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.775313 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2371f26a-47e9-40d3-ae77-cd7a4810fee3-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.775342 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn9jf\" (UniqueName: \"kubernetes.io/projected/2371f26a-47e9-40d3-ae77-cd7a4810fee3-kube-api-access-qn9jf\") on node \"crc\" DevicePath \"\"" Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.775353 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:24:22 crc kubenswrapper[4747]: I1205 22:24:22.775361 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2371f26a-47e9-40d3-ae77-cd7a4810fee3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.328312 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2371f26a-47e9-40d3-ae77-cd7a4810fee3","Type":"ContainerDied","Data":"55b0ec33de6b756d63825aeddbe0ccb3e4c8bf016b840210a9d456324648f524"} Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.328357 4747 scope.go:117] "RemoveContainer" containerID="773c06d7432bd5bb33c2966d73800183d2665781d36ab2f70880ea1c644c220b" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.328484 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.425313 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.447695 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.461171 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 22:24:23 crc kubenswrapper[4747]: E1205 22:24:23.461637 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2371f26a-47e9-40d3-ae77-cd7a4810fee3" containerName="glance-httpd" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.461655 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2371f26a-47e9-40d3-ae77-cd7a4810fee3" containerName="glance-httpd" Dec 05 22:24:23 crc kubenswrapper[4747]: E1205 22:24:23.461672 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2371f26a-47e9-40d3-ae77-cd7a4810fee3" containerName="glance-log" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.461679 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2371f26a-47e9-40d3-ae77-cd7a4810fee3" containerName="glance-log" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.461877 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2371f26a-47e9-40d3-ae77-cd7a4810fee3" containerName="glance-log" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.461898 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2371f26a-47e9-40d3-ae77-cd7a4810fee3" containerName="glance-httpd" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.464732 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.467914 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.469900 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.470075 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 22:24:23 crc kubenswrapper[4747]: E1205 22:24:23.544097 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2371f26a_47e9_40d3_ae77_cd7a4810fee3.slice/crio-55b0ec33de6b756d63825aeddbe0ccb3e4c8bf016b840210a9d456324648f524\": RecentStats: unable to find data in memory cache]" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.596528 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9wqp\" (UniqueName: \"kubernetes.io/projected/fc8984e2-423b-4534-b56c-92b94bcb4155-kube-api-access-c9wqp\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.596630 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8984e2-423b-4534-b56c-92b94bcb4155-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.596824 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8984e2-423b-4534-b56c-92b94bcb4155-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.596936 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc8984e2-423b-4534-b56c-92b94bcb4155-logs\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.596999 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8984e2-423b-4534-b56c-92b94bcb4155-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.597074 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8984e2-423b-4534-b56c-92b94bcb4155-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.597110 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc8984e2-423b-4534-b56c-92b94bcb4155-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.700967 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8984e2-423b-4534-b56c-92b94bcb4155-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.701438 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8984e2-423b-4534-b56c-92b94bcb4155-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.701478 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc8984e2-423b-4534-b56c-92b94bcb4155-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.701604 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9wqp\" (UniqueName: \"kubernetes.io/projected/fc8984e2-423b-4534-b56c-92b94bcb4155-kube-api-access-c9wqp\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.701635 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8984e2-423b-4534-b56c-92b94bcb4155-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.701775 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8984e2-423b-4534-b56c-92b94bcb4155-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.702227 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc8984e2-423b-4534-b56c-92b94bcb4155-logs\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.702626 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc8984e2-423b-4534-b56c-92b94bcb4155-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.702863 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fc8984e2-423b-4534-b56c-92b94bcb4155-logs\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.711771 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8984e2-423b-4534-b56c-92b94bcb4155-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.711982 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8984e2-423b-4534-b56c-92b94bcb4155-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.712691 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8984e2-423b-4534-b56c-92b94bcb4155-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.713299 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8984e2-423b-4534-b56c-92b94bcb4155-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.725096 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9wqp\" (UniqueName: \"kubernetes.io/projected/fc8984e2-423b-4534-b56c-92b94bcb4155-kube-api-access-c9wqp\") pod \"glance-default-external-api-0\" (UID: \"fc8984e2-423b-4534-b56c-92b94bcb4155\") " pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.794912 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 22:24:23 crc kubenswrapper[4747]: I1205 22:24:23.852178 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2371f26a-47e9-40d3-ae77-cd7a4810fee3" path="/var/lib/kubelet/pods/2371f26a-47e9-40d3-ae77-cd7a4810fee3/volumes" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.056065 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-d25xx"] Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.059351 4747 scope.go:117] "RemoveContainer" containerID="f66a336b0a6ac3bad02b0a3fca4beb05022ad4585d93c67c49d4e5b2bd0df5e4" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.071509 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-38fa-account-create-update-s2gtp"] Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.105543 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-38fa-account-create-update-s2gtp"] Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.126132 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-d25xx"] Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.207782 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.303009 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-logs\") pod \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.303060 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-httpd-run\") pod \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.303122 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-scripts\") pod \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.303145 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-config-data\") pod \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.303312 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmsnp\" (UniqueName: \"kubernetes.io/projected/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-kube-api-access-hmsnp\") pod \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.303346 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-combined-ca-bundle\") pod \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.303387 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-internal-tls-certs\") pod \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\" (UID: \"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd\") " Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.303655 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" (UID: "54381b7b-e3f0-4729-b0d6-5a6fb055e9bd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.303882 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-logs" (OuterVolumeSpecName: "logs") pod "54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" (UID: "54381b7b-e3f0-4729-b0d6-5a6fb055e9bd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.305145 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.305195 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.309919 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-kube-api-access-hmsnp" (OuterVolumeSpecName: "kube-api-access-hmsnp") pod "54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" (UID: "54381b7b-e3f0-4729-b0d6-5a6fb055e9bd"). InnerVolumeSpecName "kube-api-access-hmsnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.310899 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-scripts" (OuterVolumeSpecName: "scripts") pod "54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" (UID: "54381b7b-e3f0-4729-b0d6-5a6fb055e9bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.404310 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54381b7b-e3f0-4729-b0d6-5a6fb055e9bd","Type":"ContainerDied","Data":"fab77c22f6475cabe7a8c0eee8fd75649891ccf3a0e39b506185865f5920cb44"} Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.404347 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.404368 4747 scope.go:117] "RemoveContainer" containerID="5f2f6e50caf6b0c63bfc6deb165a46f6301ee7209445d4f62bad5c2da65fa3bb" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.420263 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmsnp\" (UniqueName: \"kubernetes.io/projected/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-kube-api-access-hmsnp\") on node \"crc\" DevicePath \"\"" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.420329 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.485615 4747 scope.go:117] "RemoveContainer" containerID="a995c30578be67c81ce2c11d12d51e2655a163a8034402e517cca58f37735866" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.492457 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" (UID: "54381b7b-e3f0-4729-b0d6-5a6fb055e9bd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.522225 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.601913 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-config-data" (OuterVolumeSpecName: "config-data") pod "54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" (UID: "54381b7b-e3f0-4729-b0d6-5a6fb055e9bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.608536 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" (UID: "54381b7b-e3f0-4729-b0d6-5a6fb055e9bd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.627456 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.627676 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.653926 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.756236 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.766965 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.776943 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 22:24:28 crc kubenswrapper[4747]: E1205 22:24:28.777426 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" containerName="glance-log" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.777450 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" containerName="glance-log" Dec 05 22:24:28 crc kubenswrapper[4747]: E1205 22:24:28.777466 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" containerName="glance-httpd" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.777475 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" containerName="glance-httpd" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.777712 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" containerName="glance-httpd" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.777748 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" 
containerName="glance-log" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.779035 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.786497 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.786568 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.786737 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.947961 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64efaafd-f823-4a41-ae6c-fd150138ebda-logs\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.948077 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64efaafd-f823-4a41-ae6c-fd150138ebda-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.948132 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7p72\" (UniqueName: \"kubernetes.io/projected/64efaafd-f823-4a41-ae6c-fd150138ebda-kube-api-access-r7p72\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.948191 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64efaafd-f823-4a41-ae6c-fd150138ebda-scripts\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.948210 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64efaafd-f823-4a41-ae6c-fd150138ebda-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.948266 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64efaafd-f823-4a41-ae6c-fd150138ebda-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:28 crc kubenswrapper[4747]: I1205 22:24:28.948337 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64efaafd-f823-4a41-ae6c-fd150138ebda-config-data\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " 
pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.050366 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64efaafd-f823-4a41-ae6c-fd150138ebda-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.050475 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64efaafd-f823-4a41-ae6c-fd150138ebda-config-data\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.050510 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64efaafd-f823-4a41-ae6c-fd150138ebda-logs\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.050564 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64efaafd-f823-4a41-ae6c-fd150138ebda-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.050619 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7p72\" (UniqueName: \"kubernetes.io/projected/64efaafd-f823-4a41-ae6c-fd150138ebda-kube-api-access-r7p72\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.050669 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64efaafd-f823-4a41-ae6c-fd150138ebda-scripts\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.050686 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64efaafd-f823-4a41-ae6c-fd150138ebda-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.051419 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64efaafd-f823-4a41-ae6c-fd150138ebda-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.052571 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64efaafd-f823-4a41-ae6c-fd150138ebda-logs\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.054456 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64efaafd-f823-4a41-ae6c-fd150138ebda-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.054539 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64efaafd-f823-4a41-ae6c-fd150138ebda-scripts\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.055189 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64efaafd-f823-4a41-ae6c-fd150138ebda-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.057398 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64efaafd-f823-4a41-ae6c-fd150138ebda-config-data\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.067008 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7p72\" (UniqueName: \"kubernetes.io/projected/64efaafd-f823-4a41-ae6c-fd150138ebda-kube-api-access-r7p72\") pod \"glance-default-internal-api-0\" (UID: \"64efaafd-f823-4a41-ae6c-fd150138ebda\") " pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.154066 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.425334 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65fbbfd684-8946q" event={"ID":"5252ace2-c979-4b87-80c6-efefb677ac17","Type":"ContainerStarted","Data":"58ed172a06884d3738e7531dce126180532bbc46bcb6528cb3720a779da60796"} Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.425557 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65fbbfd684-8946q" event={"ID":"5252ace2-c979-4b87-80c6-efefb677ac17","Type":"ContainerStarted","Data":"796e2c96a98e8cbcdcce497296b38b9fc18ef173751593982c138006c0e42258"} Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.429675 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d6bfb9dd6-f9xzd" event={"ID":"48607b00-5383-46cb-9a0e-50f1b1b2cb6c","Type":"ContainerStarted","Data":"cca07c67efbe86f856864731c3e7e175e1af7b9e35689055628e804307812f13"} Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.429712 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d6bfb9dd6-f9xzd" event={"ID":"48607b00-5383-46cb-9a0e-50f1b1b2cb6c","Type":"ContainerStarted","Data":"ecd6b6af989a58fcf969fabaa59cac445b385a5ee906a698bc3c47ac254ebcaf"} Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.431831 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc8984e2-423b-4534-b56c-92b94bcb4155","Type":"ContainerStarted","Data":"82d731cbedb7160f54d6f6e140e2474c282a2e50fbe931885a7a19d7190a4fe8"} Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.434227 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c69d469cc-bwxm6" event={"ID":"b1624e91-42d7-4cf9-812f-231dab1d161e","Type":"ContainerStarted","Data":"3bbc013cb090d13319e1833b27ebd7d979d09f7cac81e6a9f4a0fd23dc21099d"} Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.434245 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c69d469cc-bwxm6" event={"ID":"b1624e91-42d7-4cf9-812f-231dab1d161e","Type":"ContainerStarted","Data":"cdf4258d83ccf77782222800ca06b83ecab290d012d0dbe45fe8ffbe6497e58b"} Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.434329 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c69d469cc-bwxm6" podUID="b1624e91-42d7-4cf9-812f-231dab1d161e" containerName="horizon-log" containerID="cri-o://cdf4258d83ccf77782222800ca06b83ecab290d012d0dbe45fe8ffbe6497e58b" gracePeriod=30 Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.434618 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c69d469cc-bwxm6" podUID="b1624e91-42d7-4cf9-812f-231dab1d161e" containerName="horizon" containerID="cri-o://3bbc013cb090d13319e1833b27ebd7d979d09f7cac81e6a9f4a0fd23dc21099d" gracePeriod=30 Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.437252 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77cf976bc9-b9lfc" event={"ID":"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76","Type":"ContainerStarted","Data":"ec2a331a86ae89e08027ea8c0926de156588edd0f3891883477df5f56fc92e3b"} Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.437271 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77cf976bc9-b9lfc" 
event={"ID":"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76","Type":"ContainerStarted","Data":"43d95ddf8ddca41f5b219484cd551d03cea0a69ad0a240973bf2c0215e440cad"} Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.437339 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-77cf976bc9-b9lfc" podUID="6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" containerName="horizon-log" containerID="cri-o://43d95ddf8ddca41f5b219484cd551d03cea0a69ad0a240973bf2c0215e440cad" gracePeriod=30 Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.437391 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-77cf976bc9-b9lfc" podUID="6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" containerName="horizon" containerID="cri-o://ec2a331a86ae89e08027ea8c0926de156588edd0f3891883477df5f56fc92e3b" gracePeriod=30 Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.488058 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-65fbbfd684-8946q" podStartSLOduration=2.3891118479999998 podStartE2EDuration="8.488042646s" podCreationTimestamp="2025-12-05 22:24:21 +0000 UTC" firstStartedPulling="2025-12-05 22:24:22.072687012 +0000 UTC m=+6132.539994500" lastFinishedPulling="2025-12-05 22:24:28.17161781 +0000 UTC m=+6138.638925298" observedRunningTime="2025-12-05 22:24:29.461019176 +0000 UTC m=+6139.928326654" watchObservedRunningTime="2025-12-05 22:24:29.488042646 +0000 UTC m=+6139.955350134" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.496922 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5c69d469cc-bwxm6" podStartSLOduration=2.828369162 podStartE2EDuration="11.496906095s" podCreationTimestamp="2025-12-05 22:24:18 +0000 UTC" firstStartedPulling="2025-12-05 22:24:19.583867129 +0000 UTC m=+6130.051174607" lastFinishedPulling="2025-12-05 22:24:28.252404052 +0000 UTC m=+6138.719711540" observedRunningTime="2025-12-05 22:24:29.492873235 +0000 UTC m=+6139.960180723" watchObservedRunningTime="2025-12-05 22:24:29.496906095 +0000 UTC m=+6139.964213583" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.539179 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6d6bfb9dd6-f9xzd" podStartSLOduration=2.409968316 podStartE2EDuration="8.539159143s" podCreationTimestamp="2025-12-05 22:24:21 +0000 UTC" firstStartedPulling="2025-12-05 22:24:22.123384189 +0000 UTC m=+6132.590691677" lastFinishedPulling="2025-12-05 22:24:28.252575016 +0000 UTC m=+6138.719882504" observedRunningTime="2025-12-05 22:24:29.515817254 +0000 UTC m=+6139.983124742" watchObservedRunningTime="2025-12-05 22:24:29.539159143 +0000 UTC m=+6140.006466631" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.560494 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-77cf976bc9-b9lfc" podStartSLOduration=2.829424059 podStartE2EDuration="11.560475211s" podCreationTimestamp="2025-12-05 22:24:18 +0000 UTC" firstStartedPulling="2025-12-05 22:24:19.435898482 +0000 UTC m=+6129.903205970" lastFinishedPulling="2025-12-05 22:24:28.166949634 +0000 UTC m=+6138.634257122" observedRunningTime="2025-12-05 22:24:29.552168045 +0000 UTC m=+6140.019475533" watchObservedRunningTime="2025-12-05 22:24:29.560475211 +0000 UTC m=+6140.027782699" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.748237 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 22:24:29 crc 
kubenswrapper[4747]: W1205 22:24:29.752436 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64efaafd_f823_4a41_ae6c_fd150138ebda.slice/crio-0608675d62ae904ca46d6b7bbdeb7df199eb6a681d017b0746d749f98a30b4ed WatchSource:0}: Error finding container 0608675d62ae904ca46d6b7bbdeb7df199eb6a681d017b0746d749f98a30b4ed: Status 404 returned error can't find the container with id 0608675d62ae904ca46d6b7bbdeb7df199eb6a681d017b0746d749f98a30b4ed Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.854195 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41e21ba2-529c-4cca-8819-a8b6e5cdbc05" path="/var/lib/kubelet/pods/41e21ba2-529c-4cca-8819-a8b6e5cdbc05/volumes" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.854851 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54381b7b-e3f0-4729-b0d6-5a6fb055e9bd" path="/var/lib/kubelet/pods/54381b7b-e3f0-4729-b0d6-5a6fb055e9bd/volumes" Dec 05 22:24:29 crc kubenswrapper[4747]: I1205 22:24:29.855632 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cdf17de-526f-457d-a16f-ef36b315ba94" path="/var/lib/kubelet/pods/5cdf17de-526f-457d-a16f-ef36b315ba94/volumes" Dec 05 22:24:30 crc kubenswrapper[4747]: I1205 22:24:30.451573 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"64efaafd-f823-4a41-ae6c-fd150138ebda","Type":"ContainerStarted","Data":"e6484e49cc2c103ba199b0858fc726af26bf9d2aba9fc5e94a0f87ffd71c4f4f"} Dec 05 22:24:30 crc kubenswrapper[4747]: I1205 22:24:30.451643 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"64efaafd-f823-4a41-ae6c-fd150138ebda","Type":"ContainerStarted","Data":"0608675d62ae904ca46d6b7bbdeb7df199eb6a681d017b0746d749f98a30b4ed"} Dec 05 22:24:30 crc kubenswrapper[4747]: I1205 22:24:30.474636 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc8984e2-423b-4534-b56c-92b94bcb4155","Type":"ContainerStarted","Data":"1a603c568427d860d28681233902fc01faef8898f83de292d0aa0d82ee8a655b"} Dec 05 22:24:30 crc kubenswrapper[4747]: I1205 22:24:30.474686 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc8984e2-423b-4534-b56c-92b94bcb4155","Type":"ContainerStarted","Data":"91c0983ebfa3569bed16bcb719f32907739c172eec8375c03c8a2584fca3ce9c"} Dec 05 22:24:30 crc kubenswrapper[4747]: I1205 22:24:30.509879 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.509861141 podStartE2EDuration="7.509861141s" podCreationTimestamp="2025-12-05 22:24:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:24:30.503234297 +0000 UTC m=+6140.970541805" watchObservedRunningTime="2025-12-05 22:24:30.509861141 +0000 UTC m=+6140.977168629" Dec 05 22:24:30 crc kubenswrapper[4747]: I1205 22:24:30.870149 4747 scope.go:117] "RemoveContainer" containerID="3605a048a1629283de8e12903ac60bad0b59b43abe5ccdd63cbd741599ce5ce1" Dec 05 22:24:30 crc kubenswrapper[4747]: I1205 22:24:30.903277 4747 scope.go:117] "RemoveContainer" containerID="7d3a172e63dd2f7ad37bd272427476c676ea65cc7fa822e4ece272edbef8773a" Dec 05 22:24:30 crc kubenswrapper[4747]: I1205 22:24:30.980123 4747 scope.go:117] 
"RemoveContainer" containerID="0f82f9613d3295dc737704166f084fa237a78f0d9d66a01cfbbc7798e159d966" Dec 05 22:24:31 crc kubenswrapper[4747]: I1205 22:24:31.016342 4747 scope.go:117] "RemoveContainer" containerID="d54c581b75a7ed90484a5bda424c12287dea94aad342b043bbcedbbc2807a614" Dec 05 22:24:31 crc kubenswrapper[4747]: I1205 22:24:31.065626 4747 scope.go:117] "RemoveContainer" containerID="a6a8d606172a1a1a6c25703dab919a8371ff2cdfbb8828a23f921253b9f04bcc" Dec 05 22:24:31 crc kubenswrapper[4747]: I1205 22:24:31.476018 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:31 crc kubenswrapper[4747]: I1205 22:24:31.476087 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:31 crc kubenswrapper[4747]: I1205 22:24:31.496416 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"64efaafd-f823-4a41-ae6c-fd150138ebda","Type":"ContainerStarted","Data":"da16a36bf4e4fac72b6866af7a616c9a100927c30b7bdb88c9a9d11d5ea8df40"} Dec 05 22:24:31 crc kubenswrapper[4747]: I1205 22:24:31.519962 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.519944195 podStartE2EDuration="3.519944195s" podCreationTimestamp="2025-12-05 22:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:24:31.514766127 +0000 UTC m=+6141.982073615" watchObservedRunningTime="2025-12-05 22:24:31.519944195 +0000 UTC m=+6141.987251683" Dec 05 22:24:31 crc kubenswrapper[4747]: I1205 22:24:31.581379 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:31 crc kubenswrapper[4747]: I1205 22:24:31.581436 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:33 crc kubenswrapper[4747]: I1205 22:24:33.796013 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 22:24:33 crc kubenswrapper[4747]: I1205 22:24:33.796435 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 22:24:33 crc kubenswrapper[4747]: I1205 22:24:33.827575 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 22:24:33 crc kubenswrapper[4747]: I1205 22:24:33.876787 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 22:24:34 crc kubenswrapper[4747]: I1205 22:24:34.520889 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 22:24:34 crc kubenswrapper[4747]: I1205 22:24:34.520927 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 22:24:36 crc kubenswrapper[4747]: I1205 22:24:36.027952 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-skw6n"] Dec 05 22:24:36 crc kubenswrapper[4747]: I1205 22:24:36.038604 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-skw6n"] Dec 05 22:24:36 crc kubenswrapper[4747]: I1205 22:24:36.221508 4747 patch_prober.go:28] interesting 
pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:24:36 crc kubenswrapper[4747]: I1205 22:24:36.221569 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:24:36 crc kubenswrapper[4747]: I1205 22:24:36.811339 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 22:24:36 crc kubenswrapper[4747]: I1205 22:24:36.825002 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 22:24:37 crc kubenswrapper[4747]: I1205 22:24:37.852126 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73166647-5821-41a1-a727-2d22d24927c2" path="/var/lib/kubelet/pods/73166647-5821-41a1-a727-2d22d24927c2/volumes" Dec 05 22:24:38 crc kubenswrapper[4747]: I1205 22:24:38.984188 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:24:39 crc kubenswrapper[4747]: I1205 22:24:39.101413 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:39 crc kubenswrapper[4747]: I1205 22:24:39.155296 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 22:24:39 crc kubenswrapper[4747]: I1205 22:24:39.155354 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 22:24:39 crc kubenswrapper[4747]: I1205 22:24:39.210418 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 22:24:39 crc kubenswrapper[4747]: I1205 22:24:39.210793 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 22:24:39 crc kubenswrapper[4747]: I1205 22:24:39.577147 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 22:24:39 crc kubenswrapper[4747]: I1205 22:24:39.577207 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 22:24:41 crc kubenswrapper[4747]: I1205 22:24:41.479478 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-65fbbfd684-8946q" podUID="5252ace2-c979-4b87-80c6-efefb677ac17" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.115:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8443: connect: connection refused" Dec 05 22:24:41 crc kubenswrapper[4747]: I1205 22:24:41.583349 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6d6bfb9dd6-f9xzd" podUID="48607b00-5383-46cb-9a0e-50f1b1b2cb6c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.116:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.116:8443: connect: connection refused" Dec 05 22:24:41 crc kubenswrapper[4747]: I1205 22:24:41.615002 4747 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 22:24:41 crc kubenswrapper[4747]: I1205 22:24:41.615095 4747 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 22:24:42 crc kubenswrapper[4747]: I1205 22:24:42.175013 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 22:24:53 crc kubenswrapper[4747]: I1205 22:24:53.366645 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:53 crc kubenswrapper[4747]: I1205 22:24:53.392208 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:55 crc kubenswrapper[4747]: I1205 22:24:55.043561 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:24:55 crc kubenswrapper[4747]: I1205 22:24:55.118647 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6d6bfb9dd6-f9xzd" Dec 05 22:24:55 crc kubenswrapper[4747]: I1205 22:24:55.192260 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-65fbbfd684-8946q"] Dec 05 22:24:55 crc kubenswrapper[4747]: I1205 22:24:55.725134 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-65fbbfd684-8946q" podUID="5252ace2-c979-4b87-80c6-efefb677ac17" containerName="horizon-log" containerID="cri-o://796e2c96a98e8cbcdcce497296b38b9fc18ef173751593982c138006c0e42258" gracePeriod=30 Dec 05 22:24:55 crc kubenswrapper[4747]: I1205 22:24:55.725231 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-65fbbfd684-8946q" podUID="5252ace2-c979-4b87-80c6-efefb677ac17" containerName="horizon" containerID="cri-o://58ed172a06884d3738e7531dce126180532bbc46bcb6528cb3720a779da60796" gracePeriod=30 Dec 05 22:24:59 crc kubenswrapper[4747]: I1205 22:24:59.767594 4747 generic.go:334] "Generic (PLEG): container finished" podID="6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" containerID="ec2a331a86ae89e08027ea8c0926de156588edd0f3891883477df5f56fc92e3b" exitCode=137 Dec 05 22:24:59 crc kubenswrapper[4747]: I1205 22:24:59.768275 4747 generic.go:334] "Generic (PLEG): container finished" podID="6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" containerID="43d95ddf8ddca41f5b219484cd551d03cea0a69ad0a240973bf2c0215e440cad" exitCode=137 Dec 05 22:24:59 crc kubenswrapper[4747]: I1205 22:24:59.767812 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77cf976bc9-b9lfc" event={"ID":"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76","Type":"ContainerDied","Data":"ec2a331a86ae89e08027ea8c0926de156588edd0f3891883477df5f56fc92e3b"} Dec 05 22:24:59 crc kubenswrapper[4747]: I1205 22:24:59.768375 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77cf976bc9-b9lfc" event={"ID":"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76","Type":"ContainerDied","Data":"43d95ddf8ddca41f5b219484cd551d03cea0a69ad0a240973bf2c0215e440cad"} Dec 05 22:24:59 crc kubenswrapper[4747]: I1205 22:24:59.772232 4747 generic.go:334] "Generic (PLEG): container finished" podID="b1624e91-42d7-4cf9-812f-231dab1d161e" containerID="3bbc013cb090d13319e1833b27ebd7d979d09f7cac81e6a9f4a0fd23dc21099d" exitCode=137 Dec 05 22:24:59 crc kubenswrapper[4747]: I1205 22:24:59.772258 4747 generic.go:334] "Generic (PLEG): container finished" 
podID="b1624e91-42d7-4cf9-812f-231dab1d161e" containerID="cdf4258d83ccf77782222800ca06b83ecab290d012d0dbe45fe8ffbe6497e58b" exitCode=137 Dec 05 22:24:59 crc kubenswrapper[4747]: I1205 22:24:59.772310 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c69d469cc-bwxm6" event={"ID":"b1624e91-42d7-4cf9-812f-231dab1d161e","Type":"ContainerDied","Data":"3bbc013cb090d13319e1833b27ebd7d979d09f7cac81e6a9f4a0fd23dc21099d"} Dec 05 22:24:59 crc kubenswrapper[4747]: I1205 22:24:59.772354 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c69d469cc-bwxm6" event={"ID":"b1624e91-42d7-4cf9-812f-231dab1d161e","Type":"ContainerDied","Data":"cdf4258d83ccf77782222800ca06b83ecab290d012d0dbe45fe8ffbe6497e58b"} Dec 05 22:24:59 crc kubenswrapper[4747]: I1205 22:24:59.780951 4747 generic.go:334] "Generic (PLEG): container finished" podID="5252ace2-c979-4b87-80c6-efefb677ac17" containerID="58ed172a06884d3738e7531dce126180532bbc46bcb6528cb3720a779da60796" exitCode=0 Dec 05 22:24:59 crc kubenswrapper[4747]: I1205 22:24:59.780992 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65fbbfd684-8946q" event={"ID":"5252ace2-c979-4b87-80c6-efefb677ac17","Type":"ContainerDied","Data":"58ed172a06884d3738e7531dce126180532bbc46bcb6528cb3720a779da60796"} Dec 05 22:24:59 crc kubenswrapper[4747]: I1205 22:24:59.927873 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:24:59 crc kubenswrapper[4747]: I1205 22:24:59.948284 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.060503 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d9gh\" (UniqueName: \"kubernetes.io/projected/b1624e91-42d7-4cf9-812f-231dab1d161e-kube-api-access-7d9gh\") pod \"b1624e91-42d7-4cf9-812f-231dab1d161e\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.060606 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-horizon-secret-key\") pod \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.060631 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1624e91-42d7-4cf9-812f-231dab1d161e-logs\") pod \"b1624e91-42d7-4cf9-812f-231dab1d161e\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.060673 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1624e91-42d7-4cf9-812f-231dab1d161e-scripts\") pod \"b1624e91-42d7-4cf9-812f-231dab1d161e\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.060698 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b1624e91-42d7-4cf9-812f-231dab1d161e-horizon-secret-key\") pod \"b1624e91-42d7-4cf9-812f-231dab1d161e\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.060801 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-config-data\") pod \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.060855 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-scripts\") pod \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.060897 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b1624e91-42d7-4cf9-812f-231dab1d161e-config-data\") pod \"b1624e91-42d7-4cf9-812f-231dab1d161e\" (UID: \"b1624e91-42d7-4cf9-812f-231dab1d161e\") " Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.060923 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-logs\") pod \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.060993 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2t6n\" (UniqueName: \"kubernetes.io/projected/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-kube-api-access-w2t6n\") pod \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\" (UID: \"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76\") " Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.062637 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-logs" (OuterVolumeSpecName: "logs") pod "6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" (UID: "6cde40e4-10aa-4bd3-a69e-cdbaaf47de76"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.062999 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1624e91-42d7-4cf9-812f-231dab1d161e-logs" (OuterVolumeSpecName: "logs") pod "b1624e91-42d7-4cf9-812f-231dab1d161e" (UID: "b1624e91-42d7-4cf9-812f-231dab1d161e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.068169 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" (UID: "6cde40e4-10aa-4bd3-a69e-cdbaaf47de76"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.069150 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-kube-api-access-w2t6n" (OuterVolumeSpecName: "kube-api-access-w2t6n") pod "6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" (UID: "6cde40e4-10aa-4bd3-a69e-cdbaaf47de76"). InnerVolumeSpecName "kube-api-access-w2t6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.081133 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1624e91-42d7-4cf9-812f-231dab1d161e-kube-api-access-7d9gh" (OuterVolumeSpecName: "kube-api-access-7d9gh") pod "b1624e91-42d7-4cf9-812f-231dab1d161e" (UID: "b1624e91-42d7-4cf9-812f-231dab1d161e"). InnerVolumeSpecName "kube-api-access-7d9gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.081824 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1624e91-42d7-4cf9-812f-231dab1d161e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b1624e91-42d7-4cf9-812f-231dab1d161e" (UID: "b1624e91-42d7-4cf9-812f-231dab1d161e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.088305 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1624e91-42d7-4cf9-812f-231dab1d161e-config-data" (OuterVolumeSpecName: "config-data") pod "b1624e91-42d7-4cf9-812f-231dab1d161e" (UID: "b1624e91-42d7-4cf9-812f-231dab1d161e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.090073 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-config-data" (OuterVolumeSpecName: "config-data") pod "6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" (UID: "6cde40e4-10aa-4bd3-a69e-cdbaaf47de76"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.092924 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-scripts" (OuterVolumeSpecName: "scripts") pod "6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" (UID: "6cde40e4-10aa-4bd3-a69e-cdbaaf47de76"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.109215 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1624e91-42d7-4cf9-812f-231dab1d161e-scripts" (OuterVolumeSpecName: "scripts") pod "b1624e91-42d7-4cf9-812f-231dab1d161e" (UID: "b1624e91-42d7-4cf9-812f-231dab1d161e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.163241 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1624e91-42d7-4cf9-812f-231dab1d161e-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.163493 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1624e91-42d7-4cf9-812f-231dab1d161e-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.163573 4747 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b1624e91-42d7-4cf9-812f-231dab1d161e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.163701 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.163792 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.163924 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b1624e91-42d7-4cf9-812f-231dab1d161e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.164002 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.164078 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2t6n\" (UniqueName: \"kubernetes.io/projected/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-kube-api-access-w2t6n\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.164170 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d9gh\" (UniqueName: \"kubernetes.io/projected/b1624e91-42d7-4cf9-812f-231dab1d161e-kube-api-access-7d9gh\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.164244 4747 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.799185 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c69d469cc-bwxm6" event={"ID":"b1624e91-42d7-4cf9-812f-231dab1d161e","Type":"ContainerDied","Data":"0d10bc99f10d381e0a1dc37f27c2908c57203f429105ec0b2498ea1335a92d05"} Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.799226 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5c69d469cc-bwxm6" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.799252 4747 scope.go:117] "RemoveContainer" containerID="3bbc013cb090d13319e1833b27ebd7d979d09f7cac81e6a9f4a0fd23dc21099d" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.803965 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77cf976bc9-b9lfc" event={"ID":"6cde40e4-10aa-4bd3-a69e-cdbaaf47de76","Type":"ContainerDied","Data":"76a088c318fc474f282abbce29f71e84232fc09a1669bc03b188133356d76fc7"} Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.804071 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77cf976bc9-b9lfc" Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.873170 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77cf976bc9-b9lfc"] Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.887752 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-77cf976bc9-b9lfc"] Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.896527 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c69d469cc-bwxm6"] Dec 05 22:25:00 crc kubenswrapper[4747]: I1205 22:25:00.904678 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5c69d469cc-bwxm6"] Dec 05 22:25:01 crc kubenswrapper[4747]: I1205 22:25:01.019036 4747 scope.go:117] "RemoveContainer" containerID="cdf4258d83ccf77782222800ca06b83ecab290d012d0dbe45fe8ffbe6497e58b" Dec 05 22:25:01 crc kubenswrapper[4747]: I1205 22:25:01.049177 4747 scope.go:117] "RemoveContainer" containerID="ec2a331a86ae89e08027ea8c0926de156588edd0f3891883477df5f56fc92e3b" Dec 05 22:25:01 crc kubenswrapper[4747]: I1205 22:25:01.215531 4747 scope.go:117] "RemoveContainer" containerID="43d95ddf8ddca41f5b219484cd551d03cea0a69ad0a240973bf2c0215e440cad" Dec 05 22:25:01 crc kubenswrapper[4747]: I1205 22:25:01.477366 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-65fbbfd684-8946q" podUID="5252ace2-c979-4b87-80c6-efefb677ac17" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.115:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8443: connect: connection refused" Dec 05 22:25:01 crc kubenswrapper[4747]: I1205 22:25:01.857794 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" path="/var/lib/kubelet/pods/6cde40e4-10aa-4bd3-a69e-cdbaaf47de76/volumes" Dec 05 22:25:01 crc kubenswrapper[4747]: I1205 22:25:01.859165 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1624e91-42d7-4cf9-812f-231dab1d161e" path="/var/lib/kubelet/pods/b1624e91-42d7-4cf9-812f-231dab1d161e/volumes" Dec 05 22:25:06 crc kubenswrapper[4747]: I1205 22:25:06.222441 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:25:06 crc kubenswrapper[4747]: I1205 22:25:06.223067 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
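A minimal sketch, in Go, of an HTTP liveness probe consistent with the machine-config-daemon failures logged above: the "connection refused" entries recur every 30 seconds (22:24:36, 22:25:06), and the kubelet only kills and restarts the container after a third consecutive failure later in this log (22:25:36). The period and failure threshold below are inferred from those timestamps rather than read from the pod spec, so treat them as assumptions.

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// Probe shape matching the logged check: GET http://127.0.0.1:8798/health.
	// While the daemon is down, each attempt fails with ECONNREFUSED, which is
	// the probeResult="failure" output recorded by prober.go above.
	probe := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Host: "127.0.0.1",
				Path: "/health",
				Port: intstr.FromInt(8798),
			},
		},
		PeriodSeconds:    30, // assumed from the 30s spacing of the failure entries
		FailureThreshold: 3,  // assumed: the restart fires after the third failure
	}
	fmt.Printf("liveness probe: %+v\n", probe)
}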
Dec 05 22:25:11 crc kubenswrapper[4747]: I1205 22:25:11.476392 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-65fbbfd684-8946q" podUID="5252ace2-c979-4b87-80c6-efefb677ac17" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.115:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8443: connect: connection refused" Dec 05 22:25:21 crc kubenswrapper[4747]: I1205 22:25:21.476810 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-65fbbfd684-8946q" podUID="5252ace2-c979-4b87-80c6-efefb677ac17" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.115:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8443: connect: connection refused" Dec 05 22:25:21 crc kubenswrapper[4747]: I1205 22:25:21.477471 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:25:25 crc kubenswrapper[4747]: I1205 22:25:25.813943 4747 generic.go:334] "Generic (PLEG): container finished" podID="5252ace2-c979-4b87-80c6-efefb677ac17" containerID="796e2c96a98e8cbcdcce497296b38b9fc18ef173751593982c138006c0e42258" exitCode=137 Dec 05 22:25:25 crc kubenswrapper[4747]: I1205 22:25:25.814244 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65fbbfd684-8946q" event={"ID":"5252ace2-c979-4b87-80c6-efefb677ac17","Type":"ContainerDied","Data":"796e2c96a98e8cbcdcce497296b38b9fc18ef173751593982c138006c0e42258"} Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.252571 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.392530 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5252ace2-c979-4b87-80c6-efefb677ac17-logs\") pod \"5252ace2-c979-4b87-80c6-efefb677ac17\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.392620 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5252ace2-c979-4b87-80c6-efefb677ac17-config-data\") pod \"5252ace2-c979-4b87-80c6-efefb677ac17\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.392703 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5252ace2-c979-4b87-80c6-efefb677ac17-scripts\") pod \"5252ace2-c979-4b87-80c6-efefb677ac17\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.392725 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-horizon-tls-certs\") pod \"5252ace2-c979-4b87-80c6-efefb677ac17\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.392739 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-combined-ca-bundle\") pod \"5252ace2-c979-4b87-80c6-efefb677ac17\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.392765 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-horizon-secret-key\") pod \"5252ace2-c979-4b87-80c6-efefb677ac17\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.392883 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-264wt\" (UniqueName: \"kubernetes.io/projected/5252ace2-c979-4b87-80c6-efefb677ac17-kube-api-access-264wt\") pod \"5252ace2-c979-4b87-80c6-efefb677ac17\" (UID: \"5252ace2-c979-4b87-80c6-efefb677ac17\") " Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.393489 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5252ace2-c979-4b87-80c6-efefb677ac17-logs" (OuterVolumeSpecName: "logs") pod "5252ace2-c979-4b87-80c6-efefb677ac17" (UID: "5252ace2-c979-4b87-80c6-efefb677ac17"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.407491 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5252ace2-c979-4b87-80c6-efefb677ac17" (UID: "5252ace2-c979-4b87-80c6-efefb677ac17"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.407884 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5252ace2-c979-4b87-80c6-efefb677ac17-kube-api-access-264wt" (OuterVolumeSpecName: "kube-api-access-264wt") pod "5252ace2-c979-4b87-80c6-efefb677ac17" (UID: "5252ace2-c979-4b87-80c6-efefb677ac17"). InnerVolumeSpecName "kube-api-access-264wt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.426286 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5252ace2-c979-4b87-80c6-efefb677ac17" (UID: "5252ace2-c979-4b87-80c6-efefb677ac17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.447528 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5252ace2-c979-4b87-80c6-efefb677ac17-config-data" (OuterVolumeSpecName: "config-data") pod "5252ace2-c979-4b87-80c6-efefb677ac17" (UID: "5252ace2-c979-4b87-80c6-efefb677ac17"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.448805 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "5252ace2-c979-4b87-80c6-efefb677ac17" (UID: "5252ace2-c979-4b87-80c6-efefb677ac17"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.452045 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5252ace2-c979-4b87-80c6-efefb677ac17-scripts" (OuterVolumeSpecName: "scripts") pod "5252ace2-c979-4b87-80c6-efefb677ac17" (UID: "5252ace2-c979-4b87-80c6-efefb677ac17"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.495830 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5252ace2-c979-4b87-80c6-efefb677ac17-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.496226 4747 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.496315 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.496378 4747 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5252ace2-c979-4b87-80c6-efefb677ac17-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.496442 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-264wt\" (UniqueName: \"kubernetes.io/projected/5252ace2-c979-4b87-80c6-efefb677ac17-kube-api-access-264wt\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.496500 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5252ace2-c979-4b87-80c6-efefb677ac17-logs\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.496553 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5252ace2-c979-4b87-80c6-efefb677ac17-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.856968 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65fbbfd684-8946q" event={"ID":"5252ace2-c979-4b87-80c6-efefb677ac17","Type":"ContainerDied","Data":"f7c13c3865471e86c215252b966c6784f4e46bb9ff64f1a358dde7ba06508202"} Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.857047 4747 scope.go:117] "RemoveContainer" containerID="58ed172a06884d3738e7531dce126180532bbc46bcb6528cb3720a779da60796" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.857251 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-65fbbfd684-8946q" Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.929939 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-65fbbfd684-8946q"] Dec 05 22:25:26 crc kubenswrapper[4747]: I1205 22:25:26.945423 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-65fbbfd684-8946q"] Dec 05 22:25:27 crc kubenswrapper[4747]: I1205 22:25:27.059014 4747 scope.go:117] "RemoveContainer" containerID="796e2c96a98e8cbcdcce497296b38b9fc18ef173751593982c138006c0e42258" Dec 05 22:25:27 crc kubenswrapper[4747]: I1205 22:25:27.853482 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5252ace2-c979-4b87-80c6-efefb677ac17" path="/var/lib/kubelet/pods/5252ace2-c979-4b87-80c6-efefb677ac17/volumes" Dec 05 22:25:31 crc kubenswrapper[4747]: I1205 22:25:31.261386 4747 scope.go:117] "RemoveContainer" containerID="ab2315f8f216b1fb0c47a6a2abbb6e29c6bc0821bc4cf99ae67abcf4023ef9ab" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.176455 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-595bbd95bb-ds75b"] Dec 05 22:25:36 crc kubenswrapper[4747]: E1205 22:25:36.177956 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1624e91-42d7-4cf9-812f-231dab1d161e" containerName="horizon" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.177973 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1624e91-42d7-4cf9-812f-231dab1d161e" containerName="horizon" Dec 05 22:25:36 crc kubenswrapper[4747]: E1205 22:25:36.178012 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" containerName="horizon" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.178020 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" containerName="horizon" Dec 05 22:25:36 crc kubenswrapper[4747]: E1205 22:25:36.178034 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" containerName="horizon-log" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.178042 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" containerName="horizon-log" Dec 05 22:25:36 crc kubenswrapper[4747]: E1205 22:25:36.178060 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5252ace2-c979-4b87-80c6-efefb677ac17" containerName="horizon-log" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.178066 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5252ace2-c979-4b87-80c6-efefb677ac17" containerName="horizon-log" Dec 05 22:25:36 crc kubenswrapper[4747]: E1205 22:25:36.178075 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1624e91-42d7-4cf9-812f-231dab1d161e" containerName="horizon-log" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.178081 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1624e91-42d7-4cf9-812f-231dab1d161e" containerName="horizon-log" Dec 05 22:25:36 crc kubenswrapper[4747]: E1205 22:25:36.178088 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5252ace2-c979-4b87-80c6-efefb677ac17" containerName="horizon" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.178094 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5252ace2-c979-4b87-80c6-efefb677ac17" containerName="horizon" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.178262 4747 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b1624e91-42d7-4cf9-812f-231dab1d161e" containerName="horizon" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.178284 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" containerName="horizon" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.178332 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5252ace2-c979-4b87-80c6-efefb677ac17" containerName="horizon-log" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.178341 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5252ace2-c979-4b87-80c6-efefb677ac17" containerName="horizon" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.178354 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cde40e4-10aa-4bd3-a69e-cdbaaf47de76" containerName="horizon-log" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.178366 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1624e91-42d7-4cf9-812f-231dab1d161e" containerName="horizon-log" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.179524 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.193098 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-595bbd95bb-ds75b"] Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.221444 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.221510 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.221563 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.223321 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0e963d14d45c9a8e403bf5219ea245778f9eb613a6ce7acf2a23ab111935a921"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.223387 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://0e963d14d45c9a8e403bf5219ea245778f9eb613a6ce7acf2a23ab111935a921" gracePeriod=600 Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.324205 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac99fd7b-2307-4539-849a-774e9b2bc774-horizon-tls-certs\") pod \"horizon-595bbd95bb-ds75b\" (UID: 
\"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.324490 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac99fd7b-2307-4539-849a-774e9b2bc774-logs\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.324745 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ac99fd7b-2307-4539-849a-774e9b2bc774-horizon-secret-key\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.324851 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kmnn\" (UniqueName: \"kubernetes.io/projected/ac99fd7b-2307-4539-849a-774e9b2bc774-kube-api-access-7kmnn\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.324942 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac99fd7b-2307-4539-849a-774e9b2bc774-config-data\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.325037 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac99fd7b-2307-4539-849a-774e9b2bc774-combined-ca-bundle\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.325145 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac99fd7b-2307-4539-849a-774e9b2bc774-scripts\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: E1205 22:25:36.326908 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85ba28a1_00e9_438e_9b47_6537f75121bb.slice/crio-0e963d14d45c9a8e403bf5219ea245778f9eb613a6ce7acf2a23ab111935a921.scope\": RecentStats: unable to find data in memory cache]" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.427334 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac99fd7b-2307-4539-849a-774e9b2bc774-combined-ca-bundle\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.427767 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac99fd7b-2307-4539-849a-774e9b2bc774-scripts\") pod \"horizon-595bbd95bb-ds75b\" (UID: 
\"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.427835 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac99fd7b-2307-4539-849a-774e9b2bc774-horizon-tls-certs\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.427921 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac99fd7b-2307-4539-849a-774e9b2bc774-logs\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.427996 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ac99fd7b-2307-4539-849a-774e9b2bc774-horizon-secret-key\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.428067 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kmnn\" (UniqueName: \"kubernetes.io/projected/ac99fd7b-2307-4539-849a-774e9b2bc774-kube-api-access-7kmnn\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.428123 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac99fd7b-2307-4539-849a-774e9b2bc774-config-data\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.428545 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac99fd7b-2307-4539-849a-774e9b2bc774-logs\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.429292 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac99fd7b-2307-4539-849a-774e9b2bc774-scripts\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.430006 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac99fd7b-2307-4539-849a-774e9b2bc774-config-data\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.432525 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ac99fd7b-2307-4539-849a-774e9b2bc774-horizon-secret-key\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.433029 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac99fd7b-2307-4539-849a-774e9b2bc774-horizon-tls-certs\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.435524 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac99fd7b-2307-4539-849a-774e9b2bc774-combined-ca-bundle\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.446864 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kmnn\" (UniqueName: \"kubernetes.io/projected/ac99fd7b-2307-4539-849a-774e9b2bc774-kube-api-access-7kmnn\") pod \"horizon-595bbd95bb-ds75b\" (UID: \"ac99fd7b-2307-4539-849a-774e9b2bc774\") " pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.501825 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.954616 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="0e963d14d45c9a8e403bf5219ea245778f9eb613a6ce7acf2a23ab111935a921" exitCode=0 Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.954744 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"0e963d14d45c9a8e403bf5219ea245778f9eb613a6ce7acf2a23ab111935a921"} Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.955280 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f"} Dec 05 22:25:36 crc kubenswrapper[4747]: I1205 22:25:36.955304 4747 scope.go:117] "RemoveContainer" containerID="3e67cf53b8e83d6147083045f4ad366211b13b660a82828bc6085ff4f40ac001" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.021962 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-595bbd95bb-ds75b"] Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.611715 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-tcjmq"] Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.613350 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-tcjmq" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.642555 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-tcjmq"] Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.698707 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-e619-account-create-update-zth4w"] Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.699947 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-e619-account-create-update-zth4w" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.703890 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.735682 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-e619-account-create-update-zth4w"] Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.756477 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7szb\" (UniqueName: \"kubernetes.io/projected/b5f053ea-4de8-467e-8d7e-2cb34c061443-kube-api-access-l7szb\") pod \"heat-db-create-tcjmq\" (UID: \"b5f053ea-4de8-467e-8d7e-2cb34c061443\") " pod="openstack/heat-db-create-tcjmq" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.756660 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5f053ea-4de8-467e-8d7e-2cb34c061443-operator-scripts\") pod \"heat-db-create-tcjmq\" (UID: \"b5f053ea-4de8-467e-8d7e-2cb34c061443\") " pod="openstack/heat-db-create-tcjmq" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.860671 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5f053ea-4de8-467e-8d7e-2cb34c061443-operator-scripts\") pod \"heat-db-create-tcjmq\" (UID: \"b5f053ea-4de8-467e-8d7e-2cb34c061443\") " pod="openstack/heat-db-create-tcjmq" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.860724 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q7gz\" (UniqueName: \"kubernetes.io/projected/6ec1e051-f8a5-463d-8f9c-bb428934e615-kube-api-access-2q7gz\") pod \"heat-e619-account-create-update-zth4w\" (UID: \"6ec1e051-f8a5-463d-8f9c-bb428934e615\") " pod="openstack/heat-e619-account-create-update-zth4w" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.860776 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7szb\" (UniqueName: \"kubernetes.io/projected/b5f053ea-4de8-467e-8d7e-2cb34c061443-kube-api-access-l7szb\") pod \"heat-db-create-tcjmq\" (UID: \"b5f053ea-4de8-467e-8d7e-2cb34c061443\") " pod="openstack/heat-db-create-tcjmq" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.860818 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ec1e051-f8a5-463d-8f9c-bb428934e615-operator-scripts\") pod \"heat-e619-account-create-update-zth4w\" (UID: \"6ec1e051-f8a5-463d-8f9c-bb428934e615\") " pod="openstack/heat-e619-account-create-update-zth4w" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.863483 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5f053ea-4de8-467e-8d7e-2cb34c061443-operator-scripts\") pod \"heat-db-create-tcjmq\" (UID: \"b5f053ea-4de8-467e-8d7e-2cb34c061443\") " pod="openstack/heat-db-create-tcjmq" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.896037 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7szb\" (UniqueName: \"kubernetes.io/projected/b5f053ea-4de8-467e-8d7e-2cb34c061443-kube-api-access-l7szb\") pod \"heat-db-create-tcjmq\" (UID: 
\"b5f053ea-4de8-467e-8d7e-2cb34c061443\") " pod="openstack/heat-db-create-tcjmq" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.961666 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ec1e051-f8a5-463d-8f9c-bb428934e615-operator-scripts\") pod \"heat-e619-account-create-update-zth4w\" (UID: \"6ec1e051-f8a5-463d-8f9c-bb428934e615\") " pod="openstack/heat-e619-account-create-update-zth4w" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.961810 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q7gz\" (UniqueName: \"kubernetes.io/projected/6ec1e051-f8a5-463d-8f9c-bb428934e615-kube-api-access-2q7gz\") pod \"heat-e619-account-create-update-zth4w\" (UID: \"6ec1e051-f8a5-463d-8f9c-bb428934e615\") " pod="openstack/heat-e619-account-create-update-zth4w" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.962961 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ec1e051-f8a5-463d-8f9c-bb428934e615-operator-scripts\") pod \"heat-e619-account-create-update-zth4w\" (UID: \"6ec1e051-f8a5-463d-8f9c-bb428934e615\") " pod="openstack/heat-e619-account-create-update-zth4w" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.965648 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-595bbd95bb-ds75b" event={"ID":"ac99fd7b-2307-4539-849a-774e9b2bc774","Type":"ContainerStarted","Data":"d3e432dcfde34a92d67796993bd7c8c066e78f6a7abc3473801bbf90f753d8ba"} Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.965689 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-595bbd95bb-ds75b" event={"ID":"ac99fd7b-2307-4539-849a-774e9b2bc774","Type":"ContainerStarted","Data":"9fba165f8b99b2200592a55cbf591ca41bbf350a0e4fe3a7c552737f4bf91db3"} Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.965699 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-595bbd95bb-ds75b" event={"ID":"ac99fd7b-2307-4539-849a-774e9b2bc774","Type":"ContainerStarted","Data":"ae0005afde32f1fb4f0734ed5f15ce47cefb7a31d221121c7170f4bcc27b2c02"} Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.972016 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-tcjmq" Dec 05 22:25:37 crc kubenswrapper[4747]: I1205 22:25:37.978661 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q7gz\" (UniqueName: \"kubernetes.io/projected/6ec1e051-f8a5-463d-8f9c-bb428934e615-kube-api-access-2q7gz\") pod \"heat-e619-account-create-update-zth4w\" (UID: \"6ec1e051-f8a5-463d-8f9c-bb428934e615\") " pod="openstack/heat-e619-account-create-update-zth4w" Dec 05 22:25:38 crc kubenswrapper[4747]: I1205 22:25:38.095601 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-e619-account-create-update-zth4w" Dec 05 22:25:38 crc kubenswrapper[4747]: I1205 22:25:38.490062 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-595bbd95bb-ds75b" podStartSLOduration=2.490042489 podStartE2EDuration="2.490042489s" podCreationTimestamp="2025-12-05 22:25:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:25:38.00830502 +0000 UTC m=+6208.475612558" watchObservedRunningTime="2025-12-05 22:25:38.490042489 +0000 UTC m=+6208.957349977" Dec 05 22:25:38 crc kubenswrapper[4747]: I1205 22:25:38.491246 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-tcjmq"] Dec 05 22:25:38 crc kubenswrapper[4747]: I1205 22:25:38.605057 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-e619-account-create-update-zth4w"] Dec 05 22:25:38 crc kubenswrapper[4747]: W1205 22:25:38.613709 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ec1e051_f8a5_463d_8f9c_bb428934e615.slice/crio-dd9f483d1311270fffa8782de199399ef40032bb5f4d0b0f73f8fb7351e43511 WatchSource:0}: Error finding container dd9f483d1311270fffa8782de199399ef40032bb5f4d0b0f73f8fb7351e43511: Status 404 returned error can't find the container with id dd9f483d1311270fffa8782de199399ef40032bb5f4d0b0f73f8fb7351e43511 Dec 05 22:25:38 crc kubenswrapper[4747]: I1205 22:25:38.977962 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-tcjmq" event={"ID":"b5f053ea-4de8-467e-8d7e-2cb34c061443","Type":"ContainerStarted","Data":"c195241d2c901cea5f872e40f3971452903544c99e4fdea40d639d216227c84d"} Dec 05 22:25:38 crc kubenswrapper[4747]: I1205 22:25:38.978301 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-tcjmq" event={"ID":"b5f053ea-4de8-467e-8d7e-2cb34c061443","Type":"ContainerStarted","Data":"ad93d948511450d41a71ba5585da755c4114484d210692fcd71f402ab9e12d29"} Dec 05 22:25:38 crc kubenswrapper[4747]: I1205 22:25:38.980897 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-e619-account-create-update-zth4w" event={"ID":"6ec1e051-f8a5-463d-8f9c-bb428934e615","Type":"ContainerStarted","Data":"24c679f1f2219adb3e37e4ef622bd53700f155fb937f310e515062fb34eab772"} Dec 05 22:25:38 crc kubenswrapper[4747]: I1205 22:25:38.980939 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-e619-account-create-update-zth4w" event={"ID":"6ec1e051-f8a5-463d-8f9c-bb428934e615","Type":"ContainerStarted","Data":"dd9f483d1311270fffa8782de199399ef40032bb5f4d0b0f73f8fb7351e43511"} Dec 05 22:25:38 crc kubenswrapper[4747]: I1205 22:25:38.999137 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-tcjmq" podStartSLOduration=1.999117976 podStartE2EDuration="1.999117976s" podCreationTimestamp="2025-12-05 22:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:25:38.99038542 +0000 UTC m=+6209.457692928" watchObservedRunningTime="2025-12-05 22:25:38.999117976 +0000 UTC m=+6209.466425464" Dec 05 22:25:39 crc kubenswrapper[4747]: I1205 22:25:39.007024 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-e619-account-create-update-zth4w" 
podStartSLOduration=2.006999112 podStartE2EDuration="2.006999112s" podCreationTimestamp="2025-12-05 22:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:25:39.001041354 +0000 UTC m=+6209.468348842" watchObservedRunningTime="2025-12-05 22:25:39.006999112 +0000 UTC m=+6209.474306620" Dec 05 22:25:39 crc kubenswrapper[4747]: I1205 22:25:39.990759 4747 generic.go:334] "Generic (PLEG): container finished" podID="b5f053ea-4de8-467e-8d7e-2cb34c061443" containerID="c195241d2c901cea5f872e40f3971452903544c99e4fdea40d639d216227c84d" exitCode=0 Dec 05 22:25:39 crc kubenswrapper[4747]: I1205 22:25:39.990995 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-tcjmq" event={"ID":"b5f053ea-4de8-467e-8d7e-2cb34c061443","Type":"ContainerDied","Data":"c195241d2c901cea5f872e40f3971452903544c99e4fdea40d639d216227c84d"} Dec 05 22:25:39 crc kubenswrapper[4747]: I1205 22:25:39.995609 4747 generic.go:334] "Generic (PLEG): container finished" podID="6ec1e051-f8a5-463d-8f9c-bb428934e615" containerID="24c679f1f2219adb3e37e4ef622bd53700f155fb937f310e515062fb34eab772" exitCode=0 Dec 05 22:25:39 crc kubenswrapper[4747]: I1205 22:25:39.995673 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-e619-account-create-update-zth4w" event={"ID":"6ec1e051-f8a5-463d-8f9c-bb428934e615","Type":"ContainerDied","Data":"24c679f1f2219adb3e37e4ef622bd53700f155fb937f310e515062fb34eab772"} Dec 05 22:25:41 crc kubenswrapper[4747]: I1205 22:25:41.353549 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-tcjmq" Dec 05 22:25:41 crc kubenswrapper[4747]: I1205 22:25:41.453555 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7szb\" (UniqueName: \"kubernetes.io/projected/b5f053ea-4de8-467e-8d7e-2cb34c061443-kube-api-access-l7szb\") pod \"b5f053ea-4de8-467e-8d7e-2cb34c061443\" (UID: \"b5f053ea-4de8-467e-8d7e-2cb34c061443\") " Dec 05 22:25:41 crc kubenswrapper[4747]: I1205 22:25:41.460985 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f053ea-4de8-467e-8d7e-2cb34c061443-kube-api-access-l7szb" (OuterVolumeSpecName: "kube-api-access-l7szb") pod "b5f053ea-4de8-467e-8d7e-2cb34c061443" (UID: "b5f053ea-4de8-467e-8d7e-2cb34c061443"). InnerVolumeSpecName "kube-api-access-l7szb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:25:41 crc kubenswrapper[4747]: I1205 22:25:41.464242 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-e619-account-create-update-zth4w" Dec 05 22:25:41 crc kubenswrapper[4747]: I1205 22:25:41.554919 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5f053ea-4de8-467e-8d7e-2cb34c061443-operator-scripts\") pod \"b5f053ea-4de8-467e-8d7e-2cb34c061443\" (UID: \"b5f053ea-4de8-467e-8d7e-2cb34c061443\") " Dec 05 22:25:41 crc kubenswrapper[4747]: I1205 22:25:41.555657 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7szb\" (UniqueName: \"kubernetes.io/projected/b5f053ea-4de8-467e-8d7e-2cb34c061443-kube-api-access-l7szb\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:41 crc kubenswrapper[4747]: I1205 22:25:41.555804 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5f053ea-4de8-467e-8d7e-2cb34c061443-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5f053ea-4de8-467e-8d7e-2cb34c061443" (UID: "b5f053ea-4de8-467e-8d7e-2cb34c061443"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:25:41 crc kubenswrapper[4747]: I1205 22:25:41.656390 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q7gz\" (UniqueName: \"kubernetes.io/projected/6ec1e051-f8a5-463d-8f9c-bb428934e615-kube-api-access-2q7gz\") pod \"6ec1e051-f8a5-463d-8f9c-bb428934e615\" (UID: \"6ec1e051-f8a5-463d-8f9c-bb428934e615\") " Dec 05 22:25:41 crc kubenswrapper[4747]: I1205 22:25:41.656794 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ec1e051-f8a5-463d-8f9c-bb428934e615-operator-scripts\") pod \"6ec1e051-f8a5-463d-8f9c-bb428934e615\" (UID: \"6ec1e051-f8a5-463d-8f9c-bb428934e615\") " Dec 05 22:25:41 crc kubenswrapper[4747]: I1205 22:25:41.657364 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ec1e051-f8a5-463d-8f9c-bb428934e615-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ec1e051-f8a5-463d-8f9c-bb428934e615" (UID: "6ec1e051-f8a5-463d-8f9c-bb428934e615"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:25:41 crc kubenswrapper[4747]: I1205 22:25:41.657443 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5f053ea-4de8-467e-8d7e-2cb34c061443-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:41 crc kubenswrapper[4747]: I1205 22:25:41.659951 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec1e051-f8a5-463d-8f9c-bb428934e615-kube-api-access-2q7gz" (OuterVolumeSpecName: "kube-api-access-2q7gz") pod "6ec1e051-f8a5-463d-8f9c-bb428934e615" (UID: "6ec1e051-f8a5-463d-8f9c-bb428934e615"). InnerVolumeSpecName "kube-api-access-2q7gz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:25:41 crc kubenswrapper[4747]: I1205 22:25:41.760107 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q7gz\" (UniqueName: \"kubernetes.io/projected/6ec1e051-f8a5-463d-8f9c-bb428934e615-kube-api-access-2q7gz\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:41 crc kubenswrapper[4747]: I1205 22:25:41.760174 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ec1e051-f8a5-463d-8f9c-bb428934e615-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.016016 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-tcjmq" Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.016024 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-tcjmq" event={"ID":"b5f053ea-4de8-467e-8d7e-2cb34c061443","Type":"ContainerDied","Data":"ad93d948511450d41a71ba5585da755c4114484d210692fcd71f402ab9e12d29"} Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.016084 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad93d948511450d41a71ba5585da755c4114484d210692fcd71f402ab9e12d29" Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.018951 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-e619-account-create-update-zth4w" event={"ID":"6ec1e051-f8a5-463d-8f9c-bb428934e615","Type":"ContainerDied","Data":"dd9f483d1311270fffa8782de199399ef40032bb5f4d0b0f73f8fb7351e43511"} Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.019029 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd9f483d1311270fffa8782de199399ef40032bb5f4d0b0f73f8fb7351e43511" Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.019093 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-e619-account-create-update-zth4w" Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.889823 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-f9rfh"] Dec 05 22:25:42 crc kubenswrapper[4747]: E1205 22:25:42.890627 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f053ea-4de8-467e-8d7e-2cb34c061443" containerName="mariadb-database-create" Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.890648 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f053ea-4de8-467e-8d7e-2cb34c061443" containerName="mariadb-database-create" Dec 05 22:25:42 crc kubenswrapper[4747]: E1205 22:25:42.890665 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec1e051-f8a5-463d-8f9c-bb428934e615" containerName="mariadb-account-create-update" Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.890674 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec1e051-f8a5-463d-8f9c-bb428934e615" containerName="mariadb-account-create-update" Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.890934 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f053ea-4de8-467e-8d7e-2cb34c061443" containerName="mariadb-database-create" Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.890961 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec1e051-f8a5-463d-8f9c-bb428934e615" containerName="mariadb-account-create-update" Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.891693 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-f9rfh" Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.895967 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.896149 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-b2xx5" Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.902895 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-f9rfh"] Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.983312 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b659051b-3448-44d2-a6f0-d8723d0ffab1-config-data\") pod \"heat-db-sync-f9rfh\" (UID: \"b659051b-3448-44d2-a6f0-d8723d0ffab1\") " pod="openstack/heat-db-sync-f9rfh" Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.983683 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b659051b-3448-44d2-a6f0-d8723d0ffab1-combined-ca-bundle\") pod \"heat-db-sync-f9rfh\" (UID: \"b659051b-3448-44d2-a6f0-d8723d0ffab1\") " pod="openstack/heat-db-sync-f9rfh" Dec 05 22:25:42 crc kubenswrapper[4747]: I1205 22:25:42.983794 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf6k2\" (UniqueName: \"kubernetes.io/projected/b659051b-3448-44d2-a6f0-d8723d0ffab1-kube-api-access-rf6k2\") pod \"heat-db-sync-f9rfh\" (UID: \"b659051b-3448-44d2-a6f0-d8723d0ffab1\") " pod="openstack/heat-db-sync-f9rfh" Dec 05 22:25:43 crc kubenswrapper[4747]: I1205 22:25:43.085238 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b659051b-3448-44d2-a6f0-d8723d0ffab1-config-data\") pod \"heat-db-sync-f9rfh\" (UID: \"b659051b-3448-44d2-a6f0-d8723d0ffab1\") " pod="openstack/heat-db-sync-f9rfh" Dec 05 22:25:43 crc kubenswrapper[4747]: I1205 22:25:43.086472 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b659051b-3448-44d2-a6f0-d8723d0ffab1-combined-ca-bundle\") pod \"heat-db-sync-f9rfh\" (UID: \"b659051b-3448-44d2-a6f0-d8723d0ffab1\") " pod="openstack/heat-db-sync-f9rfh" Dec 05 22:25:43 crc kubenswrapper[4747]: I1205 22:25:43.086604 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf6k2\" (UniqueName: \"kubernetes.io/projected/b659051b-3448-44d2-a6f0-d8723d0ffab1-kube-api-access-rf6k2\") pod \"heat-db-sync-f9rfh\" (UID: \"b659051b-3448-44d2-a6f0-d8723d0ffab1\") " pod="openstack/heat-db-sync-f9rfh" Dec 05 22:25:43 crc kubenswrapper[4747]: I1205 22:25:43.090868 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b659051b-3448-44d2-a6f0-d8723d0ffab1-config-data\") pod \"heat-db-sync-f9rfh\" (UID: \"b659051b-3448-44d2-a6f0-d8723d0ffab1\") " pod="openstack/heat-db-sync-f9rfh" Dec 05 22:25:43 crc kubenswrapper[4747]: I1205 22:25:43.091687 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b659051b-3448-44d2-a6f0-d8723d0ffab1-combined-ca-bundle\") pod \"heat-db-sync-f9rfh\" (UID: \"b659051b-3448-44d2-a6f0-d8723d0ffab1\") " pod="openstack/heat-db-sync-f9rfh" Dec 05 22:25:43 crc kubenswrapper[4747]: I1205 22:25:43.108682 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf6k2\" (UniqueName: \"kubernetes.io/projected/b659051b-3448-44d2-a6f0-d8723d0ffab1-kube-api-access-rf6k2\") pod \"heat-db-sync-f9rfh\" (UID: \"b659051b-3448-44d2-a6f0-d8723d0ffab1\") " pod="openstack/heat-db-sync-f9rfh" Dec 05 22:25:43 crc kubenswrapper[4747]: I1205 22:25:43.210648 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-f9rfh" Dec 05 22:25:43 crc kubenswrapper[4747]: I1205 22:25:43.503554 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-f9rfh"] Dec 05 22:25:44 crc kubenswrapper[4747]: I1205 22:25:44.041205 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-f9rfh" event={"ID":"b659051b-3448-44d2-a6f0-d8723d0ffab1","Type":"ContainerStarted","Data":"cbcabf983baa800bac86691f9b5ad7bad0a98670c03604fd67c31eeaa00cac7f"} Dec 05 22:25:46 crc kubenswrapper[4747]: I1205 22:25:46.502224 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:46 crc kubenswrapper[4747]: I1205 22:25:46.502788 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:25:50 crc kubenswrapper[4747]: I1205 22:25:50.110750 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-f9rfh" event={"ID":"b659051b-3448-44d2-a6f0-d8723d0ffab1","Type":"ContainerStarted","Data":"c237a1be16b8675c2ac2e3bbc7fa4204d153e681fe976968de65c60802c10cc6"} Dec 05 22:25:50 crc kubenswrapper[4747]: I1205 22:25:50.149832 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-f9rfh" podStartSLOduration=2.162768303 podStartE2EDuration="8.149804017s" podCreationTimestamp="2025-12-05 22:25:42 +0000 UTC" firstStartedPulling="2025-12-05 22:25:43.511987844 +0000 UTC m=+6213.979295322" lastFinishedPulling="2025-12-05 22:25:49.499023548 +0000 UTC m=+6219.966331036" observedRunningTime="2025-12-05 22:25:50.135046181 +0000 UTC m=+6220.602353709" watchObservedRunningTime="2025-12-05 22:25:50.149804017 +0000 UTC m=+6220.617111535" Dec 05 22:25:52 crc kubenswrapper[4747]: I1205 22:25:52.135296 4747 generic.go:334] "Generic (PLEG): container finished" podID="b659051b-3448-44d2-a6f0-d8723d0ffab1" containerID="c237a1be16b8675c2ac2e3bbc7fa4204d153e681fe976968de65c60802c10cc6" exitCode=0 Dec 05 22:25:52 crc kubenswrapper[4747]: I1205 22:25:52.135759 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-f9rfh" event={"ID":"b659051b-3448-44d2-a6f0-d8723d0ffab1","Type":"ContainerDied","Data":"c237a1be16b8675c2ac2e3bbc7fa4204d153e681fe976968de65c60802c10cc6"} Dec 05 22:25:53 crc kubenswrapper[4747]: I1205 22:25:53.490490 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-f9rfh" Dec 05 22:25:53 crc kubenswrapper[4747]: I1205 22:25:53.522749 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf6k2\" (UniqueName: \"kubernetes.io/projected/b659051b-3448-44d2-a6f0-d8723d0ffab1-kube-api-access-rf6k2\") pod \"b659051b-3448-44d2-a6f0-d8723d0ffab1\" (UID: \"b659051b-3448-44d2-a6f0-d8723d0ffab1\") " Dec 05 22:25:53 crc kubenswrapper[4747]: I1205 22:25:53.522850 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b659051b-3448-44d2-a6f0-d8723d0ffab1-combined-ca-bundle\") pod \"b659051b-3448-44d2-a6f0-d8723d0ffab1\" (UID: \"b659051b-3448-44d2-a6f0-d8723d0ffab1\") " Dec 05 22:25:53 crc kubenswrapper[4747]: I1205 22:25:53.522886 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b659051b-3448-44d2-a6f0-d8723d0ffab1-config-data\") pod \"b659051b-3448-44d2-a6f0-d8723d0ffab1\" (UID: \"b659051b-3448-44d2-a6f0-d8723d0ffab1\") " Dec 05 22:25:53 crc kubenswrapper[4747]: I1205 22:25:53.532724 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b659051b-3448-44d2-a6f0-d8723d0ffab1-kube-api-access-rf6k2" (OuterVolumeSpecName: "kube-api-access-rf6k2") pod "b659051b-3448-44d2-a6f0-d8723d0ffab1" (UID: "b659051b-3448-44d2-a6f0-d8723d0ffab1"). InnerVolumeSpecName "kube-api-access-rf6k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:25:53 crc kubenswrapper[4747]: I1205 22:25:53.584192 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b659051b-3448-44d2-a6f0-d8723d0ffab1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b659051b-3448-44d2-a6f0-d8723d0ffab1" (UID: "b659051b-3448-44d2-a6f0-d8723d0ffab1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:25:53 crc kubenswrapper[4747]: I1205 22:25:53.611597 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b659051b-3448-44d2-a6f0-d8723d0ffab1-config-data" (OuterVolumeSpecName: "config-data") pod "b659051b-3448-44d2-a6f0-d8723d0ffab1" (UID: "b659051b-3448-44d2-a6f0-d8723d0ffab1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:25:53 crc kubenswrapper[4747]: I1205 22:25:53.624568 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf6k2\" (UniqueName: \"kubernetes.io/projected/b659051b-3448-44d2-a6f0-d8723d0ffab1-kube-api-access-rf6k2\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:53 crc kubenswrapper[4747]: I1205 22:25:53.624620 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b659051b-3448-44d2-a6f0-d8723d0ffab1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:53 crc kubenswrapper[4747]: I1205 22:25:53.624632 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b659051b-3448-44d2-a6f0-d8723d0ffab1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:25:54 crc kubenswrapper[4747]: I1205 22:25:54.157930 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-f9rfh" event={"ID":"b659051b-3448-44d2-a6f0-d8723d0ffab1","Type":"ContainerDied","Data":"cbcabf983baa800bac86691f9b5ad7bad0a98670c03604fd67c31eeaa00cac7f"} Dec 05 22:25:54 crc kubenswrapper[4747]: I1205 22:25:54.158285 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbcabf983baa800bac86691f9b5ad7bad0a98670c03604fd67c31eeaa00cac7f" Dec 05 22:25:54 crc kubenswrapper[4747]: I1205 22:25:54.158015 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-f9rfh" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.340562 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-b947f8656-gpbz8"] Dec 05 22:25:55 crc kubenswrapper[4747]: E1205 22:25:55.341364 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b659051b-3448-44d2-a6f0-d8723d0ffab1" containerName="heat-db-sync" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.341380 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b659051b-3448-44d2-a6f0-d8723d0ffab1" containerName="heat-db-sync" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.341668 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b659051b-3448-44d2-a6f0-d8723d0ffab1" containerName="heat-db-sync" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.342427 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-b947f8656-gpbz8" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.347533 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.347680 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-b2xx5" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.347805 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.350053 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-b947f8656-gpbz8"] Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.463086 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-config-data\") pod \"heat-engine-b947f8656-gpbz8\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") " pod="openstack/heat-engine-b947f8656-gpbz8" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.463166 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27jv5\" (UniqueName: \"kubernetes.io/projected/3863f9de-d316-444c-9417-41b460fd21c5-kube-api-access-27jv5\") pod \"heat-engine-b947f8656-gpbz8\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") " pod="openstack/heat-engine-b947f8656-gpbz8" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.463251 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-config-data-custom\") pod \"heat-engine-b947f8656-gpbz8\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") " pod="openstack/heat-engine-b947f8656-gpbz8" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.463356 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-combined-ca-bundle\") pod \"heat-engine-b947f8656-gpbz8\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") " pod="openstack/heat-engine-b947f8656-gpbz8" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.485267 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7cd4549b77-669hj"] Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.486688 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.489038 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.498979 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7cd4549b77-669hj"] Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.564540 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27jv5\" (UniqueName: \"kubernetes.io/projected/3863f9de-d316-444c-9417-41b460fd21c5-kube-api-access-27jv5\") pod \"heat-engine-b947f8656-gpbz8\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") " pod="openstack/heat-engine-b947f8656-gpbz8" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.564680 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-config-data-custom\") pod \"heat-engine-b947f8656-gpbz8\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") " pod="openstack/heat-engine-b947f8656-gpbz8" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.564745 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-combined-ca-bundle\") pod \"heat-engine-b947f8656-gpbz8\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") " pod="openstack/heat-engine-b947f8656-gpbz8" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.564847 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-config-data\") pod \"heat-engine-b947f8656-gpbz8\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") " pod="openstack/heat-engine-b947f8656-gpbz8" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.576836 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-combined-ca-bundle\") pod \"heat-engine-b947f8656-gpbz8\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") " pod="openstack/heat-engine-b947f8656-gpbz8" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.579795 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-config-data\") pod \"heat-engine-b947f8656-gpbz8\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") " pod="openstack/heat-engine-b947f8656-gpbz8" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.583724 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-config-data-custom\") pod \"heat-engine-b947f8656-gpbz8\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") " pod="openstack/heat-engine-b947f8656-gpbz8" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.589779 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27jv5\" (UniqueName: \"kubernetes.io/projected/3863f9de-d316-444c-9417-41b460fd21c5-kube-api-access-27jv5\") pod \"heat-engine-b947f8656-gpbz8\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") " pod="openstack/heat-engine-b947f8656-gpbz8" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 
22:25:55.594706 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7765dc6db9-khj9s"] Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.596241 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.601082 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.615622 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7765dc6db9-khj9s"] Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.667156 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-config-data-custom\") pod \"heat-cfnapi-7cd4549b77-669hj\" (UID: \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.667243 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mrtz\" (UniqueName: \"kubernetes.io/projected/ef536ee3-bbdb-4c17-b244-c88cdda13d41-kube-api-access-2mrtz\") pod \"heat-cfnapi-7cd4549b77-669hj\" (UID: \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.667272 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-config-data\") pod \"heat-cfnapi-7cd4549b77-669hj\" (UID: \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.667299 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-combined-ca-bundle\") pod \"heat-cfnapi-7cd4549b77-669hj\" (UID: \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.667661 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-b947f8656-gpbz8" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.768959 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-config-data-custom\") pod \"heat-cfnapi-7cd4549b77-669hj\" (UID: \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.769001 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-config-data\") pod \"heat-api-7765dc6db9-khj9s\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.769061 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-combined-ca-bundle\") pod \"heat-api-7765dc6db9-khj9s\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.769132 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mrtz\" (UniqueName: \"kubernetes.io/projected/ef536ee3-bbdb-4c17-b244-c88cdda13d41-kube-api-access-2mrtz\") pod \"heat-cfnapi-7cd4549b77-669hj\" (UID: \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.769226 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-config-data\") pod \"heat-cfnapi-7cd4549b77-669hj\" (UID: \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.769299 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-combined-ca-bundle\") pod \"heat-cfnapi-7cd4549b77-669hj\" (UID: \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.769336 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-config-data-custom\") pod \"heat-api-7765dc6db9-khj9s\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.769546 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h68kz\" (UniqueName: \"kubernetes.io/projected/0eb63f02-f4e5-483d-8070-521435e159f1-kube-api-access-h68kz\") pod \"heat-api-7765dc6db9-khj9s\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.774453 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-config-data\") pod \"heat-cfnapi-7cd4549b77-669hj\" (UID: 
\"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.778306 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-combined-ca-bundle\") pod \"heat-cfnapi-7cd4549b77-669hj\" (UID: \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.778456 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-config-data-custom\") pod \"heat-cfnapi-7cd4549b77-669hj\" (UID: \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.788271 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mrtz\" (UniqueName: \"kubernetes.io/projected/ef536ee3-bbdb-4c17-b244-c88cdda13d41-kube-api-access-2mrtz\") pod \"heat-cfnapi-7cd4549b77-669hj\" (UID: \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.810147 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.872707 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-config-data-custom\") pod \"heat-api-7765dc6db9-khj9s\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.872840 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h68kz\" (UniqueName: \"kubernetes.io/projected/0eb63f02-f4e5-483d-8070-521435e159f1-kube-api-access-h68kz\") pod \"heat-api-7765dc6db9-khj9s\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.872962 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-config-data\") pod \"heat-api-7765dc6db9-khj9s\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.872988 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-combined-ca-bundle\") pod \"heat-api-7765dc6db9-khj9s\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.884506 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-config-data\") pod \"heat-api-7765dc6db9-khj9s\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.891494 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-combined-ca-bundle\") pod \"heat-api-7765dc6db9-khj9s\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.894609 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-config-data-custom\") pod \"heat-api-7765dc6db9-khj9s\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.901597 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h68kz\" (UniqueName: \"kubernetes.io/projected/0eb63f02-f4e5-483d-8070-521435e159f1-kube-api-access-h68kz\") pod \"heat-api-7765dc6db9-khj9s\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:25:55 crc kubenswrapper[4747]: I1205 22:25:55.988245 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:25:56 crc kubenswrapper[4747]: I1205 22:25:56.263041 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-b947f8656-gpbz8"] Dec 05 22:25:56 crc kubenswrapper[4747]: I1205 22:25:56.449841 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7cd4549b77-669hj"] Dec 05 22:25:56 crc kubenswrapper[4747]: W1205 22:25:56.450321 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef536ee3_bbdb_4c17_b244_c88cdda13d41.slice/crio-ef7d119811da274151d0334cafee53a5beaec183cef2998582ed9162739d6315 WatchSource:0}: Error finding container ef7d119811da274151d0334cafee53a5beaec183cef2998582ed9162739d6315: Status 404 returned error can't find the container with id ef7d119811da274151d0334cafee53a5beaec183cef2998582ed9162739d6315 Dec 05 22:25:56 crc kubenswrapper[4747]: I1205 22:25:56.505845 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-595bbd95bb-ds75b" podUID="ac99fd7b-2307-4539-849a-774e9b2bc774" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.119:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.119:8443: connect: connection refused" Dec 05 22:25:56 crc kubenswrapper[4747]: W1205 22:25:56.636526 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eb63f02_f4e5_483d_8070_521435e159f1.slice/crio-134de597b540b75d5585b9ca5d57228d523715c67e2d9fbd15810998a2f311dc WatchSource:0}: Error finding container 134de597b540b75d5585b9ca5d57228d523715c67e2d9fbd15810998a2f311dc: Status 404 returned error can't find the container with id 134de597b540b75d5585b9ca5d57228d523715c67e2d9fbd15810998a2f311dc Dec 05 22:25:56 crc kubenswrapper[4747]: I1205 22:25:56.638401 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7765dc6db9-khj9s"] Dec 05 22:25:57 crc kubenswrapper[4747]: I1205 22:25:57.192110 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7cd4549b77-669hj" event={"ID":"ef536ee3-bbdb-4c17-b244-c88cdda13d41","Type":"ContainerStarted","Data":"ef7d119811da274151d0334cafee53a5beaec183cef2998582ed9162739d6315"} Dec 05 22:25:57 crc kubenswrapper[4747]: I1205 22:25:57.193972 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-engine-b947f8656-gpbz8" event={"ID":"3863f9de-d316-444c-9417-41b460fd21c5","Type":"ContainerStarted","Data":"8a2defa804a5c89910bf9cb69230ff4e1ca03ac0de6622a680544f9c7889ee31"} Dec 05 22:25:57 crc kubenswrapper[4747]: I1205 22:25:57.195438 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7765dc6db9-khj9s" event={"ID":"0eb63f02-f4e5-483d-8070-521435e159f1","Type":"ContainerStarted","Data":"134de597b540b75d5585b9ca5d57228d523715c67e2d9fbd15810998a2f311dc"} Dec 05 22:25:58 crc kubenswrapper[4747]: I1205 22:25:58.204538 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-b947f8656-gpbz8" event={"ID":"3863f9de-d316-444c-9417-41b460fd21c5","Type":"ContainerStarted","Data":"f4f48c125f21d24c0680dd78515088fa84335c88acfe089934aae167944fa6ab"} Dec 05 22:25:58 crc kubenswrapper[4747]: I1205 22:25:58.205330 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-b947f8656-gpbz8" Dec 05 22:25:58 crc kubenswrapper[4747]: I1205 22:25:58.235659 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-b947f8656-gpbz8" podStartSLOduration=3.235640568 podStartE2EDuration="3.235640568s" podCreationTimestamp="2025-12-05 22:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:25:58.226839339 +0000 UTC m=+6228.694146827" watchObservedRunningTime="2025-12-05 22:25:58.235640568 +0000 UTC m=+6228.702948056" Dec 05 22:25:59 crc kubenswrapper[4747]: I1205 22:25:59.215007 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7765dc6db9-khj9s" event={"ID":"0eb63f02-f4e5-483d-8070-521435e159f1","Type":"ContainerStarted","Data":"6a6179a0af2ff84dbb3f1388b8556febc38ac7c9a659b9a9a445cb5d3ff4fc86"} Dec 05 22:25:59 crc kubenswrapper[4747]: I1205 22:25:59.215536 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:25:59 crc kubenswrapper[4747]: I1205 22:25:59.216528 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7cd4549b77-669hj" event={"ID":"ef536ee3-bbdb-4c17-b244-c88cdda13d41","Type":"ContainerStarted","Data":"abd0ee38367e950b5db78545e133515f21208d2c9d842b866a3cfc4f245350b8"} Dec 05 22:25:59 crc kubenswrapper[4747]: I1205 22:25:59.239063 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7765dc6db9-khj9s" podStartSLOduration=2.059498407 podStartE2EDuration="4.239045915s" podCreationTimestamp="2025-12-05 22:25:55 +0000 UTC" firstStartedPulling="2025-12-05 22:25:56.638698668 +0000 UTC m=+6227.106006156" lastFinishedPulling="2025-12-05 22:25:58.818246186 +0000 UTC m=+6229.285553664" observedRunningTime="2025-12-05 22:25:59.229408616 +0000 UTC m=+6229.696716104" watchObservedRunningTime="2025-12-05 22:25:59.239045915 +0000 UTC m=+6229.706353393" Dec 05 22:25:59 crc kubenswrapper[4747]: I1205 22:25:59.254483 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7cd4549b77-669hj" podStartSLOduration=1.895640846 podStartE2EDuration="4.254463947s" podCreationTimestamp="2025-12-05 22:25:55 +0000 UTC" firstStartedPulling="2025-12-05 22:25:56.455957399 +0000 UTC m=+6226.923264887" lastFinishedPulling="2025-12-05 22:25:58.8147805 +0000 UTC m=+6229.282087988" observedRunningTime="2025-12-05 22:25:59.249000992 +0000 UTC m=+6229.716308480" 
watchObservedRunningTime="2025-12-05 22:25:59.254463947 +0000 UTC m=+6229.721771435" Dec 05 22:26:00 crc kubenswrapper[4747]: I1205 22:26:00.265121 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.344967 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-7579685f87-4bh5x"] Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.346728 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7579685f87-4bh5x" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.360069 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7579685f87-4bh5x"] Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.373213 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-98878b4f5-vwqb6"] Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.374537 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-98878b4f5-vwqb6" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.382383 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6c4f4fb6b4-sptkl"] Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.383545 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.420846 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6c4f4fb6b4-sptkl"] Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.464499 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-98878b4f5-vwqb6"] Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.511485 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-config-data\") pod \"heat-api-98878b4f5-vwqb6\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") " pod="openstack/heat-api-98878b4f5-vwqb6" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.511530 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrm6m\" (UniqueName: \"kubernetes.io/projected/0388fe0f-6132-450a-a8e0-038f04d0c73c-kube-api-access-hrm6m\") pod \"heat-engine-7579685f87-4bh5x\" (UID: \"0388fe0f-6132-450a-a8e0-038f04d0c73c\") " pod="openstack/heat-engine-7579685f87-4bh5x" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.511566 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmc2f\" (UniqueName: \"kubernetes.io/projected/df13486a-dbe1-4e12-a17d-7ae86f421788-kube-api-access-xmc2f\") pod \"heat-cfnapi-6c4f4fb6b4-sptkl\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") " pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.511681 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw6nw\" (UniqueName: \"kubernetes.io/projected/e1c9960b-9312-47ae-b9df-77a915c8ccde-kube-api-access-cw6nw\") pod \"heat-api-98878b4f5-vwqb6\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") " pod="openstack/heat-api-98878b4f5-vwqb6" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.511842 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-config-data\") pod \"heat-cfnapi-6c4f4fb6b4-sptkl\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") " pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.511933 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0388fe0f-6132-450a-a8e0-038f04d0c73c-config-data-custom\") pod \"heat-engine-7579685f87-4bh5x\" (UID: \"0388fe0f-6132-450a-a8e0-038f04d0c73c\") " pod="openstack/heat-engine-7579685f87-4bh5x" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.511972 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-config-data-custom\") pod \"heat-api-98878b4f5-vwqb6\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") " pod="openstack/heat-api-98878b4f5-vwqb6" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.511995 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-config-data-custom\") pod \"heat-cfnapi-6c4f4fb6b4-sptkl\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") " pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.512105 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-combined-ca-bundle\") pod \"heat-api-98878b4f5-vwqb6\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") " pod="openstack/heat-api-98878b4f5-vwqb6" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.512134 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-combined-ca-bundle\") pod \"heat-cfnapi-6c4f4fb6b4-sptkl\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") " pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.512180 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0388fe0f-6132-450a-a8e0-038f04d0c73c-config-data\") pod \"heat-engine-7579685f87-4bh5x\" (UID: \"0388fe0f-6132-450a-a8e0-038f04d0c73c\") " pod="openstack/heat-engine-7579685f87-4bh5x" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.512271 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0388fe0f-6132-450a-a8e0-038f04d0c73c-combined-ca-bundle\") pod \"heat-engine-7579685f87-4bh5x\" (UID: \"0388fe0f-6132-450a-a8e0-038f04d0c73c\") " pod="openstack/heat-engine-7579685f87-4bh5x" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.613884 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-config-data\") pod \"heat-api-98878b4f5-vwqb6\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") " pod="openstack/heat-api-98878b4f5-vwqb6" Dec 05 22:26:02 
crc kubenswrapper[4747]: I1205 22:26:02.613961 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrm6m\" (UniqueName: \"kubernetes.io/projected/0388fe0f-6132-450a-a8e0-038f04d0c73c-kube-api-access-hrm6m\") pod \"heat-engine-7579685f87-4bh5x\" (UID: \"0388fe0f-6132-450a-a8e0-038f04d0c73c\") " pod="openstack/heat-engine-7579685f87-4bh5x" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.614016 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmc2f\" (UniqueName: \"kubernetes.io/projected/df13486a-dbe1-4e12-a17d-7ae86f421788-kube-api-access-xmc2f\") pod \"heat-cfnapi-6c4f4fb6b4-sptkl\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") " pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.614048 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw6nw\" (UniqueName: \"kubernetes.io/projected/e1c9960b-9312-47ae-b9df-77a915c8ccde-kube-api-access-cw6nw\") pod \"heat-api-98878b4f5-vwqb6\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") " pod="openstack/heat-api-98878b4f5-vwqb6" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.614112 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-config-data\") pod \"heat-cfnapi-6c4f4fb6b4-sptkl\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") " pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.614164 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0388fe0f-6132-450a-a8e0-038f04d0c73c-config-data-custom\") pod \"heat-engine-7579685f87-4bh5x\" (UID: \"0388fe0f-6132-450a-a8e0-038f04d0c73c\") " pod="openstack/heat-engine-7579685f87-4bh5x" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.614193 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-config-data-custom\") pod \"heat-api-98878b4f5-vwqb6\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") " pod="openstack/heat-api-98878b4f5-vwqb6" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.614217 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-config-data-custom\") pod \"heat-cfnapi-6c4f4fb6b4-sptkl\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") " pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.614277 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-combined-ca-bundle\") pod \"heat-api-98878b4f5-vwqb6\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") " pod="openstack/heat-api-98878b4f5-vwqb6" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.614303 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-combined-ca-bundle\") pod \"heat-cfnapi-6c4f4fb6b4-sptkl\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") " pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" Dec 05 22:26:02 crc 
kubenswrapper[4747]: I1205 22:26:02.614337 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0388fe0f-6132-450a-a8e0-038f04d0c73c-config-data\") pod \"heat-engine-7579685f87-4bh5x\" (UID: \"0388fe0f-6132-450a-a8e0-038f04d0c73c\") " pod="openstack/heat-engine-7579685f87-4bh5x" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.614374 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0388fe0f-6132-450a-a8e0-038f04d0c73c-combined-ca-bundle\") pod \"heat-engine-7579685f87-4bh5x\" (UID: \"0388fe0f-6132-450a-a8e0-038f04d0c73c\") " pod="openstack/heat-engine-7579685f87-4bh5x" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.620530 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-combined-ca-bundle\") pod \"heat-api-98878b4f5-vwqb6\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") " pod="openstack/heat-api-98878b4f5-vwqb6" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.633491 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0388fe0f-6132-450a-a8e0-038f04d0c73c-config-data\") pod \"heat-engine-7579685f87-4bh5x\" (UID: \"0388fe0f-6132-450a-a8e0-038f04d0c73c\") " pod="openstack/heat-engine-7579685f87-4bh5x" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.634383 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0388fe0f-6132-450a-a8e0-038f04d0c73c-combined-ca-bundle\") pod \"heat-engine-7579685f87-4bh5x\" (UID: \"0388fe0f-6132-450a-a8e0-038f04d0c73c\") " pod="openstack/heat-engine-7579685f87-4bh5x" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.634595 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-config-data-custom\") pod \"heat-cfnapi-6c4f4fb6b4-sptkl\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") " pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.634632 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-config-data-custom\") pod \"heat-api-98878b4f5-vwqb6\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") " pod="openstack/heat-api-98878b4f5-vwqb6" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.634664 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-config-data\") pod \"heat-api-98878b4f5-vwqb6\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") " pod="openstack/heat-api-98878b4f5-vwqb6" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.635359 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-combined-ca-bundle\") pod \"heat-cfnapi-6c4f4fb6b4-sptkl\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") " pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.635599 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0388fe0f-6132-450a-a8e0-038f04d0c73c-config-data-custom\") pod \"heat-engine-7579685f87-4bh5x\" (UID: \"0388fe0f-6132-450a-a8e0-038f04d0c73c\") " pod="openstack/heat-engine-7579685f87-4bh5x" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.636418 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-config-data\") pod \"heat-cfnapi-6c4f4fb6b4-sptkl\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") " pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.637214 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrm6m\" (UniqueName: \"kubernetes.io/projected/0388fe0f-6132-450a-a8e0-038f04d0c73c-kube-api-access-hrm6m\") pod \"heat-engine-7579685f87-4bh5x\" (UID: \"0388fe0f-6132-450a-a8e0-038f04d0c73c\") " pod="openstack/heat-engine-7579685f87-4bh5x" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.637507 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw6nw\" (UniqueName: \"kubernetes.io/projected/e1c9960b-9312-47ae-b9df-77a915c8ccde-kube-api-access-cw6nw\") pod \"heat-api-98878b4f5-vwqb6\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") " pod="openstack/heat-api-98878b4f5-vwqb6" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.638170 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmc2f\" (UniqueName: \"kubernetes.io/projected/df13486a-dbe1-4e12-a17d-7ae86f421788-kube-api-access-xmc2f\") pod \"heat-cfnapi-6c4f4fb6b4-sptkl\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") " pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.728519 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7579685f87-4bh5x" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.752184 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-98878b4f5-vwqb6" Dec 05 22:26:02 crc kubenswrapper[4747]: I1205 22:26:02.760209 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" Dec 05 22:26:03 crc kubenswrapper[4747]: W1205 22:26:03.307300 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf13486a_dbe1_4e12_a17d_7ae86f421788.slice/crio-e7ae30236bf5586dd8e02d4a4865f53a9e619c1901c87793800f4ee0b2d72da8 WatchSource:0}: Error finding container e7ae30236bf5586dd8e02d4a4865f53a9e619c1901c87793800f4ee0b2d72da8: Status 404 returned error can't find the container with id e7ae30236bf5586dd8e02d4a4865f53a9e619c1901c87793800f4ee0b2d72da8 Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.311686 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6c4f4fb6b4-sptkl"] Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.488402 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7765dc6db9-khj9s"] Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.489158 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-7765dc6db9-khj9s" podUID="0eb63f02-f4e5-483d-8070-521435e159f1" containerName="heat-api" containerID="cri-o://6a6179a0af2ff84dbb3f1388b8556febc38ac7c9a659b9a9a445cb5d3ff4fc86" gracePeriod=60 Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.532709 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7cd4549b77-669hj"] Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.532952 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-7cd4549b77-669hj" podUID="ef536ee3-bbdb-4c17-b244-c88cdda13d41" containerName="heat-cfnapi" containerID="cri-o://abd0ee38367e950b5db78545e133515f21208d2c9d842b866a3cfc4f245350b8" gracePeriod=60 Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.553563 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-798d94bcc6-bvzqv"] Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.555158 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.566000 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.566511 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.588812 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-798d94bcc6-bvzqv"] Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.631666 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-cb88b49d8-v5mjj"] Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.656123 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-cb88b49d8-v5mjj"] Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.656232 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.662893 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pvcp\" (UniqueName: \"kubernetes.io/projected/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-kube-api-access-2pvcp\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.663066 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-public-tls-certs\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.663390 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-config-data\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.663536 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-combined-ca-bundle\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.663716 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-config-data-custom\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.663851 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-internal-tls-certs\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.664227 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.670075 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.765904 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa007bc-2c4d-406b-a962-5c82ec256692-internal-tls-certs\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.765947 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa007bc-2c4d-406b-a962-5c82ec256692-public-tls-certs\") pod 
\"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.765976 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-config-data\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.766004 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa007bc-2c4d-406b-a962-5c82ec256692-combined-ca-bundle\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.766038 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-combined-ca-bundle\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.766079 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7hjn\" (UniqueName: \"kubernetes.io/projected/2aa007bc-2c4d-406b-a962-5c82ec256692-kube-api-access-x7hjn\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.766095 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-config-data-custom\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.766127 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-internal-tls-certs\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.766141 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aa007bc-2c4d-406b-a962-5c82ec256692-config-data\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.766202 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2aa007bc-2c4d-406b-a962-5c82ec256692-config-data-custom\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.766222 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pvcp\" (UniqueName: 
\"kubernetes.io/projected/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-kube-api-access-2pvcp\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.766300 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-public-tls-certs\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.772509 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-internal-tls-certs\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.772674 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-combined-ca-bundle\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.782375 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-config-data-custom\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.782807 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-public-tls-certs\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.786481 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-config-data\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.787450 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pvcp\" (UniqueName: \"kubernetes.io/projected/557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9-kube-api-access-2pvcp\") pod \"heat-api-798d94bcc6-bvzqv\" (UID: \"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9\") " pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.869725 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7hjn\" (UniqueName: \"kubernetes.io/projected/2aa007bc-2c4d-406b-a962-5c82ec256692-kube-api-access-x7hjn\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.869889 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aa007bc-2c4d-406b-a962-5c82ec256692-config-data\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" 
(UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.870062 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2aa007bc-2c4d-406b-a962-5c82ec256692-config-data-custom\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.870257 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa007bc-2c4d-406b-a962-5c82ec256692-internal-tls-certs\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.870351 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa007bc-2c4d-406b-a962-5c82ec256692-public-tls-certs\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.870684 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa007bc-2c4d-406b-a962-5c82ec256692-combined-ca-bundle\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.876681 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa007bc-2c4d-406b-a962-5c82ec256692-combined-ca-bundle\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.879873 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2aa007bc-2c4d-406b-a962-5c82ec256692-config-data-custom\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.881005 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aa007bc-2c4d-406b-a962-5c82ec256692-config-data\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.881529 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa007bc-2c4d-406b-a962-5c82ec256692-internal-tls-certs\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.882692 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa007bc-2c4d-406b-a962-5c82ec256692-public-tls-certs\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " 
pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.887654 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7hjn\" (UniqueName: \"kubernetes.io/projected/2aa007bc-2c4d-406b-a962-5c82ec256692-kube-api-access-x7hjn\") pod \"heat-cfnapi-cb88b49d8-v5mjj\" (UID: \"2aa007bc-2c4d-406b-a962-5c82ec256692\") " pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.891304 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:03 crc kubenswrapper[4747]: I1205 22:26:03.988373 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-98878b4f5-vwqb6"] Dec 05 22:26:03 crc kubenswrapper[4747]: W1205 22:26:03.997471 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0388fe0f_6132_450a_a8e0_038f04d0c73c.slice/crio-eeaed0ae65d3554980dfd6bd90dd598f34b93585f5849ec4ccca50b4067c6a46 WatchSource:0}: Error finding container eeaed0ae65d3554980dfd6bd90dd598f34b93585f5849ec4ccca50b4067c6a46: Status 404 returned error can't find the container with id eeaed0ae65d3554980dfd6bd90dd598f34b93585f5849ec4ccca50b4067c6a46 Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.012608 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7579685f87-4bh5x"] Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.017560 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.260161 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.308465 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-98878b4f5-vwqb6" event={"ID":"e1c9960b-9312-47ae-b9df-77a915c8ccde","Type":"ContainerStarted","Data":"391469927d72fa12c68a383bdb3072a3ae22f628ff76758929c087019b61ad6a"} Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.310510 4747 generic.go:334] "Generic (PLEG): container finished" podID="0eb63f02-f4e5-483d-8070-521435e159f1" containerID="6a6179a0af2ff84dbb3f1388b8556febc38ac7c9a659b9a9a445cb5d3ff4fc86" exitCode=0 Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.310592 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7765dc6db9-khj9s" event={"ID":"0eb63f02-f4e5-483d-8070-521435e159f1","Type":"ContainerDied","Data":"6a6179a0af2ff84dbb3f1388b8556febc38ac7c9a659b9a9a445cb5d3ff4fc86"} Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.310621 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7765dc6db9-khj9s" event={"ID":"0eb63f02-f4e5-483d-8070-521435e159f1","Type":"ContainerDied","Data":"134de597b540b75d5585b9ca5d57228d523715c67e2d9fbd15810998a2f311dc"} Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.310638 4747 scope.go:117] "RemoveContainer" containerID="6a6179a0af2ff84dbb3f1388b8556febc38ac7c9a659b9a9a445cb5d3ff4fc86" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.310805 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7765dc6db9-khj9s" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.319899 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7579685f87-4bh5x" event={"ID":"0388fe0f-6132-450a-a8e0-038f04d0c73c","Type":"ContainerStarted","Data":"b44d2fd16fc94a8667dcd13d282c1d2cbd7a9cf2b599c974ccbade4c518c03c3"} Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.319940 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7579685f87-4bh5x" event={"ID":"0388fe0f-6132-450a-a8e0-038f04d0c73c","Type":"ContainerStarted","Data":"eeaed0ae65d3554980dfd6bd90dd598f34b93585f5849ec4ccca50b4067c6a46"} Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.321120 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-7579685f87-4bh5x" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.326495 4747 generic.go:334] "Generic (PLEG): container finished" podID="ef536ee3-bbdb-4c17-b244-c88cdda13d41" containerID="abd0ee38367e950b5db78545e133515f21208d2c9d842b866a3cfc4f245350b8" exitCode=0 Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.326547 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7cd4549b77-669hj" event={"ID":"ef536ee3-bbdb-4c17-b244-c88cdda13d41","Type":"ContainerDied","Data":"abd0ee38367e950b5db78545e133515f21208d2c9d842b866a3cfc4f245350b8"} Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.332745 4747 generic.go:334] "Generic (PLEG): container finished" podID="df13486a-dbe1-4e12-a17d-7ae86f421788" containerID="74eac708e00e4fbef109ee49aecc9235bc511d3c5bdde4a981501bd5a9b992ff" exitCode=1 Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.332781 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" event={"ID":"df13486a-dbe1-4e12-a17d-7ae86f421788","Type":"ContainerDied","Data":"74eac708e00e4fbef109ee49aecc9235bc511d3c5bdde4a981501bd5a9b992ff"} Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.332802 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" event={"ID":"df13486a-dbe1-4e12-a17d-7ae86f421788","Type":"ContainerStarted","Data":"e7ae30236bf5586dd8e02d4a4865f53a9e619c1901c87793800f4ee0b2d72da8"} Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.333464 4747 scope.go:117] "RemoveContainer" containerID="74eac708e00e4fbef109ee49aecc9235bc511d3c5bdde4a981501bd5a9b992ff" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.339456 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-7579685f87-4bh5x" podStartSLOduration=2.339442034 podStartE2EDuration="2.339442034s" podCreationTimestamp="2025-12-05 22:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:26:04.337544007 +0000 UTC m=+6234.804851505" watchObservedRunningTime="2025-12-05 22:26:04.339442034 +0000 UTC m=+6234.806749522" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.353917 4747 scope.go:117] "RemoveContainer" containerID="6a6179a0af2ff84dbb3f1388b8556febc38ac7c9a659b9a9a445cb5d3ff4fc86" Dec 05 22:26:04 crc kubenswrapper[4747]: E1205 22:26:04.354343 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a6179a0af2ff84dbb3f1388b8556febc38ac7c9a659b9a9a445cb5d3ff4fc86\": container with ID starting with 
6a6179a0af2ff84dbb3f1388b8556febc38ac7c9a659b9a9a445cb5d3ff4fc86 not found: ID does not exist" containerID="6a6179a0af2ff84dbb3f1388b8556febc38ac7c9a659b9a9a445cb5d3ff4fc86" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.354399 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a6179a0af2ff84dbb3f1388b8556febc38ac7c9a659b9a9a445cb5d3ff4fc86"} err="failed to get container status \"6a6179a0af2ff84dbb3f1388b8556febc38ac7c9a659b9a9a445cb5d3ff4fc86\": rpc error: code = NotFound desc = could not find container \"6a6179a0af2ff84dbb3f1388b8556febc38ac7c9a659b9a9a445cb5d3ff4fc86\": container with ID starting with 6a6179a0af2ff84dbb3f1388b8556febc38ac7c9a659b9a9a445cb5d3ff4fc86 not found: ID does not exist" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.397377 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-config-data-custom\") pod \"0eb63f02-f4e5-483d-8070-521435e159f1\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.397450 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-combined-ca-bundle\") pod \"0eb63f02-f4e5-483d-8070-521435e159f1\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.397626 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-config-data\") pod \"0eb63f02-f4e5-483d-8070-521435e159f1\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.397665 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h68kz\" (UniqueName: \"kubernetes.io/projected/0eb63f02-f4e5-483d-8070-521435e159f1-kube-api-access-h68kz\") pod \"0eb63f02-f4e5-483d-8070-521435e159f1\" (UID: \"0eb63f02-f4e5-483d-8070-521435e159f1\") " Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.403340 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0eb63f02-f4e5-483d-8070-521435e159f1" (UID: "0eb63f02-f4e5-483d-8070-521435e159f1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.416775 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eb63f02-f4e5-483d-8070-521435e159f1-kube-api-access-h68kz" (OuterVolumeSpecName: "kube-api-access-h68kz") pod "0eb63f02-f4e5-483d-8070-521435e159f1" (UID: "0eb63f02-f4e5-483d-8070-521435e159f1"). InnerVolumeSpecName "kube-api-access-h68kz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.444719 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0eb63f02-f4e5-483d-8070-521435e159f1" (UID: "0eb63f02-f4e5-483d-8070-521435e159f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.483717 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-config-data" (OuterVolumeSpecName: "config-data") pod "0eb63f02-f4e5-483d-8070-521435e159f1" (UID: "0eb63f02-f4e5-483d-8070-521435e159f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.500769 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.501055 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h68kz\" (UniqueName: \"kubernetes.io/projected/0eb63f02-f4e5-483d-8070-521435e159f1-kube-api-access-h68kz\") on node \"crc\" DevicePath \"\"" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.501066 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.501075 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eb63f02-f4e5-483d-8070-521435e159f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.625103 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-798d94bcc6-bvzqv"] Dec 05 22:26:04 crc kubenswrapper[4747]: I1205 22:26:04.738017 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-cb88b49d8-v5mjj"] Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.105977 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7cd4549b77-669hj" Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.149440 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7765dc6db9-khj9s"] Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.165884 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7765dc6db9-khj9s"] Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.214667 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mrtz\" (UniqueName: \"kubernetes.io/projected/ef536ee3-bbdb-4c17-b244-c88cdda13d41-kube-api-access-2mrtz\") pod \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\" (UID: \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.214816 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-config-data\") pod \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\" (UID: \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.214888 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-combined-ca-bundle\") pod \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\" (UID: \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.215001 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-config-data-custom\") pod \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\" (UID: \"ef536ee3-bbdb-4c17-b244-c88cdda13d41\") " Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.220707 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ef536ee3-bbdb-4c17-b244-c88cdda13d41" (UID: "ef536ee3-bbdb-4c17-b244-c88cdda13d41"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.221203 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef536ee3-bbdb-4c17-b244-c88cdda13d41-kube-api-access-2mrtz" (OuterVolumeSpecName: "kube-api-access-2mrtz") pod "ef536ee3-bbdb-4c17-b244-c88cdda13d41" (UID: "ef536ee3-bbdb-4c17-b244-c88cdda13d41"). InnerVolumeSpecName "kube-api-access-2mrtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.248735 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef536ee3-bbdb-4c17-b244-c88cdda13d41" (UID: "ef536ee3-bbdb-4c17-b244-c88cdda13d41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.286299 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-config-data" (OuterVolumeSpecName: "config-data") pod "ef536ee3-bbdb-4c17-b244-c88cdda13d41" (UID: "ef536ee3-bbdb-4c17-b244-c88cdda13d41"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.316869 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.316901 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mrtz\" (UniqueName: \"kubernetes.io/projected/ef536ee3-bbdb-4c17-b244-c88cdda13d41-kube-api-access-2mrtz\") on node \"crc\" DevicePath \"\"" Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.316914 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.316921 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef536ee3-bbdb-4c17-b244-c88cdda13d41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.344935 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" event={"ID":"2aa007bc-2c4d-406b-a962-5c82ec256692","Type":"ContainerStarted","Data":"d478e6f717a8f7316a67b6b30deb2018462f9eb425d75f5f803d243d866856c9"} Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.344992 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" event={"ID":"2aa007bc-2c4d-406b-a962-5c82ec256692","Type":"ContainerStarted","Data":"a87bfe65617d4ea6ac417cbbe9fbd7b8e8881d0bb17e0806bc7de230835b24da"} Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.346663 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.348382 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7cd4549b77-669hj" event={"ID":"ef536ee3-bbdb-4c17-b244-c88cdda13d41","Type":"ContainerDied","Data":"ef7d119811da274151d0334cafee53a5beaec183cef2998582ed9162739d6315"} Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.348436 4747 scope.go:117] "RemoveContainer" containerID="abd0ee38367e950b5db78545e133515f21208d2c9d842b866a3cfc4f245350b8" Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.348551 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7cd4549b77-669hj"
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.352578 4747 generic.go:334] "Generic (PLEG): container finished" podID="df13486a-dbe1-4e12-a17d-7ae86f421788" containerID="4fc3cf09c12099dd9cf67a222f275dcaf5b62b7221f84f7a8145761d5659d977" exitCode=1
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.352644 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" event={"ID":"df13486a-dbe1-4e12-a17d-7ae86f421788","Type":"ContainerDied","Data":"4fc3cf09c12099dd9cf67a222f275dcaf5b62b7221f84f7a8145761d5659d977"}
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.353271 4747 scope.go:117] "RemoveContainer" containerID="4fc3cf09c12099dd9cf67a222f275dcaf5b62b7221f84f7a8145761d5659d977"
Dec 05 22:26:05 crc kubenswrapper[4747]: E1205 22:26:05.353495 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6c4f4fb6b4-sptkl_openstack(df13486a-dbe1-4e12-a17d-7ae86f421788)\"" pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" podUID="df13486a-dbe1-4e12-a17d-7ae86f421788"
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.357853 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-798d94bcc6-bvzqv" event={"ID":"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9","Type":"ContainerStarted","Data":"ef22cb7426f5b3fecdbf6730231d61ca242c174ecd9a84a9509d3cba6ad92d27"}
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.357898 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-798d94bcc6-bvzqv" event={"ID":"557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9","Type":"ContainerStarted","Data":"98478a6e813d8cd722fbaf79e1cb3171561e0d30f5a021d888bdecd1c34a84a9"}
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.358864 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-798d94bcc6-bvzqv"
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.360567 4747 generic.go:334] "Generic (PLEG): container finished" podID="e1c9960b-9312-47ae-b9df-77a915c8ccde" containerID="1a37e36e8e4e7f49d1e221da854b56fb35eeae0433294200fe8ae98c615e4671" exitCode=1
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.360609 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-98878b4f5-vwqb6" event={"ID":"e1c9960b-9312-47ae-b9df-77a915c8ccde","Type":"ContainerDied","Data":"1a37e36e8e4e7f49d1e221da854b56fb35eeae0433294200fe8ae98c615e4671"}
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.360948 4747 scope.go:117] "RemoveContainer" containerID="1a37e36e8e4e7f49d1e221da854b56fb35eeae0433294200fe8ae98c615e4671"
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.371909 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" podStartSLOduration=2.371882393 podStartE2EDuration="2.371882393s" podCreationTimestamp="2025-12-05 22:26:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:26:05.367882694 +0000 UTC m=+6235.835190192" watchObservedRunningTime="2025-12-05 22:26:05.371882393 +0000 UTC m=+6235.839189881"
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.385811 4747 scope.go:117] "RemoveContainer" containerID="74eac708e00e4fbef109ee49aecc9235bc511d3c5bdde4a981501bd5a9b992ff"
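
Both replacement pods are now crash-looping: the PLEG entries above show the heat-cfnapi and heat-api containers exiting with code 1 immediately after starting, and pod_workers.go begins delaying restarts ("back-off 10s restarting failed container..."). A rough sketch of how that delay grows if the crashes continue, assuming the kubelet defaults of a 10s initial back-off that doubles per consecutive crash and caps at 5m (neither the doubling nor the cap is visible in this excerpt):

    // Sketch of the kubelet's crash-loop restart back-off, under the assumed
    // defaults behind the "back-off 10s restarting failed container" errors.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay, maxDelay := 10*time.Second, 5*time.Minute
        for crash := 1; crash <= 8; crash++ {
            fmt.Printf("crash #%d -> next restart in %s\n", crash, delay)
            delay *= 2 // delay doubles after each consecutive crash...
            if delay > maxDelay {
                delay = maxDelay // ...and holds at the cap
            }
        }
    }

On that schedule the delays run 10s, 20s, 40s, 1m20s, 2m40s, then hold at 5m; a sufficiently long successful run resets the timer (again a default, assumed here rather than read from this node's kubelet config).
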
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.413788 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-798d94bcc6-bvzqv" podStartSLOduration=2.413767281 podStartE2EDuration="2.413767281s" podCreationTimestamp="2025-12-05 22:26:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:26:05.405636019 +0000 UTC m=+6235.872943527" watchObservedRunningTime="2025-12-05 22:26:05.413767281 +0000 UTC m=+6235.881074769"
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.452751 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7cd4549b77-669hj"]
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.464980 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-7cd4549b77-669hj"]
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.855403 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eb63f02-f4e5-483d-8070-521435e159f1" path="/var/lib/kubelet/pods/0eb63f02-f4e5-483d-8070-521435e159f1/volumes"
Dec 05 22:26:05 crc kubenswrapper[4747]: I1205 22:26:05.855959 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef536ee3-bbdb-4c17-b244-c88cdda13d41" path="/var/lib/kubelet/pods/ef536ee3-bbdb-4c17-b244-c88cdda13d41/volumes"
Dec 05 22:26:06 crc kubenswrapper[4747]: I1205 22:26:06.372478 4747 generic.go:334] "Generic (PLEG): container finished" podID="e1c9960b-9312-47ae-b9df-77a915c8ccde" containerID="df1c2212f0284a81079f6dab19dedbed7fb7f9d00adb1db43b7662a8856937c3" exitCode=1
Dec 05 22:26:06 crc kubenswrapper[4747]: I1205 22:26:06.372565 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-98878b4f5-vwqb6" event={"ID":"e1c9960b-9312-47ae-b9df-77a915c8ccde","Type":"ContainerDied","Data":"df1c2212f0284a81079f6dab19dedbed7fb7f9d00adb1db43b7662a8856937c3"}
Dec 05 22:26:06 crc kubenswrapper[4747]: I1205 22:26:06.372851 4747 scope.go:117] "RemoveContainer" containerID="1a37e36e8e4e7f49d1e221da854b56fb35eeae0433294200fe8ae98c615e4671"
Dec 05 22:26:06 crc kubenswrapper[4747]: I1205 22:26:06.373504 4747 scope.go:117] "RemoveContainer" containerID="df1c2212f0284a81079f6dab19dedbed7fb7f9d00adb1db43b7662a8856937c3"
Dec 05 22:26:06 crc kubenswrapper[4747]: E1205 22:26:06.374293 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-98878b4f5-vwqb6_openstack(e1c9960b-9312-47ae-b9df-77a915c8ccde)\"" pod="openstack/heat-api-98878b4f5-vwqb6" podUID="e1c9960b-9312-47ae-b9df-77a915c8ccde"
Dec 05 22:26:06 crc kubenswrapper[4747]: I1205 22:26:06.377606 4747 scope.go:117] "RemoveContainer" containerID="4fc3cf09c12099dd9cf67a222f275dcaf5b62b7221f84f7a8145761d5659d977"
Dec 05 22:26:06 crc kubenswrapper[4747]: E1205 22:26:06.377946 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6c4f4fb6b4-sptkl_openstack(df13486a-dbe1-4e12-a17d-7ae86f421788)\"" pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" podUID="df13486a-dbe1-4e12-a17d-7ae86f421788"
Dec 05 22:26:07 crc kubenswrapper[4747]: I1205 22:26:07.388883 4747 scope.go:117] "RemoveContainer" containerID="df1c2212f0284a81079f6dab19dedbed7fb7f9d00adb1db43b7662a8856937c3"
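
From here the same RemoveContainer / "Error syncing pod" pair repeats on every sync for both pods, so the loop is easier to see if the stream is reduced to per-pod counts. One way to do that in Go; the sample lines are copies of pod_workers.go entries from this excerpt with the long err strings elided:

    // Count crash-loop back-off errors per pod in kubenswrapper log lines.
    package main

    import (
        "fmt"
        "regexp"
    )

    func main() {
        lines := []string{
            `E1205 22:26:06.374293 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="..." pod="openstack/heat-api-98878b4f5-vwqb6" podUID="e1c9960b-9312-47ae-b9df-77a915c8ccde"`,
            `E1205 22:26:06.377946 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="..." pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" podUID="df13486a-dbe1-4e12-a17d-7ae86f421788"`,
        }
        re := regexp.MustCompile(`"Error syncing pod, skipping".* pod="([^"]+)"`)
        counts := map[string]int{}
        for _, line := range lines {
            if m := re.FindStringSubmatch(line); m != nil {
                counts[m[1]]++ // key: namespace/pod-name
            }
        }
        fmt.Println(counts)
    }

Applied to the full journal rather than these two samples, the same pattern ranks which pods are stuck in back-off.
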
Dec 05 22:26:07 crc kubenswrapper[4747]: E1205 22:26:07.389428 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-98878b4f5-vwqb6_openstack(e1c9960b-9312-47ae-b9df-77a915c8ccde)\"" pod="openstack/heat-api-98878b4f5-vwqb6" podUID="e1c9960b-9312-47ae-b9df-77a915c8ccde"
Dec 05 22:26:07 crc kubenswrapper[4747]: I1205 22:26:07.752983 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-98878b4f5-vwqb6"
Dec 05 22:26:07 crc kubenswrapper[4747]: I1205 22:26:07.753452 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-98878b4f5-vwqb6"
Dec 05 22:26:07 crc kubenswrapper[4747]: I1205 22:26:07.761538 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl"
Dec 05 22:26:07 crc kubenswrapper[4747]: I1205 22:26:07.761802 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl"
Dec 05 22:26:07 crc kubenswrapper[4747]: I1205 22:26:07.762828 4747 scope.go:117] "RemoveContainer" containerID="4fc3cf09c12099dd9cf67a222f275dcaf5b62b7221f84f7a8145761d5659d977"
Dec 05 22:26:07 crc kubenswrapper[4747]: E1205 22:26:07.763138 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6c4f4fb6b4-sptkl_openstack(df13486a-dbe1-4e12-a17d-7ae86f421788)\"" pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" podUID="df13486a-dbe1-4e12-a17d-7ae86f421788"
Dec 05 22:26:08 crc kubenswrapper[4747]: I1205 22:26:08.401293 4747 scope.go:117] "RemoveContainer" containerID="df1c2212f0284a81079f6dab19dedbed7fb7f9d00adb1db43b7662a8856937c3"
Dec 05 22:26:08 crc kubenswrapper[4747]: I1205 22:26:08.402902 4747 scope.go:117] "RemoveContainer" containerID="4fc3cf09c12099dd9cf67a222f275dcaf5b62b7221f84f7a8145761d5659d977"
Dec 05 22:26:08 crc kubenswrapper[4747]: E1205 22:26:08.403431 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-98878b4f5-vwqb6_openstack(e1c9960b-9312-47ae-b9df-77a915c8ccde)\"" pod="openstack/heat-api-98878b4f5-vwqb6" podUID="e1c9960b-9312-47ae-b9df-77a915c8ccde"
Dec 05 22:26:08 crc kubenswrapper[4747]: E1205 22:26:08.403516 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6c4f4fb6b4-sptkl_openstack(df13486a-dbe1-4e12-a17d-7ae86f421788)\"" pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" podUID="df13486a-dbe1-4e12-a17d-7ae86f421788"
Dec 05 22:26:08 crc kubenswrapper[4747]: I1205 22:26:08.430313 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-595bbd95bb-ds75b"
Dec 05 22:26:09 crc kubenswrapper[4747]: I1205 22:26:09.410253 4747 scope.go:117] "RemoveContainer" containerID="df1c2212f0284a81079f6dab19dedbed7fb7f9d00adb1db43b7662a8856937c3"
Dec 05 22:26:09 crc kubenswrapper[4747]: E1205 22:26:09.410542 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api 
pod=heat-api-98878b4f5-vwqb6_openstack(e1c9960b-9312-47ae-b9df-77a915c8ccde)\"" pod="openstack/heat-api-98878b4f5-vwqb6" podUID="e1c9960b-9312-47ae-b9df-77a915c8ccde" Dec 05 22:26:10 crc kubenswrapper[4747]: I1205 22:26:10.207084 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-595bbd95bb-ds75b" Dec 05 22:26:10 crc kubenswrapper[4747]: I1205 22:26:10.288065 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d6bfb9dd6-f9xzd"] Dec 05 22:26:10 crc kubenswrapper[4747]: I1205 22:26:10.288493 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d6bfb9dd6-f9xzd" podUID="48607b00-5383-46cb-9a0e-50f1b1b2cb6c" containerName="horizon-log" containerID="cri-o://ecd6b6af989a58fcf969fabaa59cac445b385a5ee906a698bc3c47ac254ebcaf" gracePeriod=30 Dec 05 22:26:10 crc kubenswrapper[4747]: I1205 22:26:10.288479 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d6bfb9dd6-f9xzd" podUID="48607b00-5383-46cb-9a0e-50f1b1b2cb6c" containerName="horizon" containerID="cri-o://cca07c67efbe86f856864731c3e7e175e1af7b9e35689055628e804307812f13" gracePeriod=30 Dec 05 22:26:13 crc kubenswrapper[4747]: I1205 22:26:13.437037 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6d6bfb9dd6-f9xzd" podUID="48607b00-5383-46cb-9a0e-50f1b1b2cb6c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.116:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:60508->10.217.1.116:8443: read: connection reset by peer" Dec 05 22:26:14 crc kubenswrapper[4747]: I1205 22:26:14.461806 4747 generic.go:334] "Generic (PLEG): container finished" podID="48607b00-5383-46cb-9a0e-50f1b1b2cb6c" containerID="cca07c67efbe86f856864731c3e7e175e1af7b9e35689055628e804307812f13" exitCode=0 Dec 05 22:26:14 crc kubenswrapper[4747]: I1205 22:26:14.461959 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d6bfb9dd6-f9xzd" event={"ID":"48607b00-5383-46cb-9a0e-50f1b1b2cb6c","Type":"ContainerDied","Data":"cca07c67efbe86f856864731c3e7e175e1af7b9e35689055628e804307812f13"} Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.154131 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-798d94bcc6-bvzqv" Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.244172 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-98878b4f5-vwqb6"] Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.375256 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-cb88b49d8-v5mjj" Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.438925 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6c4f4fb6b4-sptkl"] Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.688444 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-98878b4f5-vwqb6"
Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.720901 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-b947f8656-gpbz8"
Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.779346 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-config-data-custom\") pod \"e1c9960b-9312-47ae-b9df-77a915c8ccde\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") "
Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.779396 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-combined-ca-bundle\") pod \"e1c9960b-9312-47ae-b9df-77a915c8ccde\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") "
Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.779430 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw6nw\" (UniqueName: \"kubernetes.io/projected/e1c9960b-9312-47ae-b9df-77a915c8ccde-kube-api-access-cw6nw\") pod \"e1c9960b-9312-47ae-b9df-77a915c8ccde\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") "
Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.779593 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-config-data\") pod \"e1c9960b-9312-47ae-b9df-77a915c8ccde\" (UID: \"e1c9960b-9312-47ae-b9df-77a915c8ccde\") "
Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.811797 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1c9960b-9312-47ae-b9df-77a915c8ccde-kube-api-access-cw6nw" (OuterVolumeSpecName: "kube-api-access-cw6nw") pod "e1c9960b-9312-47ae-b9df-77a915c8ccde" (UID: "e1c9960b-9312-47ae-b9df-77a915c8ccde"). InnerVolumeSpecName "kube-api-access-cw6nw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.832751 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e1c9960b-9312-47ae-b9df-77a915c8ccde" (UID: "e1c9960b-9312-47ae-b9df-77a915c8ccde"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.887446 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.887472 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw6nw\" (UniqueName: \"kubernetes.io/projected/e1c9960b-9312-47ae-b9df-77a915c8ccde-kube-api-access-cw6nw\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.943692 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1c9960b-9312-47ae-b9df-77a915c8ccde" (UID: "e1c9960b-9312-47ae-b9df-77a915c8ccde"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.967524 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl"
Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.975755 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-config-data" (OuterVolumeSpecName: "config-data") pod "e1c9960b-9312-47ae-b9df-77a915c8ccde" (UID: "e1c9960b-9312-47ae-b9df-77a915c8ccde"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.999375 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:15 crc kubenswrapper[4747]: I1205 22:26:15.999412 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c9960b-9312-47ae-b9df-77a915c8ccde-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.101509 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-config-data-custom\") pod \"df13486a-dbe1-4e12-a17d-7ae86f421788\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") "
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.101573 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-combined-ca-bundle\") pod \"df13486a-dbe1-4e12-a17d-7ae86f421788\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") "
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.101660 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmc2f\" (UniqueName: \"kubernetes.io/projected/df13486a-dbe1-4e12-a17d-7ae86f421788-kube-api-access-xmc2f\") pod \"df13486a-dbe1-4e12-a17d-7ae86f421788\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") "
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.101700 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-config-data\") pod \"df13486a-dbe1-4e12-a17d-7ae86f421788\" (UID: \"df13486a-dbe1-4e12-a17d-7ae86f421788\") "
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.104675 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df13486a-dbe1-4e12-a17d-7ae86f421788-kube-api-access-xmc2f" (OuterVolumeSpecName: "kube-api-access-xmc2f") pod "df13486a-dbe1-4e12-a17d-7ae86f421788" (UID: "df13486a-dbe1-4e12-a17d-7ae86f421788"). InnerVolumeSpecName "kube-api-access-xmc2f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.105617 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "df13486a-dbe1-4e12-a17d-7ae86f421788" (UID: "df13486a-dbe1-4e12-a17d-7ae86f421788"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.137022 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df13486a-dbe1-4e12-a17d-7ae86f421788" (UID: "df13486a-dbe1-4e12-a17d-7ae86f421788"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.158416 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-config-data" (OuterVolumeSpecName: "config-data") pod "df13486a-dbe1-4e12-a17d-7ae86f421788" (UID: "df13486a-dbe1-4e12-a17d-7ae86f421788"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.205186 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.205235 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.205248 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmc2f\" (UniqueName: \"kubernetes.io/projected/df13486a-dbe1-4e12-a17d-7ae86f421788-kube-api-access-xmc2f\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.205261 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df13486a-dbe1-4e12-a17d-7ae86f421788-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.504785 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl" event={"ID":"df13486a-dbe1-4e12-a17d-7ae86f421788","Type":"ContainerDied","Data":"e7ae30236bf5586dd8e02d4a4865f53a9e619c1901c87793800f4ee0b2d72da8"}
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.505181 4747 scope.go:117] "RemoveContainer" containerID="4fc3cf09c12099dd9cf67a222f275dcaf5b62b7221f84f7a8145761d5659d977"
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.504850 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6c4f4fb6b4-sptkl"
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.508001 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-98878b4f5-vwqb6" event={"ID":"e1c9960b-9312-47ae-b9df-77a915c8ccde","Type":"ContainerDied","Data":"391469927d72fa12c68a383bdb3072a3ae22f628ff76758929c087019b61ad6a"}
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.508062 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-98878b4f5-vwqb6"
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.536026 4747 scope.go:117] "RemoveContainer" containerID="df1c2212f0284a81079f6dab19dedbed7fb7f9d00adb1db43b7662a8856937c3"
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.558942 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6c4f4fb6b4-sptkl"]
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.568427 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6c4f4fb6b4-sptkl"]
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.578706 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-98878b4f5-vwqb6"]
Dec 05 22:26:16 crc kubenswrapper[4747]: I1205 22:26:16.587638 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-98878b4f5-vwqb6"]
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.855719 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df13486a-dbe1-4e12-a17d-7ae86f421788" path="/var/lib/kubelet/pods/df13486a-dbe1-4e12-a17d-7ae86f421788/volumes"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.857348 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1c9960b-9312-47ae-b9df-77a915c8ccde" path="/var/lib/kubelet/pods/e1c9960b-9312-47ae-b9df-77a915c8ccde/volumes"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.928831 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zp76h"]
Dec 05 22:26:17 crc kubenswrapper[4747]: E1205 22:26:17.929975 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df13486a-dbe1-4e12-a17d-7ae86f421788" containerName="heat-cfnapi"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.930019 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="df13486a-dbe1-4e12-a17d-7ae86f421788" containerName="heat-cfnapi"
Dec 05 22:26:17 crc kubenswrapper[4747]: E1205 22:26:17.930032 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef536ee3-bbdb-4c17-b244-c88cdda13d41" containerName="heat-cfnapi"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.930041 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef536ee3-bbdb-4c17-b244-c88cdda13d41" containerName="heat-cfnapi"
Dec 05 22:26:17 crc kubenswrapper[4747]: E1205 22:26:17.930088 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb63f02-f4e5-483d-8070-521435e159f1" containerName="heat-api"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.930096 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb63f02-f4e5-483d-8070-521435e159f1" containerName="heat-api"
Dec 05 22:26:17 crc kubenswrapper[4747]: E1205 22:26:17.930114 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c9960b-9312-47ae-b9df-77a915c8ccde" containerName="heat-api"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.930126 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c9960b-9312-47ae-b9df-77a915c8ccde" containerName="heat-api"
Dec 05 22:26:17 crc kubenswrapper[4747]: E1205 22:26:17.930144 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df13486a-dbe1-4e12-a17d-7ae86f421788" containerName="heat-cfnapi"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.930153 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="df13486a-dbe1-4e12-a17d-7ae86f421788" containerName="heat-cfnapi"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.930681 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="df13486a-dbe1-4e12-a17d-7ae86f421788" containerName="heat-cfnapi"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.930701 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c9960b-9312-47ae-b9df-77a915c8ccde" containerName="heat-api"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.930722 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef536ee3-bbdb-4c17-b244-c88cdda13d41" containerName="heat-cfnapi"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.931149 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c9960b-9312-47ae-b9df-77a915c8ccde" containerName="heat-api"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.931185 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eb63f02-f4e5-483d-8070-521435e159f1" containerName="heat-api"
Dec 05 22:26:17 crc kubenswrapper[4747]: E1205 22:26:17.933021 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c9960b-9312-47ae-b9df-77a915c8ccde" containerName="heat-api"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.933049 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c9960b-9312-47ae-b9df-77a915c8ccde" containerName="heat-api"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.933443 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="df13486a-dbe1-4e12-a17d-7ae86f421788" containerName="heat-cfnapi"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.935939 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:17 crc kubenswrapper[4747]: I1205 22:26:17.948054 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zp76h"]
Dec 05 22:26:18 crc kubenswrapper[4747]: I1205 22:26:18.051636 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ea45-account-create-update-bbll6"]
Dec 05 22:26:18 crc kubenswrapper[4747]: I1205 22:26:18.057887 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/291715a4-10ab-445f-84c9-51db14d2feb9-utilities\") pod \"redhat-operators-zp76h\" (UID: \"291715a4-10ab-445f-84c9-51db14d2feb9\") " pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:18 crc kubenswrapper[4747]: I1205 22:26:18.058139 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwkvv\" (UniqueName: \"kubernetes.io/projected/291715a4-10ab-445f-84c9-51db14d2feb9-kube-api-access-dwkvv\") pod \"redhat-operators-zp76h\" (UID: \"291715a4-10ab-445f-84c9-51db14d2feb9\") " pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:18 crc kubenswrapper[4747]: I1205 22:26:18.058214 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/291715a4-10ab-445f-84c9-51db14d2feb9-catalog-content\") pod \"redhat-operators-zp76h\" (UID: \"291715a4-10ab-445f-84c9-51db14d2feb9\") " pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:18 crc kubenswrapper[4747]: I1205 22:26:18.061152 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-25clt"]
Dec 05 22:26:18 crc kubenswrapper[4747]: I1205 22:26:18.069500 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-25clt"]
Dec 05 22:26:18 crc kubenswrapper[4747]: I1205 22:26:18.080974 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ea45-account-create-update-bbll6"]
Dec 05 22:26:18 crc kubenswrapper[4747]: I1205 22:26:18.160207 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwkvv\" (UniqueName: \"kubernetes.io/projected/291715a4-10ab-445f-84c9-51db14d2feb9-kube-api-access-dwkvv\") pod \"redhat-operators-zp76h\" (UID: \"291715a4-10ab-445f-84c9-51db14d2feb9\") " pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:18 crc kubenswrapper[4747]: I1205 22:26:18.160267 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/291715a4-10ab-445f-84c9-51db14d2feb9-catalog-content\") pod \"redhat-operators-zp76h\" (UID: \"291715a4-10ab-445f-84c9-51db14d2feb9\") " pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:18 crc kubenswrapper[4747]: I1205 22:26:18.160341 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/291715a4-10ab-445f-84c9-51db14d2feb9-utilities\") pod \"redhat-operators-zp76h\" (UID: \"291715a4-10ab-445f-84c9-51db14d2feb9\") " pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:18 crc kubenswrapper[4747]: I1205 22:26:18.160866 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/291715a4-10ab-445f-84c9-51db14d2feb9-utilities\") pod \"redhat-operators-zp76h\" (UID: \"291715a4-10ab-445f-84c9-51db14d2feb9\") " pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:18 crc kubenswrapper[4747]: I1205 22:26:18.160940 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/291715a4-10ab-445f-84c9-51db14d2feb9-catalog-content\") pod \"redhat-operators-zp76h\" (UID: \"291715a4-10ab-445f-84c9-51db14d2feb9\") " pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:18 crc kubenswrapper[4747]: I1205 22:26:18.186384 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwkvv\" (UniqueName: \"kubernetes.io/projected/291715a4-10ab-445f-84c9-51db14d2feb9-kube-api-access-dwkvv\") pod \"redhat-operators-zp76h\" (UID: \"291715a4-10ab-445f-84c9-51db14d2feb9\") " pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:18 crc kubenswrapper[4747]: I1205 22:26:18.260347 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:18 crc kubenswrapper[4747]: W1205 22:26:18.751834 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod291715a4_10ab_445f_84c9_51db14d2feb9.slice/crio-ec63430a030112a5b98775318cd3a40b2f1de633b323c7786fbd1dd05bdc10fc WatchSource:0}: Error finding container ec63430a030112a5b98775318cd3a40b2f1de633b323c7786fbd1dd05bdc10fc: Status 404 returned error can't find the container with id ec63430a030112a5b98775318cd3a40b2f1de633b323c7786fbd1dd05bdc10fc
Dec 05 22:26:18 crc kubenswrapper[4747]: I1205 22:26:18.752017 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zp76h"]
Dec 05 22:26:19 crc kubenswrapper[4747]: I1205 22:26:19.546913 4747 generic.go:334] "Generic (PLEG): container finished" podID="291715a4-10ab-445f-84c9-51db14d2feb9" containerID="6a4018b26e024d17e7bc0acfe6e0c23483e475030afe87483ee67449ed293987" exitCode=0
Dec 05 22:26:19 crc kubenswrapper[4747]: I1205 22:26:19.546964 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zp76h" event={"ID":"291715a4-10ab-445f-84c9-51db14d2feb9","Type":"ContainerDied","Data":"6a4018b26e024d17e7bc0acfe6e0c23483e475030afe87483ee67449ed293987"}
Dec 05 22:26:19 crc kubenswrapper[4747]: I1205 22:26:19.546995 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zp76h" event={"ID":"291715a4-10ab-445f-84c9-51db14d2feb9","Type":"ContainerStarted","Data":"ec63430a030112a5b98775318cd3a40b2f1de633b323c7786fbd1dd05bdc10fc"}
Dec 05 22:26:19 crc kubenswrapper[4747]: I1205 22:26:19.859052 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="614a026c-be03-4120-9d35-b2f8e097a0ae" path="/var/lib/kubelet/pods/614a026c-be03-4120-9d35-b2f8e097a0ae/volumes"
Dec 05 22:26:19 crc kubenswrapper[4747]: I1205 22:26:19.861317 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6e06227-0e9f-46ce-92dc-2b780fcb561f" path="/var/lib/kubelet/pods/b6e06227-0e9f-46ce-92dc-2b780fcb561f/volumes"
Dec 05 22:26:20 crc kubenswrapper[4747]: I1205 22:26:20.558720 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zp76h" event={"ID":"291715a4-10ab-445f-84c9-51db14d2feb9","Type":"ContainerStarted","Data":"7ac5eb3a772559765bebc2cd588ce1111de80ce3ae84801b27033adfa50f15f2"}
Dec 05 22:26:21 crc kubenswrapper[4747]: I1205 22:26:21.581468 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6d6bfb9dd6-f9xzd" podUID="48607b00-5383-46cb-9a0e-50f1b1b2cb6c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.116:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.116:8443: connect: connection refused"
Dec 05 22:26:22 crc kubenswrapper[4747]: I1205 22:26:22.578255 4747 generic.go:334] "Generic (PLEG): container finished" podID="291715a4-10ab-445f-84c9-51db14d2feb9" containerID="7ac5eb3a772559765bebc2cd588ce1111de80ce3ae84801b27033adfa50f15f2" exitCode=0
Dec 05 22:26:22 crc kubenswrapper[4747]: I1205 22:26:22.578402 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zp76h" event={"ID":"291715a4-10ab-445f-84c9-51db14d2feb9","Type":"ContainerDied","Data":"7ac5eb3a772559765bebc2cd588ce1111de80ce3ae84801b27033adfa50f15f2"}
Dec 05 22:26:22 crc kubenswrapper[4747]: I1205 22:26:22.764513 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-7579685f87-4bh5x"
Dec 05 22:26:22 crc kubenswrapper[4747]: I1205 22:26:22.813637 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-b947f8656-gpbz8"]
Dec 05 22:26:22 crc kubenswrapper[4747]: I1205 22:26:22.813925 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-b947f8656-gpbz8" podUID="3863f9de-d316-444c-9417-41b460fd21c5" containerName="heat-engine" containerID="cri-o://f4f48c125f21d24c0680dd78515088fa84335c88acfe089934aae167944fa6ab" gracePeriod=60
Dec 05 22:26:23 crc kubenswrapper[4747]: I1205 22:26:23.589566 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zp76h" event={"ID":"291715a4-10ab-445f-84c9-51db14d2feb9","Type":"ContainerStarted","Data":"244141c675e469104b6659d3371b52713872a1b3443dd34bebe983a143372b75"}
Dec 05 22:26:23 crc kubenswrapper[4747]: I1205 22:26:23.619350 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zp76h" podStartSLOduration=2.98761115 podStartE2EDuration="6.619329959s" podCreationTimestamp="2025-12-05 22:26:17 +0000 UTC" firstStartedPulling="2025-12-05 22:26:19.549963714 +0000 UTC m=+6250.017271202" lastFinishedPulling="2025-12-05 22:26:23.181682523 +0000 UTC m=+6253.648990011" observedRunningTime="2025-12-05 22:26:23.613074014 +0000 UTC m=+6254.080381522" watchObservedRunningTime="2025-12-05 22:26:23.619329959 +0000 UTC m=+6254.086637467"
Dec 05 22:26:25 crc kubenswrapper[4747]: E1205 22:26:25.671854 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4f48c125f21d24c0680dd78515088fa84335c88acfe089934aae167944fa6ab" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Dec 05 22:26:25 crc kubenswrapper[4747]: E1205 22:26:25.673799 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4f48c125f21d24c0680dd78515088fa84335c88acfe089934aae167944fa6ab" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Dec 05 22:26:25 crc kubenswrapper[4747]: E1205 22:26:25.675101 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4f48c125f21d24c0680dd78515088fa84335c88acfe089934aae167944fa6ab" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Dec 05 22:26:25 crc kubenswrapper[4747]: E1205 22:26:25.675157 4747 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-b947f8656-gpbz8" podUID="3863f9de-d316-444c-9417-41b460fd21c5" containerName="heat-engine"
Dec 05 22:26:26 crc kubenswrapper[4747]: I1205 22:26:26.038322 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-tnc6l"]
Dec 05 22:26:26 crc kubenswrapper[4747]: I1205 22:26:26.047695 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-tnc6l"]
Dec 05 22:26:27 crc kubenswrapper[4747]: I1205 22:26:27.861952 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e1af63-bbc5-4585-a776-d82e96aeeeb3" path="/var/lib/kubelet/pods/18e1af63-bbc5-4585-a776-d82e96aeeeb3/volumes"
Dec 05 22:26:28 crc kubenswrapper[4747]: I1205 22:26:28.261348 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:28 crc kubenswrapper[4747]: I1205 22:26:28.261782 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:29 crc kubenswrapper[4747]: I1205 22:26:29.334881 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zp76h" podUID="291715a4-10ab-445f-84c9-51db14d2feb9" containerName="registry-server" probeResult="failure" output=<
Dec 05 22:26:29 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s
Dec 05 22:26:29 crc kubenswrapper[4747]: >
Dec 05 22:26:29 crc kubenswrapper[4747]: I1205 22:26:29.658545 4747 generic.go:334] "Generic (PLEG): container finished" podID="3863f9de-d316-444c-9417-41b460fd21c5" containerID="f4f48c125f21d24c0680dd78515088fa84335c88acfe089934aae167944fa6ab" exitCode=0
Dec 05 22:26:29 crc kubenswrapper[4747]: I1205 22:26:29.658599 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-b947f8656-gpbz8" event={"ID":"3863f9de-d316-444c-9417-41b460fd21c5","Type":"ContainerDied","Data":"f4f48c125f21d24c0680dd78515088fa84335c88acfe089934aae167944fa6ab"}
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.038039 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-b947f8656-gpbz8"
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.095220 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27jv5\" (UniqueName: \"kubernetes.io/projected/3863f9de-d316-444c-9417-41b460fd21c5-kube-api-access-27jv5\") pod \"3863f9de-d316-444c-9417-41b460fd21c5\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") "
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.095265 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-config-data-custom\") pod \"3863f9de-d316-444c-9417-41b460fd21c5\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") "
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.095424 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-combined-ca-bundle\") pod \"3863f9de-d316-444c-9417-41b460fd21c5\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") "
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.095529 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-config-data\") pod \"3863f9de-d316-444c-9417-41b460fd21c5\" (UID: \"3863f9de-d316-444c-9417-41b460fd21c5\") "
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.105866 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3863f9de-d316-444c-9417-41b460fd21c5" (UID: "3863f9de-d316-444c-9417-41b460fd21c5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.105958 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3863f9de-d316-444c-9417-41b460fd21c5-kube-api-access-27jv5" (OuterVolumeSpecName: "kube-api-access-27jv5") pod "3863f9de-d316-444c-9417-41b460fd21c5" (UID: "3863f9de-d316-444c-9417-41b460fd21c5"). InnerVolumeSpecName "kube-api-access-27jv5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.125616 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3863f9de-d316-444c-9417-41b460fd21c5" (UID: "3863f9de-d316-444c-9417-41b460fd21c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.155522 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-config-data" (OuterVolumeSpecName: "config-data") pod "3863f9de-d316-444c-9417-41b460fd21c5" (UID: "3863f9de-d316-444c-9417-41b460fd21c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.198260 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27jv5\" (UniqueName: \"kubernetes.io/projected/3863f9de-d316-444c-9417-41b460fd21c5-kube-api-access-27jv5\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.198300 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.198315 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.198328 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3863f9de-d316-444c-9417-41b460fd21c5-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.674820 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-b947f8656-gpbz8" event={"ID":"3863f9de-d316-444c-9417-41b460fd21c5","Type":"ContainerDied","Data":"8a2defa804a5c89910bf9cb69230ff4e1ca03ac0de6622a680544f9c7889ee31"}
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.674885 4747 scope.go:117] "RemoveContainer" containerID="f4f48c125f21d24c0680dd78515088fa84335c88acfe089934aae167944fa6ab"
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.674924 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-b947f8656-gpbz8"
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.725979 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-b947f8656-gpbz8"]
Dec 05 22:26:30 crc kubenswrapper[4747]: I1205 22:26:30.738171 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-b947f8656-gpbz8"]
Dec 05 22:26:31 crc kubenswrapper[4747]: I1205 22:26:31.432218 4747 scope.go:117] "RemoveContainer" containerID="f2be96d521d4474b73bc2ef9d96baa4b8248491bf39497017408c7a4715fd610"
Dec 05 22:26:31 crc kubenswrapper[4747]: I1205 22:26:31.472393 4747 scope.go:117] "RemoveContainer" containerID="14ac20e366a5fb83c294d10c95bfbc68cd3d0dd1164f5f8755ababbf12c25114"
Dec 05 22:26:31 crc kubenswrapper[4747]: I1205 22:26:31.538424 4747 scope.go:117] "RemoveContainer" containerID="2568597de6c1c8f1d8cd23730cfa1cb1ecc2f87dd71da4df836ede7f8cd035ff"
Dec 05 22:26:31 crc kubenswrapper[4747]: I1205 22:26:31.583145 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6d6bfb9dd6-f9xzd" podUID="48607b00-5383-46cb-9a0e-50f1b1b2cb6c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.116:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.116:8443: connect: connection refused"
Dec 05 22:26:31 crc kubenswrapper[4747]: I1205 22:26:31.583315 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d6bfb9dd6-f9xzd"
Dec 05 22:26:31 crc kubenswrapper[4747]: I1205 22:26:31.866210 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3863f9de-d316-444c-9417-41b460fd21c5" path="/var/lib/kubelet/pods/3863f9de-d316-444c-9417-41b460fd21c5/volumes"
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.008764 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bc4sc"]
Dec 05 22:26:35 crc kubenswrapper[4747]: E1205 22:26:35.010142 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3863f9de-d316-444c-9417-41b460fd21c5" containerName="heat-engine"
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.010167 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="3863f9de-d316-444c-9417-41b460fd21c5" containerName="heat-engine"
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.010532 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="3863f9de-d316-444c-9417-41b460fd21c5" containerName="heat-engine"
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.013968 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.021812 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bc4sc"]
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.116161 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da62e916-214e-41a2-b28d-63f45ed22d9f-utilities\") pod \"certified-operators-bc4sc\" (UID: \"da62e916-214e-41a2-b28d-63f45ed22d9f\") " pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.116235 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da62e916-214e-41a2-b28d-63f45ed22d9f-catalog-content\") pod \"certified-operators-bc4sc\" (UID: \"da62e916-214e-41a2-b28d-63f45ed22d9f\") " pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.116414 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmk9f\" (UniqueName: \"kubernetes.io/projected/da62e916-214e-41a2-b28d-63f45ed22d9f-kube-api-access-wmk9f\") pod \"certified-operators-bc4sc\" (UID: \"da62e916-214e-41a2-b28d-63f45ed22d9f\") " pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.218863 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da62e916-214e-41a2-b28d-63f45ed22d9f-catalog-content\") pod \"certified-operators-bc4sc\" (UID: \"da62e916-214e-41a2-b28d-63f45ed22d9f\") " pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.218990 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmk9f\" (UniqueName: \"kubernetes.io/projected/da62e916-214e-41a2-b28d-63f45ed22d9f-kube-api-access-wmk9f\") pod \"certified-operators-bc4sc\" (UID: \"da62e916-214e-41a2-b28d-63f45ed22d9f\") " pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.219190 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da62e916-214e-41a2-b28d-63f45ed22d9f-utilities\") pod \"certified-operators-bc4sc\" (UID: \"da62e916-214e-41a2-b28d-63f45ed22d9f\") " pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.219805 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da62e916-214e-41a2-b28d-63f45ed22d9f-utilities\") pod \"certified-operators-bc4sc\" (UID: \"da62e916-214e-41a2-b28d-63f45ed22d9f\") " pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.220122 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da62e916-214e-41a2-b28d-63f45ed22d9f-catalog-content\") pod \"certified-operators-bc4sc\" (UID: \"da62e916-214e-41a2-b28d-63f45ed22d9f\") " pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.243316 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmk9f\" (UniqueName: \"kubernetes.io/projected/da62e916-214e-41a2-b28d-63f45ed22d9f-kube-api-access-wmk9f\") pod \"certified-operators-bc4sc\" (UID: \"da62e916-214e-41a2-b28d-63f45ed22d9f\") " pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.352812 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:35 crc kubenswrapper[4747]: I1205 22:26:35.890267 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bc4sc"]
Dec 05 22:26:36 crc kubenswrapper[4747]: I1205 22:26:36.746795 4747 generic.go:334] "Generic (PLEG): container finished" podID="da62e916-214e-41a2-b28d-63f45ed22d9f" containerID="7872032c1037c9f953fd47c7ba40c53ed465b3c047aa87889c808a3469ec60ed" exitCode=0
Dec 05 22:26:36 crc kubenswrapper[4747]: I1205 22:26:36.746838 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bc4sc" event={"ID":"da62e916-214e-41a2-b28d-63f45ed22d9f","Type":"ContainerDied","Data":"7872032c1037c9f953fd47c7ba40c53ed465b3c047aa87889c808a3469ec60ed"}
Dec 05 22:26:36 crc kubenswrapper[4747]: I1205 22:26:36.747113 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bc4sc" event={"ID":"da62e916-214e-41a2-b28d-63f45ed22d9f","Type":"ContainerStarted","Data":"a317fdd1769a05af169b760747f68e1295f6fdf533544eb8d2545312e4b0a722"}
Dec 05 22:26:36 crc kubenswrapper[4747]: I1205 22:26:36.750218 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 22:26:38 crc kubenswrapper[4747]: I1205 22:26:38.316559 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:38 crc kubenswrapper[4747]: I1205 22:26:38.375171 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:38 crc kubenswrapper[4747]: I1205 22:26:38.767648 4747 generic.go:334] "Generic (PLEG): container finished" podID="da62e916-214e-41a2-b28d-63f45ed22d9f" containerID="808ba8fe64a3cd55c74c74783830be858fda2b13ac04e627534dd61f1b4f873c" exitCode=0
Dec 05 22:26:38 crc kubenswrapper[4747]: I1205 22:26:38.767820 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bc4sc" event={"ID":"da62e916-214e-41a2-b28d-63f45ed22d9f","Type":"ContainerDied","Data":"808ba8fe64a3cd55c74c74783830be858fda2b13ac04e627534dd61f1b4f873c"}
Dec 05 22:26:39 crc kubenswrapper[4747]: I1205 22:26:39.777383 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zp76h"]
Dec 05 22:26:39 crc kubenswrapper[4747]: I1205 22:26:39.781400 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bc4sc" event={"ID":"da62e916-214e-41a2-b28d-63f45ed22d9f","Type":"ContainerStarted","Data":"69ff4c3c1d6ae13694e25512ab42f7c5ffa905e7abcbec76e2a962f659cb0102"}
Dec 05 22:26:39 crc kubenswrapper[4747]: I1205 22:26:39.781608 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zp76h" podUID="291715a4-10ab-445f-84c9-51db14d2feb9" containerName="registry-server" containerID="cri-o://244141c675e469104b6659d3371b52713872a1b3443dd34bebe983a143372b75" gracePeriod=2
Dec 05 22:26:39 crc kubenswrapper[4747]: I1205 22:26:39.811434 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bc4sc" podStartSLOduration=3.406946754 podStartE2EDuration="5.811397636s" podCreationTimestamp="2025-12-05 22:26:34 +0000 UTC" firstStartedPulling="2025-12-05 22:26:36.74993195 +0000 UTC m=+6267.217239428" lastFinishedPulling="2025-12-05 22:26:39.154382812 +0000 UTC m=+6269.621690310" observedRunningTime="2025-12-05 22:26:39.807080739 +0000 UTC m=+6270.274388247" watchObservedRunningTime="2025-12-05 22:26:39.811397636 +0000 UTC m=+6270.278705154"
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.263120 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.418320 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/291715a4-10ab-445f-84c9-51db14d2feb9-utilities\") pod \"291715a4-10ab-445f-84c9-51db14d2feb9\" (UID: \"291715a4-10ab-445f-84c9-51db14d2feb9\") "
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.418740 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwkvv\" (UniqueName: \"kubernetes.io/projected/291715a4-10ab-445f-84c9-51db14d2feb9-kube-api-access-dwkvv\") pod \"291715a4-10ab-445f-84c9-51db14d2feb9\" (UID: \"291715a4-10ab-445f-84c9-51db14d2feb9\") "
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.419005 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/291715a4-10ab-445f-84c9-51db14d2feb9-catalog-content\") pod \"291715a4-10ab-445f-84c9-51db14d2feb9\" (UID: \"291715a4-10ab-445f-84c9-51db14d2feb9\") "
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.419739 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/291715a4-10ab-445f-84c9-51db14d2feb9-utilities" (OuterVolumeSpecName: "utilities") pod "291715a4-10ab-445f-84c9-51db14d2feb9" (UID: "291715a4-10ab-445f-84c9-51db14d2feb9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.421243 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/291715a4-10ab-445f-84c9-51db14d2feb9-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.424611 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/291715a4-10ab-445f-84c9-51db14d2feb9-kube-api-access-dwkvv" (OuterVolumeSpecName: "kube-api-access-dwkvv") pod "291715a4-10ab-445f-84c9-51db14d2feb9" (UID: "291715a4-10ab-445f-84c9-51db14d2feb9"). InnerVolumeSpecName "kube-api-access-dwkvv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.523151 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwkvv\" (UniqueName: \"kubernetes.io/projected/291715a4-10ab-445f-84c9-51db14d2feb9-kube-api-access-dwkvv\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.533731 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/291715a4-10ab-445f-84c9-51db14d2feb9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "291715a4-10ab-445f-84c9-51db14d2feb9" (UID: "291715a4-10ab-445f-84c9-51db14d2feb9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.625971 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/291715a4-10ab-445f-84c9-51db14d2feb9-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.791813 4747 generic.go:334] "Generic (PLEG): container finished" podID="291715a4-10ab-445f-84c9-51db14d2feb9" containerID="244141c675e469104b6659d3371b52713872a1b3443dd34bebe983a143372b75" exitCode=0
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.791879 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zp76h" event={"ID":"291715a4-10ab-445f-84c9-51db14d2feb9","Type":"ContainerDied","Data":"244141c675e469104b6659d3371b52713872a1b3443dd34bebe983a143372b75"}
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.791912 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zp76h" event={"ID":"291715a4-10ab-445f-84c9-51db14d2feb9","Type":"ContainerDied","Data":"ec63430a030112a5b98775318cd3a40b2f1de633b323c7786fbd1dd05bdc10fc"}
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.791931 4747 scope.go:117] "RemoveContainer" containerID="244141c675e469104b6659d3371b52713872a1b3443dd34bebe983a143372b75"
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.791945 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zp76h"
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.794412 4747 generic.go:334] "Generic (PLEG): container finished" podID="48607b00-5383-46cb-9a0e-50f1b1b2cb6c" containerID="ecd6b6af989a58fcf969fabaa59cac445b385a5ee906a698bc3c47ac254ebcaf" exitCode=137
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.794512 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d6bfb9dd6-f9xzd" event={"ID":"48607b00-5383-46cb-9a0e-50f1b1b2cb6c","Type":"ContainerDied","Data":"ecd6b6af989a58fcf969fabaa59cac445b385a5ee906a698bc3c47ac254ebcaf"}
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.848732 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zp76h"]
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.864326 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zp76h"]
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.864339 4747 scope.go:117] "RemoveContainer" containerID="7ac5eb3a772559765bebc2cd588ce1111de80ce3ae84801b27033adfa50f15f2"
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.913201 4747 scope.go:117] "RemoveContainer" containerID="6a4018b26e024d17e7bc0acfe6e0c23483e475030afe87483ee67449ed293987"
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.938610 4747 scope.go:117] "RemoveContainer" containerID="244141c675e469104b6659d3371b52713872a1b3443dd34bebe983a143372b75"
Dec 05 22:26:40 crc kubenswrapper[4747]: E1205 22:26:40.939200 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"244141c675e469104b6659d3371b52713872a1b3443dd34bebe983a143372b75\": container with ID starting with 244141c675e469104b6659d3371b52713872a1b3443dd34bebe983a143372b75 not found: ID does not exist" containerID="244141c675e469104b6659d3371b52713872a1b3443dd34bebe983a143372b75"
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.939346 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244141c675e469104b6659d3371b52713872a1b3443dd34bebe983a143372b75"} err="failed to get container status \"244141c675e469104b6659d3371b52713872a1b3443dd34bebe983a143372b75\": rpc error: code = NotFound desc = could not find container \"244141c675e469104b6659d3371b52713872a1b3443dd34bebe983a143372b75\": container with ID starting with 244141c675e469104b6659d3371b52713872a1b3443dd34bebe983a143372b75 not found: ID does not exist"
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.939433 4747 scope.go:117] "RemoveContainer" containerID="7ac5eb3a772559765bebc2cd588ce1111de80ce3ae84801b27033adfa50f15f2"
Dec 05 22:26:40 crc kubenswrapper[4747]: E1205 22:26:40.939956 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ac5eb3a772559765bebc2cd588ce1111de80ce3ae84801b27033adfa50f15f2\": container with ID starting with 7ac5eb3a772559765bebc2cd588ce1111de80ce3ae84801b27033adfa50f15f2 not found: ID does not exist" containerID="7ac5eb3a772559765bebc2cd588ce1111de80ce3ae84801b27033adfa50f15f2"
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.939994 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ac5eb3a772559765bebc2cd588ce1111de80ce3ae84801b27033adfa50f15f2"} err="failed to get container status \"7ac5eb3a772559765bebc2cd588ce1111de80ce3ae84801b27033adfa50f15f2\": rpc error: code = NotFound desc = could not find container \"7ac5eb3a772559765bebc2cd588ce1111de80ce3ae84801b27033adfa50f15f2\": container with ID starting with 7ac5eb3a772559765bebc2cd588ce1111de80ce3ae84801b27033adfa50f15f2 not found: ID does not exist"
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.940022 4747 scope.go:117] "RemoveContainer" containerID="6a4018b26e024d17e7bc0acfe6e0c23483e475030afe87483ee67449ed293987"
Dec 05 22:26:40 crc kubenswrapper[4747]: E1205 22:26:40.941725 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a4018b26e024d17e7bc0acfe6e0c23483e475030afe87483ee67449ed293987\": container with ID starting with 6a4018b26e024d17e7bc0acfe6e0c23483e475030afe87483ee67449ed293987 not found: ID does not exist" containerID="6a4018b26e024d17e7bc0acfe6e0c23483e475030afe87483ee67449ed293987"
Dec 05 22:26:40 crc kubenswrapper[4747]: I1205 22:26:40.941752 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a4018b26e024d17e7bc0acfe6e0c23483e475030afe87483ee67449ed293987"} err="failed to get container status \"6a4018b26e024d17e7bc0acfe6e0c23483e475030afe87483ee67449ed293987\": rpc error: code = NotFound desc = could not find container \"6a4018b26e024d17e7bc0acfe6e0c23483e475030afe87483ee67449ed293987\": container with ID starting with 6a4018b26e024d17e7bc0acfe6e0c23483e475030afe87483ee67449ed293987 not found: ID does not exist"
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.174431 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d6bfb9dd6-f9xzd"
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.344178 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-combined-ca-bundle\") pod \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") "
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.344445 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-config-data\") pod \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") "
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.344744 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-scripts\") pod \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") "
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.344871 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkkdn\" (UniqueName: \"kubernetes.io/projected/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-kube-api-access-xkkdn\") pod \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") "
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.345195 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-logs\") pod \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") "
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.345410 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-horizon-tls-certs\") pod \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") "
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.345633 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-horizon-secret-key\") pod \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\" (UID: \"48607b00-5383-46cb-9a0e-50f1b1b2cb6c\") "
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.346078 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-logs" (OuterVolumeSpecName: "logs") pod "48607b00-5383-46cb-9a0e-50f1b1b2cb6c" (UID: "48607b00-5383-46cb-9a0e-50f1b1b2cb6c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.346640 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-logs\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.351397 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "48607b00-5383-46cb-9a0e-50f1b1b2cb6c" (UID: "48607b00-5383-46cb-9a0e-50f1b1b2cb6c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.352275 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-kube-api-access-xkkdn" (OuterVolumeSpecName: "kube-api-access-xkkdn") pod "48607b00-5383-46cb-9a0e-50f1b1b2cb6c" (UID: "48607b00-5383-46cb-9a0e-50f1b1b2cb6c"). InnerVolumeSpecName "kube-api-access-xkkdn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.372218 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-scripts" (OuterVolumeSpecName: "scripts") pod "48607b00-5383-46cb-9a0e-50f1b1b2cb6c" (UID: "48607b00-5383-46cb-9a0e-50f1b1b2cb6c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.387735 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48607b00-5383-46cb-9a0e-50f1b1b2cb6c" (UID: "48607b00-5383-46cb-9a0e-50f1b1b2cb6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.399156 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "48607b00-5383-46cb-9a0e-50f1b1b2cb6c" (UID: "48607b00-5383-46cb-9a0e-50f1b1b2cb6c"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.404409 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-config-data" (OuterVolumeSpecName: "config-data") pod "48607b00-5383-46cb-9a0e-50f1b1b2cb6c" (UID: "48607b00-5383-46cb-9a0e-50f1b1b2cb6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.448624 4747 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.448886 4747 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.448986 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.449048 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.449118 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.449168 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkkdn\" (UniqueName: \"kubernetes.io/projected/48607b00-5383-46cb-9a0e-50f1b1b2cb6c-kube-api-access-xkkdn\") on node \"crc\" DevicePath \"\""
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.809832 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d6bfb9dd6-f9xzd" event={"ID":"48607b00-5383-46cb-9a0e-50f1b1b2cb6c","Type":"ContainerDied","Data":"f254cf4b67254eadf16eebc84673b58722dfe28443cc5d394160ce6939c6ed13"}
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.810719 4747 scope.go:117] "RemoveContainer" containerID="cca07c67efbe86f856864731c3e7e175e1af7b9e35689055628e804307812f13"
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.810884 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d6bfb9dd6-f9xzd"
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.865771 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="291715a4-10ab-445f-84c9-51db14d2feb9" path="/var/lib/kubelet/pods/291715a4-10ab-445f-84c9-51db14d2feb9/volumes"
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.866684 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d6bfb9dd6-f9xzd"]
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.866773 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6d6bfb9dd6-f9xzd"]
Dec 05 22:26:41 crc kubenswrapper[4747]: I1205 22:26:41.998399 4747 scope.go:117] "RemoveContainer" containerID="ecd6b6af989a58fcf969fabaa59cac445b385a5ee906a698bc3c47ac254ebcaf"
Dec 05 22:26:43 crc kubenswrapper[4747]: I1205 22:26:43.855675 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48607b00-5383-46cb-9a0e-50f1b1b2cb6c" path="/var/lib/kubelet/pods/48607b00-5383-46cb-9a0e-50f1b1b2cb6c/volumes"
Dec 05 22:26:45 crc kubenswrapper[4747]: I1205 22:26:45.353561 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:45 crc kubenswrapper[4747]: I1205 22:26:45.353913 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:45 crc kubenswrapper[4747]: I1205 22:26:45.402419 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:45 crc kubenswrapper[4747]: I1205 22:26:45.914507 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:47 crc kubenswrapper[4747]: I1205 22:26:47.780729 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bc4sc"]
Dec 05 22:26:47 crc kubenswrapper[4747]: I1205 22:26:47.878009 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bc4sc" podUID="da62e916-214e-41a2-b28d-63f45ed22d9f" containerName="registry-server" containerID="cri-o://69ff4c3c1d6ae13694e25512ab42f7c5ffa905e7abcbec76e2a962f659cb0102" gracePeriod=2
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.244144 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx"]
Dec 05 22:26:48 crc kubenswrapper[4747]: E1205 22:26:48.245183 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291715a4-10ab-445f-84c9-51db14d2feb9" containerName="extract-utilities"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.245208 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="291715a4-10ab-445f-84c9-51db14d2feb9" containerName="extract-utilities"
Dec 05 22:26:48 crc kubenswrapper[4747]: E1205 22:26:48.245228 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48607b00-5383-46cb-9a0e-50f1b1b2cb6c" containerName="horizon-log"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.245238 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="48607b00-5383-46cb-9a0e-50f1b1b2cb6c" containerName="horizon-log"
Dec 05 22:26:48 crc kubenswrapper[4747]: E1205 22:26:48.245256 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291715a4-10ab-445f-84c9-51db14d2feb9" containerName="extract-content"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.245266 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="291715a4-10ab-445f-84c9-51db14d2feb9" containerName="extract-content"
Dec 05 22:26:48 crc kubenswrapper[4747]: E1205 22:26:48.245278 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48607b00-5383-46cb-9a0e-50f1b1b2cb6c" containerName="horizon"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.245286 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="48607b00-5383-46cb-9a0e-50f1b1b2cb6c" containerName="horizon"
Dec 05 22:26:48 crc kubenswrapper[4747]: E1205 22:26:48.245323 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291715a4-10ab-445f-84c9-51db14d2feb9" containerName="registry-server"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.245333 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="291715a4-10ab-445f-84c9-51db14d2feb9" containerName="registry-server"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.245619 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="48607b00-5383-46cb-9a0e-50f1b1b2cb6c" containerName="horizon"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.245645 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="291715a4-10ab-445f-84c9-51db14d2feb9" containerName="registry-server"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.245664 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="48607b00-5383-46cb-9a0e-50f1b1b2cb6c" containerName="horizon-log"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.247713 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.251388 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.254279 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx"]
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.361076 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bc4sc"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.420819 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97tn7\" (UniqueName: \"kubernetes.io/projected/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-kube-api-access-97tn7\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx\" (UID: \"d96ced6d-96bd-4f59-af54-33aaba6b3b0a\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.421085 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx\" (UID: \"d96ced6d-96bd-4f59-af54-33aaba6b3b0a\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.421146 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx\" (UID: \"d96ced6d-96bd-4f59-af54-33aaba6b3b0a\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.524011 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmk9f\" (UniqueName: \"kubernetes.io/projected/da62e916-214e-41a2-b28d-63f45ed22d9f-kube-api-access-wmk9f\") pod \"da62e916-214e-41a2-b28d-63f45ed22d9f\" (UID: \"da62e916-214e-41a2-b28d-63f45ed22d9f\") "
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.524096 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da62e916-214e-41a2-b28d-63f45ed22d9f-utilities\") pod \"da62e916-214e-41a2-b28d-63f45ed22d9f\" (UID: \"da62e916-214e-41a2-b28d-63f45ed22d9f\") "
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.524315 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da62e916-214e-41a2-b28d-63f45ed22d9f-catalog-content\") pod \"da62e916-214e-41a2-b28d-63f45ed22d9f\" (UID: \"da62e916-214e-41a2-b28d-63f45ed22d9f\") "
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.524821 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97tn7\" (UniqueName: \"kubernetes.io/projected/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-kube-api-access-97tn7\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx\" (UID: \"d96ced6d-96bd-4f59-af54-33aaba6b3b0a\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx"
Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.525021 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx\" (UID: \"d96ced6d-96bd-4f59-af54-33aaba6b3b0a\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx"
Dec 05 22:26:48 crc
kubenswrapper[4747]: I1205 22:26:48.525068 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx\" (UID: \"d96ced6d-96bd-4f59-af54-33aaba6b3b0a\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.525278 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da62e916-214e-41a2-b28d-63f45ed22d9f-utilities" (OuterVolumeSpecName: "utilities") pod "da62e916-214e-41a2-b28d-63f45ed22d9f" (UID: "da62e916-214e-41a2-b28d-63f45ed22d9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.525539 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx\" (UID: \"d96ced6d-96bd-4f59-af54-33aaba6b3b0a\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.525805 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx\" (UID: \"d96ced6d-96bd-4f59-af54-33aaba6b3b0a\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.534720 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da62e916-214e-41a2-b28d-63f45ed22d9f-kube-api-access-wmk9f" (OuterVolumeSpecName: "kube-api-access-wmk9f") pod "da62e916-214e-41a2-b28d-63f45ed22d9f" (UID: "da62e916-214e-41a2-b28d-63f45ed22d9f"). InnerVolumeSpecName "kube-api-access-wmk9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.546116 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97tn7\" (UniqueName: \"kubernetes.io/projected/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-kube-api-access-97tn7\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx\" (UID: \"d96ced6d-96bd-4f59-af54-33aaba6b3b0a\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.627926 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmk9f\" (UniqueName: \"kubernetes.io/projected/da62e916-214e-41a2-b28d-63f45ed22d9f-kube-api-access-wmk9f\") on node \"crc\" DevicePath \"\"" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.628163 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da62e916-214e-41a2-b28d-63f45ed22d9f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.659993 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.807138 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da62e916-214e-41a2-b28d-63f45ed22d9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da62e916-214e-41a2-b28d-63f45ed22d9f" (UID: "da62e916-214e-41a2-b28d-63f45ed22d9f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.835164 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da62e916-214e-41a2-b28d-63f45ed22d9f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.889498 4747 generic.go:334] "Generic (PLEG): container finished" podID="da62e916-214e-41a2-b28d-63f45ed22d9f" containerID="69ff4c3c1d6ae13694e25512ab42f7c5ffa905e7abcbec76e2a962f659cb0102" exitCode=0 Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.889541 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bc4sc" event={"ID":"da62e916-214e-41a2-b28d-63f45ed22d9f","Type":"ContainerDied","Data":"69ff4c3c1d6ae13694e25512ab42f7c5ffa905e7abcbec76e2a962f659cb0102"} Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.889568 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bc4sc" event={"ID":"da62e916-214e-41a2-b28d-63f45ed22d9f","Type":"ContainerDied","Data":"a317fdd1769a05af169b760747f68e1295f6fdf533544eb8d2545312e4b0a722"} Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.889598 4747 scope.go:117] "RemoveContainer" containerID="69ff4c3c1d6ae13694e25512ab42f7c5ffa905e7abcbec76e2a962f659cb0102" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.889622 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bc4sc" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.914412 4747 scope.go:117] "RemoveContainer" containerID="808ba8fe64a3cd55c74c74783830be858fda2b13ac04e627534dd61f1b4f873c" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.933602 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bc4sc"] Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.941273 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bc4sc"] Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.955649 4747 scope.go:117] "RemoveContainer" containerID="7872032c1037c9f953fd47c7ba40c53ed465b3c047aa87889c808a3469ec60ed" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.977156 4747 scope.go:117] "RemoveContainer" containerID="69ff4c3c1d6ae13694e25512ab42f7c5ffa905e7abcbec76e2a962f659cb0102" Dec 05 22:26:48 crc kubenswrapper[4747]: E1205 22:26:48.977695 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69ff4c3c1d6ae13694e25512ab42f7c5ffa905e7abcbec76e2a962f659cb0102\": container with ID starting with 69ff4c3c1d6ae13694e25512ab42f7c5ffa905e7abcbec76e2a962f659cb0102 not found: ID does not exist" containerID="69ff4c3c1d6ae13694e25512ab42f7c5ffa905e7abcbec76e2a962f659cb0102" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.977774 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69ff4c3c1d6ae13694e25512ab42f7c5ffa905e7abcbec76e2a962f659cb0102"} err="failed to get container status \"69ff4c3c1d6ae13694e25512ab42f7c5ffa905e7abcbec76e2a962f659cb0102\": rpc error: code = NotFound desc = could not find container \"69ff4c3c1d6ae13694e25512ab42f7c5ffa905e7abcbec76e2a962f659cb0102\": container with ID starting with 69ff4c3c1d6ae13694e25512ab42f7c5ffa905e7abcbec76e2a962f659cb0102 not found: ID does not exist" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.977816 4747 scope.go:117] "RemoveContainer" containerID="808ba8fe64a3cd55c74c74783830be858fda2b13ac04e627534dd61f1b4f873c" Dec 05 22:26:48 crc kubenswrapper[4747]: E1205 22:26:48.978153 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"808ba8fe64a3cd55c74c74783830be858fda2b13ac04e627534dd61f1b4f873c\": container with ID starting with 808ba8fe64a3cd55c74c74783830be858fda2b13ac04e627534dd61f1b4f873c not found: ID does not exist" containerID="808ba8fe64a3cd55c74c74783830be858fda2b13ac04e627534dd61f1b4f873c" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.978175 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"808ba8fe64a3cd55c74c74783830be858fda2b13ac04e627534dd61f1b4f873c"} err="failed to get container status \"808ba8fe64a3cd55c74c74783830be858fda2b13ac04e627534dd61f1b4f873c\": rpc error: code = NotFound desc = could not find container \"808ba8fe64a3cd55c74c74783830be858fda2b13ac04e627534dd61f1b4f873c\": container with ID starting with 808ba8fe64a3cd55c74c74783830be858fda2b13ac04e627534dd61f1b4f873c not found: ID does not exist" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.978189 4747 scope.go:117] "RemoveContainer" containerID="7872032c1037c9f953fd47c7ba40c53ed465b3c047aa87889c808a3469ec60ed" Dec 05 22:26:48 crc kubenswrapper[4747]: E1205 22:26:48.978654 4747 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7872032c1037c9f953fd47c7ba40c53ed465b3c047aa87889c808a3469ec60ed\": container with ID starting with 7872032c1037c9f953fd47c7ba40c53ed465b3c047aa87889c808a3469ec60ed not found: ID does not exist" containerID="7872032c1037c9f953fd47c7ba40c53ed465b3c047aa87889c808a3469ec60ed" Dec 05 22:26:48 crc kubenswrapper[4747]: I1205 22:26:48.978676 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7872032c1037c9f953fd47c7ba40c53ed465b3c047aa87889c808a3469ec60ed"} err="failed to get container status \"7872032c1037c9f953fd47c7ba40c53ed465b3c047aa87889c808a3469ec60ed\": rpc error: code = NotFound desc = could not find container \"7872032c1037c9f953fd47c7ba40c53ed465b3c047aa87889c808a3469ec60ed\": container with ID starting with 7872032c1037c9f953fd47c7ba40c53ed465b3c047aa87889c808a3469ec60ed not found: ID does not exist" Dec 05 22:26:49 crc kubenswrapper[4747]: I1205 22:26:49.172028 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx"] Dec 05 22:26:49 crc kubenswrapper[4747]: W1205 22:26:49.173961 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd96ced6d_96bd_4f59_af54_33aaba6b3b0a.slice/crio-dff38286e09951b4886017b41f8667e3311c6e0a294af5fbd69b1cbc956e1cbd WatchSource:0}: Error finding container dff38286e09951b4886017b41f8667e3311c6e0a294af5fbd69b1cbc956e1cbd: Status 404 returned error can't find the container with id dff38286e09951b4886017b41f8667e3311c6e0a294af5fbd69b1cbc956e1cbd Dec 05 22:26:49 crc kubenswrapper[4747]: I1205 22:26:49.869303 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da62e916-214e-41a2-b28d-63f45ed22d9f" path="/var/lib/kubelet/pods/da62e916-214e-41a2-b28d-63f45ed22d9f/volumes" Dec 05 22:26:49 crc kubenswrapper[4747]: I1205 22:26:49.901457 4747 generic.go:334] "Generic (PLEG): container finished" podID="d96ced6d-96bd-4f59-af54-33aaba6b3b0a" containerID="7ffc2ca0ca7dd8797a66a37d05f3d87f816c35505553e409e00a7f1690480384" exitCode=0 Dec 05 22:26:49 crc kubenswrapper[4747]: I1205 22:26:49.901492 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx" event={"ID":"d96ced6d-96bd-4f59-af54-33aaba6b3b0a","Type":"ContainerDied","Data":"7ffc2ca0ca7dd8797a66a37d05f3d87f816c35505553e409e00a7f1690480384"} Dec 05 22:26:49 crc kubenswrapper[4747]: I1205 22:26:49.901532 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx" event={"ID":"d96ced6d-96bd-4f59-af54-33aaba6b3b0a","Type":"ContainerStarted","Data":"dff38286e09951b4886017b41f8667e3311c6e0a294af5fbd69b1cbc956e1cbd"} Dec 05 22:26:51 crc kubenswrapper[4747]: I1205 22:26:51.929850 4747 generic.go:334] "Generic (PLEG): container finished" podID="d96ced6d-96bd-4f59-af54-33aaba6b3b0a" containerID="4d5e82e1aab4f16155fa98f55742ee8fdf280a71908448c4084d45be37867334" exitCode=0 Dec 05 22:26:51 crc kubenswrapper[4747]: I1205 22:26:51.930007 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx" event={"ID":"d96ced6d-96bd-4f59-af54-33aaba6b3b0a","Type":"ContainerDied","Data":"4d5e82e1aab4f16155fa98f55742ee8fdf280a71908448c4084d45be37867334"} Dec 05 22:26:52 crc 
kubenswrapper[4747]: I1205 22:26:52.943697 4747 generic.go:334] "Generic (PLEG): container finished" podID="d96ced6d-96bd-4f59-af54-33aaba6b3b0a" containerID="7a77a56489dd3099c121d585943a372bebdcda61ba008e3e852e58af7e9f3dd2" exitCode=0 Dec 05 22:26:52 crc kubenswrapper[4747]: I1205 22:26:52.943740 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx" event={"ID":"d96ced6d-96bd-4f59-af54-33aaba6b3b0a","Type":"ContainerDied","Data":"7a77a56489dd3099c121d585943a372bebdcda61ba008e3e852e58af7e9f3dd2"} Dec 05 22:26:54 crc kubenswrapper[4747]: I1205 22:26:54.286083 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx" Dec 05 22:26:54 crc kubenswrapper[4747]: I1205 22:26:54.465298 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-bundle\") pod \"d96ced6d-96bd-4f59-af54-33aaba6b3b0a\" (UID: \"d96ced6d-96bd-4f59-af54-33aaba6b3b0a\") " Dec 05 22:26:54 crc kubenswrapper[4747]: I1205 22:26:54.466127 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-util\") pod \"d96ced6d-96bd-4f59-af54-33aaba6b3b0a\" (UID: \"d96ced6d-96bd-4f59-af54-33aaba6b3b0a\") " Dec 05 22:26:54 crc kubenswrapper[4747]: I1205 22:26:54.466236 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97tn7\" (UniqueName: \"kubernetes.io/projected/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-kube-api-access-97tn7\") pod \"d96ced6d-96bd-4f59-af54-33aaba6b3b0a\" (UID: \"d96ced6d-96bd-4f59-af54-33aaba6b3b0a\") " Dec 05 22:26:54 crc kubenswrapper[4747]: I1205 22:26:54.468021 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-bundle" (OuterVolumeSpecName: "bundle") pod "d96ced6d-96bd-4f59-af54-33aaba6b3b0a" (UID: "d96ced6d-96bd-4f59-af54-33aaba6b3b0a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:26:54 crc kubenswrapper[4747]: I1205 22:26:54.472214 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-kube-api-access-97tn7" (OuterVolumeSpecName: "kube-api-access-97tn7") pod "d96ced6d-96bd-4f59-af54-33aaba6b3b0a" (UID: "d96ced6d-96bd-4f59-af54-33aaba6b3b0a"). InnerVolumeSpecName "kube-api-access-97tn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:26:54 crc kubenswrapper[4747]: I1205 22:26:54.518819 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-util" (OuterVolumeSpecName: "util") pod "d96ced6d-96bd-4f59-af54-33aaba6b3b0a" (UID: "d96ced6d-96bd-4f59-af54-33aaba6b3b0a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:26:54 crc kubenswrapper[4747]: I1205 22:26:54.568782 4747 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-util\") on node \"crc\" DevicePath \"\"" Dec 05 22:26:54 crc kubenswrapper[4747]: I1205 22:26:54.568821 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97tn7\" (UniqueName: \"kubernetes.io/projected/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-kube-api-access-97tn7\") on node \"crc\" DevicePath \"\"" Dec 05 22:26:54 crc kubenswrapper[4747]: I1205 22:26:54.568837 4747 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d96ced6d-96bd-4f59-af54-33aaba6b3b0a-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:26:54 crc kubenswrapper[4747]: I1205 22:26:54.974009 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx" event={"ID":"d96ced6d-96bd-4f59-af54-33aaba6b3b0a","Type":"ContainerDied","Data":"dff38286e09951b4886017b41f8667e3311c6e0a294af5fbd69b1cbc956e1cbd"} Dec 05 22:26:54 crc kubenswrapper[4747]: I1205 22:26:54.974672 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dff38286e09951b4886017b41f8667e3311c6e0a294af5fbd69b1cbc956e1cbd" Dec 05 22:26:54 crc kubenswrapper[4747]: I1205 22:26:54.974151 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.219356 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z55lk"] Dec 05 22:26:58 crc kubenswrapper[4747]: E1205 22:26:58.219973 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d96ced6d-96bd-4f59-af54-33aaba6b3b0a" containerName="util" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.219985 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96ced6d-96bd-4f59-af54-33aaba6b3b0a" containerName="util" Dec 05 22:26:58 crc kubenswrapper[4747]: E1205 22:26:58.220005 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da62e916-214e-41a2-b28d-63f45ed22d9f" containerName="extract-content" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.220011 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="da62e916-214e-41a2-b28d-63f45ed22d9f" containerName="extract-content" Dec 05 22:26:58 crc kubenswrapper[4747]: E1205 22:26:58.220035 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d96ced6d-96bd-4f59-af54-33aaba6b3b0a" containerName="pull" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.220041 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96ced6d-96bd-4f59-af54-33aaba6b3b0a" containerName="pull" Dec 05 22:26:58 crc kubenswrapper[4747]: E1205 22:26:58.220054 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d96ced6d-96bd-4f59-af54-33aaba6b3b0a" containerName="extract" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.220060 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96ced6d-96bd-4f59-af54-33aaba6b3b0a" containerName="extract" Dec 05 22:26:58 crc kubenswrapper[4747]: E1205 22:26:58.220071 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da62e916-214e-41a2-b28d-63f45ed22d9f" containerName="registry-server" 
Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.220077 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="da62e916-214e-41a2-b28d-63f45ed22d9f" containerName="registry-server" Dec 05 22:26:58 crc kubenswrapper[4747]: E1205 22:26:58.220084 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da62e916-214e-41a2-b28d-63f45ed22d9f" containerName="extract-utilities" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.220090 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="da62e916-214e-41a2-b28d-63f45ed22d9f" containerName="extract-utilities" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.220280 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="da62e916-214e-41a2-b28d-63f45ed22d9f" containerName="registry-server" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.220294 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d96ced6d-96bd-4f59-af54-33aaba6b3b0a" containerName="extract" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.221729 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.244150 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z55lk"] Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.371901 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58beac16-db0e-48b0-bb65-35dcc76bdb56-utilities\") pod \"redhat-marketplace-z55lk\" (UID: \"58beac16-db0e-48b0-bb65-35dcc76bdb56\") " pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.372005 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29pr8\" (UniqueName: \"kubernetes.io/projected/58beac16-db0e-48b0-bb65-35dcc76bdb56-kube-api-access-29pr8\") pod \"redhat-marketplace-z55lk\" (UID: \"58beac16-db0e-48b0-bb65-35dcc76bdb56\") " pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.372032 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58beac16-db0e-48b0-bb65-35dcc76bdb56-catalog-content\") pod \"redhat-marketplace-z55lk\" (UID: \"58beac16-db0e-48b0-bb65-35dcc76bdb56\") " pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.474493 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29pr8\" (UniqueName: \"kubernetes.io/projected/58beac16-db0e-48b0-bb65-35dcc76bdb56-kube-api-access-29pr8\") pod \"redhat-marketplace-z55lk\" (UID: \"58beac16-db0e-48b0-bb65-35dcc76bdb56\") " pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.474554 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58beac16-db0e-48b0-bb65-35dcc76bdb56-catalog-content\") pod \"redhat-marketplace-z55lk\" (UID: \"58beac16-db0e-48b0-bb65-35dcc76bdb56\") " pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.474740 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58beac16-db0e-48b0-bb65-35dcc76bdb56-utilities\") pod \"redhat-marketplace-z55lk\" (UID: \"58beac16-db0e-48b0-bb65-35dcc76bdb56\") " pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.475332 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58beac16-db0e-48b0-bb65-35dcc76bdb56-utilities\") pod \"redhat-marketplace-z55lk\" (UID: \"58beac16-db0e-48b0-bb65-35dcc76bdb56\") " pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.475951 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58beac16-db0e-48b0-bb65-35dcc76bdb56-catalog-content\") pod \"redhat-marketplace-z55lk\" (UID: \"58beac16-db0e-48b0-bb65-35dcc76bdb56\") " pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.499969 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29pr8\" (UniqueName: \"kubernetes.io/projected/58beac16-db0e-48b0-bb65-35dcc76bdb56-kube-api-access-29pr8\") pod \"redhat-marketplace-z55lk\" (UID: \"58beac16-db0e-48b0-bb65-35dcc76bdb56\") " pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:26:58 crc kubenswrapper[4747]: I1205 22:26:58.540017 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:26:59 crc kubenswrapper[4747]: I1205 22:26:59.077928 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z55lk"] Dec 05 22:27:00 crc kubenswrapper[4747]: I1205 22:27:00.062666 4747 generic.go:334] "Generic (PLEG): container finished" podID="58beac16-db0e-48b0-bb65-35dcc76bdb56" containerID="ac8ce187dcd86950b13155d29e9aa20480a16d2ff526f8e74c2ed3e7ff206e23" exitCode=0 Dec 05 22:27:00 crc kubenswrapper[4747]: I1205 22:27:00.062706 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z55lk" event={"ID":"58beac16-db0e-48b0-bb65-35dcc76bdb56","Type":"ContainerDied","Data":"ac8ce187dcd86950b13155d29e9aa20480a16d2ff526f8e74c2ed3e7ff206e23"} Dec 05 22:27:00 crc kubenswrapper[4747]: I1205 22:27:00.062729 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z55lk" event={"ID":"58beac16-db0e-48b0-bb65-35dcc76bdb56","Type":"ContainerStarted","Data":"130b4664055d2ed04184a4d702d13e34b6bd6255a119ca48f103479300221b69"} Dec 05 22:27:01 crc kubenswrapper[4747]: I1205 22:27:01.072028 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z55lk" event={"ID":"58beac16-db0e-48b0-bb65-35dcc76bdb56","Type":"ContainerStarted","Data":"d33b73ee79f42b350a1eb2c22c7bcc97280db04fb0eb29080cc5f269dce604be"} Dec 05 22:27:02 crc kubenswrapper[4747]: I1205 22:27:02.081564 4747 generic.go:334] "Generic (PLEG): container finished" podID="58beac16-db0e-48b0-bb65-35dcc76bdb56" containerID="d33b73ee79f42b350a1eb2c22c7bcc97280db04fb0eb29080cc5f269dce604be" exitCode=0 Dec 05 22:27:02 crc kubenswrapper[4747]: I1205 22:27:02.081649 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z55lk" 
event={"ID":"58beac16-db0e-48b0-bb65-35dcc76bdb56","Type":"ContainerDied","Data":"d33b73ee79f42b350a1eb2c22c7bcc97280db04fb0eb29080cc5f269dce604be"} Dec 05 22:27:03 crc kubenswrapper[4747]: I1205 22:27:03.112734 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z55lk" event={"ID":"58beac16-db0e-48b0-bb65-35dcc76bdb56","Type":"ContainerStarted","Data":"6229116f664fd1157a0802becd91780903adf91cdc339ed46dfa1359a4b95774"} Dec 05 22:27:03 crc kubenswrapper[4747]: I1205 22:27:03.134497 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z55lk" podStartSLOduration=2.617589818 podStartE2EDuration="5.134479918s" podCreationTimestamp="2025-12-05 22:26:58 +0000 UTC" firstStartedPulling="2025-12-05 22:27:00.064301986 +0000 UTC m=+6290.531609474" lastFinishedPulling="2025-12-05 22:27:02.581192076 +0000 UTC m=+6293.048499574" observedRunningTime="2025-12-05 22:27:03.133933174 +0000 UTC m=+6293.601240662" watchObservedRunningTime="2025-12-05 22:27:03.134479918 +0000 UTC m=+6293.601787406" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.031527 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-8s8c6"] Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.033176 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8s8c6" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.035573 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.036261 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.037880 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-8t6j5" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.049630 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-8s8c6"] Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.097365 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6bz9\" (UniqueName: \"kubernetes.io/projected/6a157993-4e7c-4c79-a126-ffd16308cb25-kube-api-access-f6bz9\") pod \"obo-prometheus-operator-668cf9dfbb-8s8c6\" (UID: \"6a157993-4e7c-4c79-a126-ffd16308cb25\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8s8c6" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.172441 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f"] Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.174082 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.179760 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-529l5" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.181065 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.184502 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f"] Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.195248 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww"] Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.196527 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.200973 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6bz9\" (UniqueName: \"kubernetes.io/projected/6a157993-4e7c-4c79-a126-ffd16308cb25-kube-api-access-f6bz9\") pod \"obo-prometheus-operator-668cf9dfbb-8s8c6\" (UID: \"6a157993-4e7c-4c79-a126-ffd16308cb25\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8s8c6" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.204036 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww"] Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.238293 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6bz9\" (UniqueName: \"kubernetes.io/projected/6a157993-4e7c-4c79-a126-ffd16308cb25-kube-api-access-f6bz9\") pod \"obo-prometheus-operator-668cf9dfbb-8s8c6\" (UID: \"6a157993-4e7c-4c79-a126-ffd16308cb25\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8s8c6" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.303615 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/15183649-4b35-40ef-a6b7-2b1c786246b9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f\" (UID: \"15183649-4b35-40ef-a6b7-2b1c786246b9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.303687 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d44ad8b8-f155-4a3b-a7cf-27aa9951dfd3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww\" (UID: \"d44ad8b8-f155-4a3b-a7cf-27aa9951dfd3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.303725 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d44ad8b8-f155-4a3b-a7cf-27aa9951dfd3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww\" (UID: 
\"d44ad8b8-f155-4a3b-a7cf-27aa9951dfd3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.303773 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/15183649-4b35-40ef-a6b7-2b1c786246b9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f\" (UID: \"15183649-4b35-40ef-a6b7-2b1c786246b9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.308894 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-5vf5m"] Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.310762 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-5vf5m" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.313150 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-8qbp6" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.314056 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.325855 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-5vf5m"] Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.407200 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/15183649-4b35-40ef-a6b7-2b1c786246b9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f\" (UID: \"15183649-4b35-40ef-a6b7-2b1c786246b9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.407281 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d44ad8b8-f155-4a3b-a7cf-27aa9951dfd3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww\" (UID: \"d44ad8b8-f155-4a3b-a7cf-27aa9951dfd3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.407313 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d44ad8b8-f155-4a3b-a7cf-27aa9951dfd3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww\" (UID: \"d44ad8b8-f155-4a3b-a7cf-27aa9951dfd3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.407366 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/15183649-4b35-40ef-a6b7-2b1c786246b9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f\" (UID: \"15183649-4b35-40ef-a6b7-2b1c786246b9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.407418 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg25m\" 
(UniqueName: \"kubernetes.io/projected/67f34afc-ce90-42ca-8599-60e931bb3868-kube-api-access-tg25m\") pod \"observability-operator-d8bb48f5d-5vf5m\" (UID: \"67f34afc-ce90-42ca-8599-60e931bb3868\") " pod="openshift-operators/observability-operator-d8bb48f5d-5vf5m" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.407738 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/67f34afc-ce90-42ca-8599-60e931bb3868-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-5vf5m\" (UID: \"67f34afc-ce90-42ca-8599-60e931bb3868\") " pod="openshift-operators/observability-operator-d8bb48f5d-5vf5m" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.415596 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8s8c6" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.425530 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/15183649-4b35-40ef-a6b7-2b1c786246b9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f\" (UID: \"15183649-4b35-40ef-a6b7-2b1c786246b9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.433802 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/15183649-4b35-40ef-a6b7-2b1c786246b9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f\" (UID: \"15183649-4b35-40ef-a6b7-2b1c786246b9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.434311 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d44ad8b8-f155-4a3b-a7cf-27aa9951dfd3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww\" (UID: \"d44ad8b8-f155-4a3b-a7cf-27aa9951dfd3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.438180 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d44ad8b8-f155-4a3b-a7cf-27aa9951dfd3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww\" (UID: \"d44ad8b8-f155-4a3b-a7cf-27aa9951dfd3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.493639 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-sp22q"] Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.495072 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-sp22q" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.500125 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.506053 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-f5s4p" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.510800 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/67f34afc-ce90-42ca-8599-60e931bb3868-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-5vf5m\" (UID: \"67f34afc-ce90-42ca-8599-60e931bb3868\") " pod="openshift-operators/observability-operator-d8bb48f5d-5vf5m" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.511691 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg25m\" (UniqueName: \"kubernetes.io/projected/67f34afc-ce90-42ca-8599-60e931bb3868-kube-api-access-tg25m\") pod \"observability-operator-d8bb48f5d-5vf5m\" (UID: \"67f34afc-ce90-42ca-8599-60e931bb3868\") " pod="openshift-operators/observability-operator-d8bb48f5d-5vf5m" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.519879 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/67f34afc-ce90-42ca-8599-60e931bb3868-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-5vf5m\" (UID: \"67f34afc-ce90-42ca-8599-60e931bb3868\") " pod="openshift-operators/observability-operator-d8bb48f5d-5vf5m" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.532099 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-sp22q"] Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.539132 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg25m\" (UniqueName: \"kubernetes.io/projected/67f34afc-ce90-42ca-8599-60e931bb3868-kube-api-access-tg25m\") pod \"observability-operator-d8bb48f5d-5vf5m\" (UID: \"67f34afc-ce90-42ca-8599-60e931bb3868\") " pod="openshift-operators/observability-operator-d8bb48f5d-5vf5m" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.576064 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.615081 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/29cb108e-f1f3-4cbf-8a59-4b90ee5b1f01-openshift-service-ca\") pod \"perses-operator-5446b9c989-sp22q\" (UID: \"29cb108e-f1f3-4cbf-8a59-4b90ee5b1f01\") " pod="openshift-operators/perses-operator-5446b9c989-sp22q" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.615419 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f447q\" (UniqueName: \"kubernetes.io/projected/29cb108e-f1f3-4cbf-8a59-4b90ee5b1f01-kube-api-access-f447q\") pod \"perses-operator-5446b9c989-sp22q\" (UID: \"29cb108e-f1f3-4cbf-8a59-4b90ee5b1f01\") " pod="openshift-operators/perses-operator-5446b9c989-sp22q" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.637065 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-5vf5m" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.719442 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/29cb108e-f1f3-4cbf-8a59-4b90ee5b1f01-openshift-service-ca\") pod \"perses-operator-5446b9c989-sp22q\" (UID: \"29cb108e-f1f3-4cbf-8a59-4b90ee5b1f01\") " pod="openshift-operators/perses-operator-5446b9c989-sp22q" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.719517 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f447q\" (UniqueName: \"kubernetes.io/projected/29cb108e-f1f3-4cbf-8a59-4b90ee5b1f01-kube-api-access-f447q\") pod \"perses-operator-5446b9c989-sp22q\" (UID: \"29cb108e-f1f3-4cbf-8a59-4b90ee5b1f01\") " pod="openshift-operators/perses-operator-5446b9c989-sp22q" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.720779 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/29cb108e-f1f3-4cbf-8a59-4b90ee5b1f01-openshift-service-ca\") pod \"perses-operator-5446b9c989-sp22q\" (UID: \"29cb108e-f1f3-4cbf-8a59-4b90ee5b1f01\") " pod="openshift-operators/perses-operator-5446b9c989-sp22q" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.767361 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f447q\" (UniqueName: \"kubernetes.io/projected/29cb108e-f1f3-4cbf-8a59-4b90ee5b1f01-kube-api-access-f447q\") pod \"perses-operator-5446b9c989-sp22q\" (UID: \"29cb108e-f1f3-4cbf-8a59-4b90ee5b1f01\") " pod="openshift-operators/perses-operator-5446b9c989-sp22q" Dec 05 22:27:04 crc kubenswrapper[4747]: I1205 22:27:04.982068 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-sp22q" Dec 05 22:27:05 crc kubenswrapper[4747]: I1205 22:27:05.126847 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww"] Dec 05 22:27:05 crc kubenswrapper[4747]: I1205 22:27:05.139090 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-8s8c6"] Dec 05 22:27:05 crc kubenswrapper[4747]: I1205 22:27:05.156405 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f"] Dec 05 22:27:05 crc kubenswrapper[4747]: I1205 22:27:05.459635 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-5vf5m"] Dec 05 22:27:05 crc kubenswrapper[4747]: I1205 22:27:05.491340 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-sp22q"] Dec 05 22:27:06 crc kubenswrapper[4747]: I1205 22:27:06.143183 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f" event={"ID":"15183649-4b35-40ef-a6b7-2b1c786246b9","Type":"ContainerStarted","Data":"8657c3966989a8b8e9e31f1524cb4850f695e22b81da00fc2d7f25c162e808ce"} Dec 05 22:27:06 crc kubenswrapper[4747]: I1205 22:27:06.145158 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-5vf5m" event={"ID":"67f34afc-ce90-42ca-8599-60e931bb3868","Type":"ContainerStarted","Data":"f8495edf080c31f6f63ddadc53f2fbeffd27f60cb8af6b00d3650e6cfcc0a75b"} Dec 05 22:27:06 crc kubenswrapper[4747]: I1205 22:27:06.146290 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww" event={"ID":"d44ad8b8-f155-4a3b-a7cf-27aa9951dfd3","Type":"ContainerStarted","Data":"b73d7963779f6ecd34e50c8e9120ba05fe0790ff1d00ef2dd9d741807fd55217"} Dec 05 22:27:06 crc kubenswrapper[4747]: I1205 22:27:06.147787 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8s8c6" event={"ID":"6a157993-4e7c-4c79-a126-ffd16308cb25","Type":"ContainerStarted","Data":"01f3697be7e1262cb75987fa0b8c7154f8e17ee592c238ad97e9ff4b6176083a"} Dec 05 22:27:06 crc kubenswrapper[4747]: I1205 22:27:06.148949 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-sp22q" event={"ID":"29cb108e-f1f3-4cbf-8a59-4b90ee5b1f01","Type":"ContainerStarted","Data":"0402a90595cbf757c39e17e933b2aeba7b3e671e50f806471a4e95efa99ccec3"} Dec 05 22:27:08 crc kubenswrapper[4747]: I1205 22:27:08.542811 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:27:08 crc kubenswrapper[4747]: I1205 22:27:08.543301 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:27:08 crc kubenswrapper[4747]: I1205 22:27:08.603313 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:27:09 crc kubenswrapper[4747]: I1205 22:27:09.292503 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:27:10 crc kubenswrapper[4747]: I1205 
22:27:10.976981 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z55lk"] Dec 05 22:27:11 crc kubenswrapper[4747]: I1205 22:27:11.235329 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z55lk" podUID="58beac16-db0e-48b0-bb65-35dcc76bdb56" containerName="registry-server" containerID="cri-o://6229116f664fd1157a0802becd91780903adf91cdc339ed46dfa1359a4b95774" gracePeriod=2 Dec 05 22:27:12 crc kubenswrapper[4747]: I1205 22:27:12.249795 4747 generic.go:334] "Generic (PLEG): container finished" podID="58beac16-db0e-48b0-bb65-35dcc76bdb56" containerID="6229116f664fd1157a0802becd91780903adf91cdc339ed46dfa1359a4b95774" exitCode=0 Dec 05 22:27:12 crc kubenswrapper[4747]: I1205 22:27:12.250108 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z55lk" event={"ID":"58beac16-db0e-48b0-bb65-35dcc76bdb56","Type":"ContainerDied","Data":"6229116f664fd1157a0802becd91780903adf91cdc339ed46dfa1359a4b95774"} Dec 05 22:27:13 crc kubenswrapper[4747]: I1205 22:27:13.855700 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:27:13 crc kubenswrapper[4747]: I1205 22:27:13.945803 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29pr8\" (UniqueName: \"kubernetes.io/projected/58beac16-db0e-48b0-bb65-35dcc76bdb56-kube-api-access-29pr8\") pod \"58beac16-db0e-48b0-bb65-35dcc76bdb56\" (UID: \"58beac16-db0e-48b0-bb65-35dcc76bdb56\") " Dec 05 22:27:13 crc kubenswrapper[4747]: I1205 22:27:13.945963 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58beac16-db0e-48b0-bb65-35dcc76bdb56-utilities\") pod \"58beac16-db0e-48b0-bb65-35dcc76bdb56\" (UID: \"58beac16-db0e-48b0-bb65-35dcc76bdb56\") " Dec 05 22:27:13 crc kubenswrapper[4747]: I1205 22:27:13.945994 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58beac16-db0e-48b0-bb65-35dcc76bdb56-catalog-content\") pod \"58beac16-db0e-48b0-bb65-35dcc76bdb56\" (UID: \"58beac16-db0e-48b0-bb65-35dcc76bdb56\") " Dec 05 22:27:13 crc kubenswrapper[4747]: I1205 22:27:13.948185 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58beac16-db0e-48b0-bb65-35dcc76bdb56-utilities" (OuterVolumeSpecName: "utilities") pod "58beac16-db0e-48b0-bb65-35dcc76bdb56" (UID: "58beac16-db0e-48b0-bb65-35dcc76bdb56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:27:13 crc kubenswrapper[4747]: I1205 22:27:13.973915 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58beac16-db0e-48b0-bb65-35dcc76bdb56-kube-api-access-29pr8" (OuterVolumeSpecName: "kube-api-access-29pr8") pod "58beac16-db0e-48b0-bb65-35dcc76bdb56" (UID: "58beac16-db0e-48b0-bb65-35dcc76bdb56"). InnerVolumeSpecName "kube-api-access-29pr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:27:13 crc kubenswrapper[4747]: I1205 22:27:13.978225 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58beac16-db0e-48b0-bb65-35dcc76bdb56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58beac16-db0e-48b0-bb65-35dcc76bdb56" (UID: "58beac16-db0e-48b0-bb65-35dcc76bdb56"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.048318 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58beac16-db0e-48b0-bb65-35dcc76bdb56-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.048347 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58beac16-db0e-48b0-bb65-35dcc76bdb56-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.048358 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29pr8\" (UniqueName: \"kubernetes.io/projected/58beac16-db0e-48b0-bb65-35dcc76bdb56-kube-api-access-29pr8\") on node \"crc\" DevicePath \"\"" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.272781 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-5vf5m" event={"ID":"67f34afc-ce90-42ca-8599-60e931bb3868","Type":"ContainerStarted","Data":"979a2464197aa2e2293cfb9c11f11e7b548a07b8e1ad62012eb10c85adbf4696"} Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.274302 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-5vf5m" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.275504 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww" event={"ID":"d44ad8b8-f155-4a3b-a7cf-27aa9951dfd3","Type":"ContainerStarted","Data":"be7d8d4874657dcad30fcfc1a97ac13871672366d93c66ff26897d04497b3778"} Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.276532 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-5vf5m" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.278486 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z55lk" event={"ID":"58beac16-db0e-48b0-bb65-35dcc76bdb56","Type":"ContainerDied","Data":"130b4664055d2ed04184a4d702d13e34b6bd6255a119ca48f103479300221b69"} Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.278526 4747 scope.go:117] "RemoveContainer" containerID="6229116f664fd1157a0802becd91780903adf91cdc339ed46dfa1359a4b95774" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.278664 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z55lk" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.286311 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8s8c6" event={"ID":"6a157993-4e7c-4c79-a126-ffd16308cb25","Type":"ContainerStarted","Data":"c8a5413d6e1e03ed3ebe9a9c49f3994591a9a85a7375c2d00ff8545c932138a5"} Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.289915 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-sp22q" event={"ID":"29cb108e-f1f3-4cbf-8a59-4b90ee5b1f01","Type":"ContainerStarted","Data":"2408671d5ca91e9e0fba904c0d7f1498f618c7217454e81e09879c9e4d6507ef"} Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.290156 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-sp22q" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.292207 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f" event={"ID":"15183649-4b35-40ef-a6b7-2b1c786246b9","Type":"ContainerStarted","Data":"566d7446f6d167c2aa7d1fc76fae169bdc1a5b148115d77d46d4ae23d060a9d7"} Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.306533 4747 scope.go:117] "RemoveContainer" containerID="d33b73ee79f42b350a1eb2c22c7bcc97280db04fb0eb29080cc5f269dce604be" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.306953 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-5vf5m" podStartSLOduration=2.170044902 podStartE2EDuration="10.306941778s" podCreationTimestamp="2025-12-05 22:27:04 +0000 UTC" firstStartedPulling="2025-12-05 22:27:05.454870157 +0000 UTC m=+6295.922177645" lastFinishedPulling="2025-12-05 22:27:13.591767033 +0000 UTC m=+6304.059074521" observedRunningTime="2025-12-05 22:27:14.295449093 +0000 UTC m=+6304.762756591" watchObservedRunningTime="2025-12-05 22:27:14.306941778 +0000 UTC m=+6304.774249266" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.367313 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww" podStartSLOduration=2.083476477 podStartE2EDuration="10.367295954s" podCreationTimestamp="2025-12-05 22:27:04 +0000 UTC" firstStartedPulling="2025-12-05 22:27:05.143844879 +0000 UTC m=+6295.611152367" lastFinishedPulling="2025-12-05 22:27:13.427664336 +0000 UTC m=+6303.894971844" observedRunningTime="2025-12-05 22:27:14.355310957 +0000 UTC m=+6304.822618455" watchObservedRunningTime="2025-12-05 22:27:14.367295954 +0000 UTC m=+6304.834603442" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.393979 4747 scope.go:117] "RemoveContainer" containerID="ac8ce187dcd86950b13155d29e9aa20480a16d2ff526f8e74c2ed3e7ff206e23" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.408321 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8s8c6" podStartSLOduration=2.075257314 podStartE2EDuration="10.40829424s" podCreationTimestamp="2025-12-05 22:27:04 +0000 UTC" firstStartedPulling="2025-12-05 22:27:05.162765368 +0000 UTC m=+6295.630072856" lastFinishedPulling="2025-12-05 22:27:13.495802294 +0000 UTC m=+6303.963109782" observedRunningTime="2025-12-05 22:27:14.394376395 +0000 UTC m=+6304.861683903" 
watchObservedRunningTime="2025-12-05 22:27:14.40829424 +0000 UTC m=+6304.875601728" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.440854 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-sp22q" podStartSLOduration=2.450982096 podStartE2EDuration="10.440833066s" podCreationTimestamp="2025-12-05 22:27:04 +0000 UTC" firstStartedPulling="2025-12-05 22:27:05.504019206 +0000 UTC m=+6295.971326694" lastFinishedPulling="2025-12-05 22:27:13.493870166 +0000 UTC m=+6303.961177664" observedRunningTime="2025-12-05 22:27:14.438894988 +0000 UTC m=+6304.906202476" watchObservedRunningTime="2025-12-05 22:27:14.440833066 +0000 UTC m=+6304.908140554" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.506223 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f" podStartSLOduration=2.279573776 podStartE2EDuration="10.506198536s" podCreationTimestamp="2025-12-05 22:27:04 +0000 UTC" firstStartedPulling="2025-12-05 22:27:05.210372648 +0000 UTC m=+6295.677680136" lastFinishedPulling="2025-12-05 22:27:13.436997408 +0000 UTC m=+6303.904304896" observedRunningTime="2025-12-05 22:27:14.485857512 +0000 UTC m=+6304.953165000" watchObservedRunningTime="2025-12-05 22:27:14.506198536 +0000 UTC m=+6304.973506024" Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.519418 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z55lk"] Dec 05 22:27:14 crc kubenswrapper[4747]: I1205 22:27:14.533363 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z55lk"] Dec 05 22:27:15 crc kubenswrapper[4747]: I1205 22:27:15.855568 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58beac16-db0e-48b0-bb65-35dcc76bdb56" path="/var/lib/kubelet/pods/58beac16-db0e-48b0-bb65-35dcc76bdb56/volumes" Dec 05 22:27:24 crc kubenswrapper[4747]: I1205 22:27:24.988837 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-sp22q" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.416934 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.417788 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="b0df9ca5-9db8-4c89-b160-154db180c7cc" containerName="openstackclient" containerID="cri-o://4b35cd217d29af685a661724277f692ab3128f1e51e80878f2b33fb91ac7b314" gracePeriod=2 Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.434962 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.461148 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 22:27:27 crc kubenswrapper[4747]: E1205 22:27:27.461858 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58beac16-db0e-48b0-bb65-35dcc76bdb56" containerName="registry-server" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.461950 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="58beac16-db0e-48b0-bb65-35dcc76bdb56" containerName="registry-server" Dec 05 22:27:27 crc kubenswrapper[4747]: E1205 22:27:27.462041 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58beac16-db0e-48b0-bb65-35dcc76bdb56" 
containerName="extract-content" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.462092 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="58beac16-db0e-48b0-bb65-35dcc76bdb56" containerName="extract-content" Dec 05 22:27:27 crc kubenswrapper[4747]: E1205 22:27:27.462147 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58beac16-db0e-48b0-bb65-35dcc76bdb56" containerName="extract-utilities" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.462194 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="58beac16-db0e-48b0-bb65-35dcc76bdb56" containerName="extract-utilities" Dec 05 22:27:27 crc kubenswrapper[4747]: E1205 22:27:27.462255 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0df9ca5-9db8-4c89-b160-154db180c7cc" containerName="openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.462302 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0df9ca5-9db8-4c89-b160-154db180c7cc" containerName="openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.462546 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0df9ca5-9db8-4c89-b160-154db180c7cc" containerName="openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.462637 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="58beac16-db0e-48b0-bb65-35dcc76bdb56" containerName="registry-server" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.463357 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.469253 4747 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b0df9ca5-9db8-4c89-b160-154db180c7cc" podUID="970e936d-dcfb-42c2-82fc-6a996d7f138d" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.541406 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.543979 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/970e936d-dcfb-42c2-82fc-6a996d7f138d-openstack-config-secret\") pod \"openstackclient\" (UID: \"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " pod="openstack/openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.544033 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/970e936d-dcfb-42c2-82fc-6a996d7f138d-openstack-config\") pod \"openstackclient\" (UID: \"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " pod="openstack/openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.544120 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrbtk\" (UniqueName: \"kubernetes.io/projected/970e936d-dcfb-42c2-82fc-6a996d7f138d-kube-api-access-nrbtk\") pod \"openstackclient\" (UID: \"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " pod="openstack/openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.544158 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970e936d-dcfb-42c2-82fc-6a996d7f138d-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " pod="openstack/openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.644534 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.646091 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrbtk\" (UniqueName: \"kubernetes.io/projected/970e936d-dcfb-42c2-82fc-6a996d7f138d-kube-api-access-nrbtk\") pod \"openstackclient\" (UID: \"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " pod="openstack/openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.646230 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970e936d-dcfb-42c2-82fc-6a996d7f138d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " pod="openstack/openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.646372 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/970e936d-dcfb-42c2-82fc-6a996d7f138d-openstack-config-secret\") pod \"openstackclient\" (UID: \"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " pod="openstack/openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.646533 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/970e936d-dcfb-42c2-82fc-6a996d7f138d-openstack-config\") pod \"openstackclient\" (UID: \"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " pod="openstack/openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.646674 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.647475 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/970e936d-dcfb-42c2-82fc-6a996d7f138d-openstack-config\") pod \"openstackclient\" (UID: \"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " pod="openstack/openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.651769 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-l7lt7" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.653470 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/970e936d-dcfb-42c2-82fc-6a996d7f138d-openstack-config-secret\") pod \"openstackclient\" (UID: \"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " pod="openstack/openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.656115 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970e936d-dcfb-42c2-82fc-6a996d7f138d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " pod="openstack/openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.665510 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.670376 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrbtk\" (UniqueName: \"kubernetes.io/projected/970e936d-dcfb-42c2-82fc-6a996d7f138d-kube-api-access-nrbtk\") pod \"openstackclient\" (UID: \"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " pod="openstack/openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.754084 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll2pp\" (UniqueName: \"kubernetes.io/projected/c2cc595b-3fc7-4d0b-9faa-399d45d73efc-kube-api-access-ll2pp\") pod \"kube-state-metrics-0\" (UID: \"c2cc595b-3fc7-4d0b-9faa-399d45d73efc\") " pod="openstack/kube-state-metrics-0" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.780314 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.859494 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll2pp\" (UniqueName: \"kubernetes.io/projected/c2cc595b-3fc7-4d0b-9faa-399d45d73efc-kube-api-access-ll2pp\") pod \"kube-state-metrics-0\" (UID: \"c2cc595b-3fc7-4d0b-9faa-399d45d73efc\") " pod="openstack/kube-state-metrics-0" Dec 05 22:27:27 crc kubenswrapper[4747]: I1205 22:27:27.911815 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll2pp\" (UniqueName: \"kubernetes.io/projected/c2cc595b-3fc7-4d0b-9faa-399d45d73efc-kube-api-access-ll2pp\") pod \"kube-state-metrics-0\" (UID: \"c2cc595b-3fc7-4d0b-9faa-399d45d73efc\") " pod="openstack/kube-state-metrics-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.094152 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.610993 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.613440 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.617399 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.617548 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.617672 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.617909 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.619562 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-2nrvg" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.627659 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 05 22:27:28 crc kubenswrapper[4747]: W1205 22:27:28.694408 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod970e936d_dcfb_42c2_82fc_6a996d7f138d.slice/crio-1bd597c51bd064d81ed97d68060796b0477d619c3f853366c5717a00ce0fffd7 WatchSource:0}: Error finding container 1bd597c51bd064d81ed97d68060796b0477d619c3f853366c5717a00ce0fffd7: Status 404 returned error can't find the container with id 1bd597c51bd064d81ed97d68060796b0477d619c3f853366c5717a00ce0fffd7 Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.694924 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/adbdbcfa-768c-4314-ac05-643065b2c85b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.694979 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/adbdbcfa-768c-4314-ac05-643065b2c85b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.695088 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/adbdbcfa-768c-4314-ac05-643065b2c85b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.695142 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/adbdbcfa-768c-4314-ac05-643065b2c85b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.695170 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmb5d\" (UniqueName: \"kubernetes.io/projected/adbdbcfa-768c-4314-ac05-643065b2c85b-kube-api-access-bmb5d\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.695260 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/adbdbcfa-768c-4314-ac05-643065b2c85b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.695307 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/adbdbcfa-768c-4314-ac05-643065b2c85b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.753990 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.797074 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/adbdbcfa-768c-4314-ac05-643065b2c85b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.797183 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/adbdbcfa-768c-4314-ac05-643065b2c85b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.797227 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmb5d\" (UniqueName: \"kubernetes.io/projected/adbdbcfa-768c-4314-ac05-643065b2c85b-kube-api-access-bmb5d\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.797271 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/adbdbcfa-768c-4314-ac05-643065b2c85b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.797295 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/adbdbcfa-768c-4314-ac05-643065b2c85b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " 
pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.797364 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/adbdbcfa-768c-4314-ac05-643065b2c85b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.797396 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/adbdbcfa-768c-4314-ac05-643065b2c85b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.804622 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/adbdbcfa-768c-4314-ac05-643065b2c85b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.805782 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/adbdbcfa-768c-4314-ac05-643065b2c85b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.811935 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/adbdbcfa-768c-4314-ac05-643065b2c85b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.813559 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/adbdbcfa-768c-4314-ac05-643065b2c85b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.813567 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/adbdbcfa-768c-4314-ac05-643065b2c85b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.813933 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/adbdbcfa-768c-4314-ac05-643065b2c85b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.904693 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmb5d\" (UniqueName: \"kubernetes.io/projected/adbdbcfa-768c-4314-ac05-643065b2c85b-kube-api-access-bmb5d\") pod \"alertmanager-metric-storage-0\" (UID: \"adbdbcfa-768c-4314-ac05-643065b2c85b\") " pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:28 crc kubenswrapper[4747]: 
I1205 22:27:28.920224 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 22:27:28 crc kubenswrapper[4747]: I1205 22:27:28.945734 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.171969 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.174515 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.180482 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.180779 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-55452" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.180785 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.180926 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.190352 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.194703 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.207183 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c32daf65-886d-49c4-a8ac-45e3cf6a8999-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.207226 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.207273 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.207310 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c32daf65-886d-49c4-a8ac-45e3cf6a8999-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.207368 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7b5f\" (UniqueName: \"kubernetes.io/projected/c32daf65-886d-49c4-a8ac-45e3cf6a8999-kube-api-access-p7b5f\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.207399 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.207437 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-config\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.207542 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c32daf65-886d-49c4-a8ac-45e3cf6a8999-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.224193 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.309810 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7b5f\" (UniqueName: \"kubernetes.io/projected/c32daf65-886d-49c4-a8ac-45e3cf6a8999-kube-api-access-p7b5f\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.309854 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.309890 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-config\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.309982 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c32daf65-886d-49c4-a8ac-45e3cf6a8999-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.310002 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c32daf65-886d-49c4-a8ac-45e3cf6a8999-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.310017 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.310049 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.310090 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c32daf65-886d-49c4-a8ac-45e3cf6a8999-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.310805 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c32daf65-886d-49c4-a8ac-45e3cf6a8999-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.321146 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c32daf65-886d-49c4-a8ac-45e3cf6a8999-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.321211 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.322310 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-config\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.325958 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c32daf65-886d-49c4-a8ac-45e3cf6a8999-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.334166 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-web-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.356414 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7b5f\" (UniqueName: \"kubernetes.io/projected/c32daf65-886d-49c4-a8ac-45e3cf6a8999-kube-api-access-p7b5f\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.470527 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.470604 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3fb55531289291714ede682c00a25c49f851403111a2750d6fdb7718b5d72857/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.516908 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c2cc595b-3fc7-4d0b-9faa-399d45d73efc","Type":"ContainerStarted","Data":"63d8875157247f63774b7863338001e6cf89025e02b948d94090f2b73499536b"} Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.529749 4747 generic.go:334] "Generic (PLEG): container finished" podID="b0df9ca5-9db8-4c89-b160-154db180c7cc" containerID="4b35cd217d29af685a661724277f692ab3128f1e51e80878f2b33fb91ac7b314" exitCode=137 Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.531202 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"970e936d-dcfb-42c2-82fc-6a996d7f138d","Type":"ContainerStarted","Data":"1729b92ef5dba0a9c8a034ee61a4d93ebb8af48284f00f989e96a11e1f689847"} Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.531232 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"970e936d-dcfb-42c2-82fc-6a996d7f138d","Type":"ContainerStarted","Data":"1bd597c51bd064d81ed97d68060796b0477d619c3f853366c5717a00ce0fffd7"} Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.563783 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.563765134 podStartE2EDuration="2.563765134s" podCreationTimestamp="2025-12-05 22:27:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:27:29.562122193 +0000 UTC m=+6320.029429681" watchObservedRunningTime="2025-12-05 22:27:29.563765134 +0000 UTC m=+6320.031072622" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.635039 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\") pod \"prometheus-metric-storage-0\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.793415 4747 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 22:27:29 crc kubenswrapper[4747]: W1205 22:27:29.889287 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadbdbcfa_768c_4314_ac05_643065b2c85b.slice/crio-4b8856e07e03e4d36898c2824fe1e6bb056f7d3276e58348ae6d951d123cf60d WatchSource:0}: Error finding container 4b8856e07e03e4d36898c2824fe1e6bb056f7d3276e58348ae6d951d123cf60d: Status 404 returned error can't find the container with id 4b8856e07e03e4d36898c2824fe1e6bb056f7d3276e58348ae6d951d123cf60d Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.892540 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.992602 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 22:27:29 crc kubenswrapper[4747]: I1205 22:27:29.995910 4747 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b0df9ca5-9db8-4c89-b160-154db180c7cc" podUID="970e936d-dcfb-42c2-82fc-6a996d7f138d" Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.045627 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0df9ca5-9db8-4c89-b160-154db180c7cc-combined-ca-bundle\") pod \"b0df9ca5-9db8-4c89-b160-154db180c7cc\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.045678 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qblg\" (UniqueName: \"kubernetes.io/projected/b0df9ca5-9db8-4c89-b160-154db180c7cc-kube-api-access-9qblg\") pod \"b0df9ca5-9db8-4c89-b160-154db180c7cc\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.045705 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b0df9ca5-9db8-4c89-b160-154db180c7cc-openstack-config-secret\") pod \"b0df9ca5-9db8-4c89-b160-154db180c7cc\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.045836 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b0df9ca5-9db8-4c89-b160-154db180c7cc-openstack-config\") pod \"b0df9ca5-9db8-4c89-b160-154db180c7cc\" (UID: \"b0df9ca5-9db8-4c89-b160-154db180c7cc\") " Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.072998 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0df9ca5-9db8-4c89-b160-154db180c7cc-kube-api-access-9qblg" (OuterVolumeSpecName: "kube-api-access-9qblg") pod "b0df9ca5-9db8-4c89-b160-154db180c7cc" (UID: "b0df9ca5-9db8-4c89-b160-154db180c7cc"). InnerVolumeSpecName "kube-api-access-9qblg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.079147 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0df9ca5-9db8-4c89-b160-154db180c7cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0df9ca5-9db8-4c89-b160-154db180c7cc" (UID: "b0df9ca5-9db8-4c89-b160-154db180c7cc"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.105921 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0df9ca5-9db8-4c89-b160-154db180c7cc-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b0df9ca5-9db8-4c89-b160-154db180c7cc" (UID: "b0df9ca5-9db8-4c89-b160-154db180c7cc"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.132346 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0df9ca5-9db8-4c89-b160-154db180c7cc-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b0df9ca5-9db8-4c89-b160-154db180c7cc" (UID: "b0df9ca5-9db8-4c89-b160-154db180c7cc"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.147942 4747 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b0df9ca5-9db8-4c89-b160-154db180c7cc-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.147983 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0df9ca5-9db8-4c89-b160-154db180c7cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.147992 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qblg\" (UniqueName: \"kubernetes.io/projected/b0df9ca5-9db8-4c89-b160-154db180c7cc-kube-api-access-9qblg\") on node \"crc\" DevicePath \"\"" Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.148001 4747 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b0df9ca5-9db8-4c89-b160-154db180c7cc-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 05 22:27:30 crc kubenswrapper[4747]: W1205 22:27:30.406491 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc32daf65_886d_49c4_a8ac_45e3cf6a8999.slice/crio-6e17f985fff7317a30c523ac984493479a7ad7b18d837bf2ceb27cc272564735 WatchSource:0}: Error finding container 6e17f985fff7317a30c523ac984493479a7ad7b18d837bf2ceb27cc272564735: Status 404 returned error can't find the container with id 6e17f985fff7317a30c523ac984493479a7ad7b18d837bf2ceb27cc272564735 Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.410181 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.541389 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c32daf65-886d-49c4-a8ac-45e3cf6a8999","Type":"ContainerStarted","Data":"6e17f985fff7317a30c523ac984493479a7ad7b18d837bf2ceb27cc272564735"} Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.553982 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c2cc595b-3fc7-4d0b-9faa-399d45d73efc","Type":"ContainerStarted","Data":"2bf7fc491e68005617f8aa12f650d4a9e207d0e2e187f8682adf44a3e729a61a"} Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.554264 4747 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.558937 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"adbdbcfa-768c-4314-ac05-643065b2c85b","Type":"ContainerStarted","Data":"4b8856e07e03e4d36898c2824fe1e6bb056f7d3276e58348ae6d951d123cf60d"} Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.561520 4747 scope.go:117] "RemoveContainer" containerID="4b35cd217d29af685a661724277f692ab3128f1e51e80878f2b33fb91ac7b314" Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.561527 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.575320 4747 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b0df9ca5-9db8-4c89-b160-154db180c7cc" podUID="970e936d-dcfb-42c2-82fc-6a996d7f138d" Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.585592 4747 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b0df9ca5-9db8-4c89-b160-154db180c7cc" podUID="970e936d-dcfb-42c2-82fc-6a996d7f138d" Dec 05 22:27:30 crc kubenswrapper[4747]: I1205 22:27:30.589473 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.087649119 podStartE2EDuration="3.589441975s" podCreationTimestamp="2025-12-05 22:27:27 +0000 UTC" firstStartedPulling="2025-12-05 22:27:28.956262718 +0000 UTC m=+6319.423570206" lastFinishedPulling="2025-12-05 22:27:29.458055574 +0000 UTC m=+6319.925363062" observedRunningTime="2025-12-05 22:27:30.571433819 +0000 UTC m=+6321.038741307" watchObservedRunningTime="2025-12-05 22:27:30.589441975 +0000 UTC m=+6321.056749463" Dec 05 22:27:31 crc kubenswrapper[4747]: I1205 22:27:31.854467 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0df9ca5-9db8-4c89-b160-154db180c7cc" path="/var/lib/kubelet/pods/b0df9ca5-9db8-4c89-b160-154db180c7cc/volumes" Dec 05 22:27:32 crc kubenswrapper[4747]: I1205 22:27:32.030148 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ndzms"] Dec 05 22:27:32 crc kubenswrapper[4747]: I1205 22:27:32.045988 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b40a-account-create-update-qvwpj"] Dec 05 22:27:32 crc kubenswrapper[4747]: I1205 22:27:32.054810 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ndzms"] Dec 05 22:27:32 crc kubenswrapper[4747]: I1205 22:27:32.063279 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b40a-account-create-update-qvwpj"] Dec 05 22:27:33 crc kubenswrapper[4747]: I1205 22:27:33.851525 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e" path="/var/lib/kubelet/pods/403e6b20-c7e9-47f0-8d6a-e1b9a5a6e57e/volumes" Dec 05 22:27:33 crc kubenswrapper[4747]: I1205 22:27:33.963466 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6483ce8-8e70-4b43-985e-3b631c83588f" path="/var/lib/kubelet/pods/f6483ce8-8e70-4b43-985e-3b631c83588f/volumes" Dec 05 22:27:36 crc kubenswrapper[4747]: I1205 22:27:36.221819 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:27:36 crc kubenswrapper[4747]: I1205 22:27:36.222404 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:27:38 crc kubenswrapper[4747]: I1205 22:27:38.105312 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 22:27:39 crc kubenswrapper[4747]: I1205 22:27:39.717808 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"adbdbcfa-768c-4314-ac05-643065b2c85b","Type":"ContainerStarted","Data":"4e6e0e2cea8cb03340ba8842548d38eb1e1b4216279d8853ef77f80d7bc13828"} Dec 05 22:27:39 crc kubenswrapper[4747]: I1205 22:27:39.720283 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c32daf65-886d-49c4-a8ac-45e3cf6a8999","Type":"ContainerStarted","Data":"8d4ea786111b51ab78e9d187490f9926b710a0310e23c562f8413a4d5d76fed7"} Dec 05 22:27:41 crc kubenswrapper[4747]: I1205 22:27:41.032696 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-lh6ts"] Dec 05 22:27:41 crc kubenswrapper[4747]: I1205 22:27:41.041844 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-lh6ts"] Dec 05 22:27:41 crc kubenswrapper[4747]: I1205 22:27:41.852098 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87ba03f2-64bd-4d96-88de-1de3b7bdaa07" path="/var/lib/kubelet/pods/87ba03f2-64bd-4d96-88de-1de3b7bdaa07/volumes" Dec 05 22:27:45 crc kubenswrapper[4747]: I1205 22:27:45.830604 4747 generic.go:334] "Generic (PLEG): container finished" podID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerID="8d4ea786111b51ab78e9d187490f9926b710a0310e23c562f8413a4d5d76fed7" exitCode=0 Dec 05 22:27:45 crc kubenswrapper[4747]: I1205 22:27:45.831250 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c32daf65-886d-49c4-a8ac-45e3cf6a8999","Type":"ContainerDied","Data":"8d4ea786111b51ab78e9d187490f9926b710a0310e23c562f8413a4d5d76fed7"} Dec 05 22:27:45 crc kubenswrapper[4747]: I1205 22:27:45.834856 4747 generic.go:334] "Generic (PLEG): container finished" podID="adbdbcfa-768c-4314-ac05-643065b2c85b" containerID="4e6e0e2cea8cb03340ba8842548d38eb1e1b4216279d8853ef77f80d7bc13828" exitCode=0 Dec 05 22:27:45 crc kubenswrapper[4747]: I1205 22:27:45.834907 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"adbdbcfa-768c-4314-ac05-643065b2c85b","Type":"ContainerDied","Data":"4e6e0e2cea8cb03340ba8842548d38eb1e1b4216279d8853ef77f80d7bc13828"} Dec 05 22:27:48 crc kubenswrapper[4747]: I1205 22:27:48.882326 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"adbdbcfa-768c-4314-ac05-643065b2c85b","Type":"ContainerStarted","Data":"7bccf0e3627a5a271fdb3a227b60a25240e4ffaebe89f2563cd0f12357e0b90f"} Dec 05 22:27:51 crc kubenswrapper[4747]: I1205 22:27:51.916787 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"adbdbcfa-768c-4314-ac05-643065b2c85b","Type":"ContainerStarted","Data":"5a5c1d150962d0fc5a399f3fbb2405ecf4a8f984c6f678c5c0a391a4b1dfbfa0"} Dec 05 22:27:51 crc kubenswrapper[4747]: I1205 22:27:51.918885 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:51 crc kubenswrapper[4747]: I1205 22:27:51.922891 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Dec 05 22:27:51 crc kubenswrapper[4747]: I1205 22:27:51.977748 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=5.963897878 podStartE2EDuration="23.977717815s" podCreationTimestamp="2025-12-05 22:27:28 +0000 UTC" firstStartedPulling="2025-12-05 22:27:29.894154423 +0000 UTC m=+6320.361461911" lastFinishedPulling="2025-12-05 22:27:47.90797436 +0000 UTC m=+6338.375281848" observedRunningTime="2025-12-05 22:27:51.937811876 +0000 UTC m=+6342.405119364" watchObservedRunningTime="2025-12-05 22:27:51.977717815 +0000 UTC m=+6342.445025313" Dec 05 22:27:55 crc kubenswrapper[4747]: I1205 22:27:55.969406 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c32daf65-886d-49c4-a8ac-45e3cf6a8999","Type":"ContainerStarted","Data":"887438b36e34a593cdb6e8a6d7d775287d7c63c332baf23e80ddfaa9941bc2aa"} Dec 05 22:28:02 crc kubenswrapper[4747]: I1205 22:28:02.053622 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c32daf65-886d-49c4-a8ac-45e3cf6a8999","Type":"ContainerStarted","Data":"e0145b5de16f0399ee495b2b2286f5560975aead1d74dce5b5c983af32ad908b"} Dec 05 22:28:05 crc kubenswrapper[4747]: I1205 22:28:05.093226 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c32daf65-886d-49c4-a8ac-45e3cf6a8999","Type":"ContainerStarted","Data":"e2acb4503fd4467c438755ae2b61c3a5c6dcba77aa259a30cd2bb5fe80490688"} Dec 05 22:28:05 crc kubenswrapper[4747]: I1205 22:28:05.120594 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.483789881 podStartE2EDuration="37.120555598s" podCreationTimestamp="2025-12-05 22:27:28 +0000 UTC" firstStartedPulling="2025-12-05 22:27:30.409038874 +0000 UTC m=+6320.876346362" lastFinishedPulling="2025-12-05 22:28:04.045804591 +0000 UTC m=+6354.513112079" observedRunningTime="2025-12-05 22:28:05.11535911 +0000 UTC m=+6355.582666628" watchObservedRunningTime="2025-12-05 22:28:05.120555598 +0000 UTC m=+6355.587863096" Dec 05 22:28:06 crc kubenswrapper[4747]: I1205 22:28:06.221890 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:28:06 crc kubenswrapper[4747]: I1205 22:28:06.222316 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:28:09 crc kubenswrapper[4747]: I1205 22:28:09.794405 4747 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:10 crc kubenswrapper[4747]: I1205 22:28:10.052923 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-874a-account-create-update-2pb5w"] Dec 05 22:28:10 crc kubenswrapper[4747]: I1205 22:28:10.067677 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rfdwp"] Dec 05 22:28:10 crc kubenswrapper[4747]: I1205 22:28:10.096182 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-874a-account-create-update-2pb5w"] Dec 05 22:28:10 crc kubenswrapper[4747]: I1205 22:28:10.112164 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rfdwp"] Dec 05 22:28:11 crc kubenswrapper[4747]: I1205 22:28:11.863122 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b08df84-d2c1-4977-af04-d21e169dfefc" path="/var/lib/kubelet/pods/0b08df84-d2c1-4977-af04-d21e169dfefc/volumes" Dec 05 22:28:11 crc kubenswrapper[4747]: I1205 22:28:11.867429 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd5a427-7846-4f98-bdef-c3b116a97cbb" path="/var/lib/kubelet/pods/dbd5a427-7846-4f98-bdef-c3b116a97cbb/volumes" Dec 05 22:28:14 crc kubenswrapper[4747]: I1205 22:28:14.794329 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:14 crc kubenswrapper[4747]: I1205 22:28:14.799082 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:14 crc kubenswrapper[4747]: I1205 22:28:14.816366 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 22:28:14 crc kubenswrapper[4747]: I1205 22:28:14.820641 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 22:28:14 crc kubenswrapper[4747]: I1205 22:28:14.823340 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 22:28:14 crc kubenswrapper[4747]: I1205 22:28:14.823341 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 22:28:14 crc kubenswrapper[4747]: I1205 22:28:14.848213 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 22:28:14 crc kubenswrapper[4747]: I1205 22:28:14.970828 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-config-data\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:14 crc kubenswrapper[4747]: I1205 22:28:14.970911 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-run-httpd\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:14 crc kubenswrapper[4747]: I1205 22:28:14.970961 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:14 crc kubenswrapper[4747]: I1205 22:28:14.970987 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8qbc\" (UniqueName: \"kubernetes.io/projected/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-kube-api-access-v8qbc\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:14 crc kubenswrapper[4747]: I1205 22:28:14.971360 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-log-httpd\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:14 crc kubenswrapper[4747]: I1205 22:28:14.971477 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-scripts\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:14 crc kubenswrapper[4747]: I1205 22:28:14.971650 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.073798 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 
22:28:15.073902 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-config-data\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.073941 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-run-httpd\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.074113 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.074163 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8qbc\" (UniqueName: \"kubernetes.io/projected/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-kube-api-access-v8qbc\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.074285 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-log-httpd\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.074334 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-scripts\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.074817 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-run-httpd\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.074888 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-log-httpd\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.080835 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.081683 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-scripts\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.091308 4747 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.091572 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-config-data\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.092477 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8qbc\" (UniqueName: \"kubernetes.io/projected/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-kube-api-access-v8qbc\") pod \"ceilometer-0\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.152322 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.235036 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:15 crc kubenswrapper[4747]: I1205 22:28:15.670378 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 22:28:16 crc kubenswrapper[4747]: I1205 22:28:16.032502 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-srhb7"] Dec 05 22:28:16 crc kubenswrapper[4747]: I1205 22:28:16.043699 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-srhb7"] Dec 05 22:28:16 crc kubenswrapper[4747]: I1205 22:28:16.243498 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1","Type":"ContainerStarted","Data":"f4b949ca6bbff78f3cf4a22d29a6f78384682a73944ae4984ea8ca073ece53cd"} Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.255386 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1","Type":"ContainerStarted","Data":"31ca6bf0c671a3984d051d0102a42ebe11caf70a75588341bce22096847d45cc"} Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.337735 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.338001 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="970e936d-dcfb-42c2-82fc-6a996d7f138d" containerName="openstackclient" containerID="cri-o://1729b92ef5dba0a9c8a034ee61a4d93ebb8af48284f00f989e96a11e1f689847" gracePeriod=2 Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.351248 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.409469 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 22:28:17 crc kubenswrapper[4747]: E1205 22:28:17.410485 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="970e936d-dcfb-42c2-82fc-6a996d7f138d" containerName="openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.410519 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="970e936d-dcfb-42c2-82fc-6a996d7f138d" containerName="openstackclient" Dec 05 
22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.411098 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="970e936d-dcfb-42c2-82fc-6a996d7f138d" containerName="openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.412703 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.426660 4747 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="970e936d-dcfb-42c2-82fc-6a996d7f138d" podUID="fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.433658 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.528200 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5\") " pod="openstack/openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.528316 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5-openstack-config-secret\") pod \"openstackclient\" (UID: \"fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5\") " pod="openstack/openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.528392 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5-openstack-config\") pod \"openstackclient\" (UID: \"fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5\") " pod="openstack/openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.528540 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfb4d\" (UniqueName: \"kubernetes.io/projected/fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5-kube-api-access-bfb4d\") pod \"openstackclient\" (UID: \"fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5\") " pod="openstack/openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.630905 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5\") " pod="openstack/openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.631042 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5-openstack-config-secret\") pod \"openstackclient\" (UID: \"fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5\") " pod="openstack/openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.631105 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5-openstack-config\") pod \"openstackclient\" (UID: \"fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5\") " pod="openstack/openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.631208 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfb4d\" (UniqueName: \"kubernetes.io/projected/fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5-kube-api-access-bfb4d\") pod \"openstackclient\" (UID: \"fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5\") " pod="openstack/openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.632608 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5-openstack-config\") pod \"openstackclient\" (UID: \"fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5\") " pod="openstack/openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.635364 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5-openstack-config-secret\") pod \"openstackclient\" (UID: \"fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5\") " pod="openstack/openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.635999 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5\") " pod="openstack/openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.646362 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfb4d\" (UniqueName: \"kubernetes.io/projected/fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5-kube-api-access-bfb4d\") pod \"openstackclient\" (UID: \"fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5\") " pod="openstack/openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.754049 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 22:28:17 crc kubenswrapper[4747]: I1205 22:28:17.860917 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3b81b11-0fb6-48aa-ad62-8129aeedbd7d" path="/var/lib/kubelet/pods/c3b81b11-0fb6-48aa-ad62-8129aeedbd7d/volumes" Dec 05 22:28:18 crc kubenswrapper[4747]: I1205 22:28:18.265014 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1","Type":"ContainerStarted","Data":"db32fe34165a8705333a1b5f579cd2f05d537b6b6533e095ec31ed6bda61093c"} Dec 05 22:28:18 crc kubenswrapper[4747]: I1205 22:28:18.265334 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1","Type":"ContainerStarted","Data":"413b050dfc7e88e02ef9e5ef5a72460cf029183266197b64d42ff005a2497046"} Dec 05 22:28:18 crc kubenswrapper[4747]: I1205 22:28:18.373250 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 22:28:18 crc kubenswrapper[4747]: I1205 22:28:18.418975 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 22:28:18 crc kubenswrapper[4747]: I1205 22:28:18.419476 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerName="prometheus" containerID="cri-o://887438b36e34a593cdb6e8a6d7d775287d7c63c332baf23e80ddfaa9941bc2aa" gracePeriod=600 Dec 05 22:28:18 crc kubenswrapper[4747]: I1205 22:28:18.419678 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerName="thanos-sidecar" containerID="cri-o://e2acb4503fd4467c438755ae2b61c3a5c6dcba77aa259a30cd2bb5fe80490688" gracePeriod=600 Dec 05 22:28:18 crc kubenswrapper[4747]: I1205 22:28:18.419834 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerName="config-reloader" containerID="cri-o://e0145b5de16f0399ee495b2b2286f5560975aead1d74dce5b5c983af32ad908b" gracePeriod=600 Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.280882 4747 generic.go:334] "Generic (PLEG): container finished" podID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerID="e2acb4503fd4467c438755ae2b61c3a5c6dcba77aa259a30cd2bb5fe80490688" exitCode=0 Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.282395 4747 generic.go:334] "Generic (PLEG): container finished" podID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerID="e0145b5de16f0399ee495b2b2286f5560975aead1d74dce5b5c983af32ad908b" exitCode=0 Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.282411 4747 generic.go:334] "Generic (PLEG): container finished" podID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerID="887438b36e34a593cdb6e8a6d7d775287d7c63c332baf23e80ddfaa9941bc2aa" exitCode=0 Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.281415 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c32daf65-886d-49c4-a8ac-45e3cf6a8999","Type":"ContainerDied","Data":"e2acb4503fd4467c438755ae2b61c3a5c6dcba77aa259a30cd2bb5fe80490688"} Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.282514 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"c32daf65-886d-49c4-a8ac-45e3cf6a8999","Type":"ContainerDied","Data":"e0145b5de16f0399ee495b2b2286f5560975aead1d74dce5b5c983af32ad908b"} Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.282527 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c32daf65-886d-49c4-a8ac-45e3cf6a8999","Type":"ContainerDied","Data":"887438b36e34a593cdb6e8a6d7d775287d7c63c332baf23e80ddfaa9941bc2aa"} Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.289816 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5","Type":"ContainerStarted","Data":"e500d681f413a1e6de210ef5011feef7747737e41862dff976afba92820e8d9d"} Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.289847 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5","Type":"ContainerStarted","Data":"d7876cbf409e14d70f957c6c7f9cc979354ab39c7330759a0f6bfad31ceb2557"} Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.325966 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.325946486 podStartE2EDuration="2.325946486s" podCreationTimestamp="2025-12-05 22:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:28:19.304501095 +0000 UTC m=+6369.771808583" watchObservedRunningTime="2025-12-05 22:28:19.325946486 +0000 UTC m=+6369.793253974" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.458853 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.466470 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-thanos-prometheus-http-client-file\") pod \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.466544 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-web-config\") pod \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.466709 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\") pod \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.466809 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7b5f\" (UniqueName: \"kubernetes.io/projected/c32daf65-886d-49c4-a8ac-45e3cf6a8999-kube-api-access-p7b5f\") pod \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.466836 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c32daf65-886d-49c4-a8ac-45e3cf6a8999-tls-assets\") pod 
\"c32daf65-886d-49c4-a8ac-45e3cf6a8999\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.466916 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-config\") pod \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.469134 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c32daf65-886d-49c4-a8ac-45e3cf6a8999-prometheus-metric-storage-rulefiles-0\") pod \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.469296 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c32daf65-886d-49c4-a8ac-45e3cf6a8999-config-out\") pod \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\" (UID: \"c32daf65-886d-49c4-a8ac-45e3cf6a8999\") " Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.471023 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c32daf65-886d-49c4-a8ac-45e3cf6a8999-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "c32daf65-886d-49c4-a8ac-45e3cf6a8999" (UID: "c32daf65-886d-49c4-a8ac-45e3cf6a8999"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.474549 4747 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c32daf65-886d-49c4-a8ac-45e3cf6a8999-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.520848 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c32daf65-886d-49c4-a8ac-45e3cf6a8999-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c32daf65-886d-49c4-a8ac-45e3cf6a8999" (UID: "c32daf65-886d-49c4-a8ac-45e3cf6a8999"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.533905 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c32daf65-886d-49c4-a8ac-45e3cf6a8999-kube-api-access-p7b5f" (OuterVolumeSpecName: "kube-api-access-p7b5f") pod "c32daf65-886d-49c4-a8ac-45e3cf6a8999" (UID: "c32daf65-886d-49c4-a8ac-45e3cf6a8999"). InnerVolumeSpecName "kube-api-access-p7b5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.535532 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c32daf65-886d-49c4-a8ac-45e3cf6a8999-config-out" (OuterVolumeSpecName: "config-out") pod "c32daf65-886d-49c4-a8ac-45e3cf6a8999" (UID: "c32daf65-886d-49c4-a8ac-45e3cf6a8999"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.546105 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-config" (OuterVolumeSpecName: "config") pod "c32daf65-886d-49c4-a8ac-45e3cf6a8999" (UID: "c32daf65-886d-49c4-a8ac-45e3cf6a8999"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.546422 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-web-config" (OuterVolumeSpecName: "web-config") pod "c32daf65-886d-49c4-a8ac-45e3cf6a8999" (UID: "c32daf65-886d-49c4-a8ac-45e3cf6a8999"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.553305 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c32daf65-886d-49c4-a8ac-45e3cf6a8999" (UID: "c32daf65-886d-49c4-a8ac-45e3cf6a8999"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.576702 4747 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-web-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.576741 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7b5f\" (UniqueName: \"kubernetes.io/projected/c32daf65-886d-49c4-a8ac-45e3cf6a8999-kube-api-access-p7b5f\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.576755 4747 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c32daf65-886d-49c4-a8ac-45e3cf6a8999-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.576767 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.576777 4747 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c32daf65-886d-49c4-a8ac-45e3cf6a8999-config-out\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.576790 4747 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c32daf65-886d-49c4-a8ac-45e3cf6a8999-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.585634 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "c32daf65-886d-49c4-a8ac-45e3cf6a8999" (UID: "c32daf65-886d-49c4-a8ac-45e3cf6a8999"). InnerVolumeSpecName "pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.678495 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\") on node \"crc\" " Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.757691 4747 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.758163 4747 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d") on node "crc" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.780348 4747 reconciler_common.go:293] "Volume detached for volume \"pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.875555 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.996462 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970e936d-dcfb-42c2-82fc-6a996d7f138d-combined-ca-bundle\") pod \"970e936d-dcfb-42c2-82fc-6a996d7f138d\" (UID: \"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.996592 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrbtk\" (UniqueName: \"kubernetes.io/projected/970e936d-dcfb-42c2-82fc-6a996d7f138d-kube-api-access-nrbtk\") pod \"970e936d-dcfb-42c2-82fc-6a996d7f138d\" (UID: \"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.996613 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/970e936d-dcfb-42c2-82fc-6a996d7f138d-openstack-config-secret\") pod \"970e936d-dcfb-42c2-82fc-6a996d7f138d\" (UID: \"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " Dec 05 22:28:19 crc kubenswrapper[4747]: I1205 22:28:19.996649 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/970e936d-dcfb-42c2-82fc-6a996d7f138d-openstack-config\") pod \"970e936d-dcfb-42c2-82fc-6a996d7f138d\" (UID: \"970e936d-dcfb-42c2-82fc-6a996d7f138d\") " Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.035806 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/970e936d-dcfb-42c2-82fc-6a996d7f138d-kube-api-access-nrbtk" (OuterVolumeSpecName: "kube-api-access-nrbtk") pod "970e936d-dcfb-42c2-82fc-6a996d7f138d" (UID: "970e936d-dcfb-42c2-82fc-6a996d7f138d"). InnerVolumeSpecName "kube-api-access-nrbtk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.077785 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/970e936d-dcfb-42c2-82fc-6a996d7f138d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "970e936d-dcfb-42c2-82fc-6a996d7f138d" (UID: "970e936d-dcfb-42c2-82fc-6a996d7f138d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.082159 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/970e936d-dcfb-42c2-82fc-6a996d7f138d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "970e936d-dcfb-42c2-82fc-6a996d7f138d" (UID: "970e936d-dcfb-42c2-82fc-6a996d7f138d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.104049 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrbtk\" (UniqueName: \"kubernetes.io/projected/970e936d-dcfb-42c2-82fc-6a996d7f138d-kube-api-access-nrbtk\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.104084 4747 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/970e936d-dcfb-42c2-82fc-6a996d7f138d-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.104097 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970e936d-dcfb-42c2-82fc-6a996d7f138d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.125725 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/970e936d-dcfb-42c2-82fc-6a996d7f138d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "970e936d-dcfb-42c2-82fc-6a996d7f138d" (UID: "970e936d-dcfb-42c2-82fc-6a996d7f138d"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.205965 4747 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/970e936d-dcfb-42c2-82fc-6a996d7f138d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.317943 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1","Type":"ContainerStarted","Data":"bd6c97deb317300a0e181250eae03e0a92ff995c66ae54c219057de20a9f005e"} Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.318328 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.331772 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c32daf65-886d-49c4-a8ac-45e3cf6a8999","Type":"ContainerDied","Data":"6e17f985fff7317a30c523ac984493479a7ad7b18d837bf2ceb27cc272564735"} Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.331821 4747 scope.go:117] "RemoveContainer" containerID="e2acb4503fd4467c438755ae2b61c3a5c6dcba77aa259a30cd2bb5fe80490688" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.331984 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.347508 4747 generic.go:334] "Generic (PLEG): container finished" podID="970e936d-dcfb-42c2-82fc-6a996d7f138d" containerID="1729b92ef5dba0a9c8a034ee61a4d93ebb8af48284f00f989e96a11e1f689847" exitCode=137 Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.347900 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.348876 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.9685387 podStartE2EDuration="6.348864149s" podCreationTimestamp="2025-12-05 22:28:14 +0000 UTC" firstStartedPulling="2025-12-05 22:28:15.665178887 +0000 UTC m=+6366.132486375" lastFinishedPulling="2025-12-05 22:28:19.045504336 +0000 UTC m=+6369.512811824" observedRunningTime="2025-12-05 22:28:20.34448246 +0000 UTC m=+6370.811789968" watchObservedRunningTime="2025-12-05 22:28:20.348864149 +0000 UTC m=+6370.816171647" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.369281 4747 scope.go:117] "RemoveContainer" containerID="e0145b5de16f0399ee495b2b2286f5560975aead1d74dce5b5c983af32ad908b" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.371370 4747 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="970e936d-dcfb-42c2-82fc-6a996d7f138d" podUID="fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.375896 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.397228 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.402356 4747 scope.go:117] "RemoveContainer" containerID="887438b36e34a593cdb6e8a6d7d775287d7c63c332baf23e80ddfaa9941bc2aa" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.409347 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 22:28:20 crc kubenswrapper[4747]: E1205 22:28:20.409772 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerName="config-reloader" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.409789 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerName="config-reloader" Dec 05 22:28:20 crc kubenswrapper[4747]: E1205 22:28:20.409805 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerName="thanos-sidecar" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.409813 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerName="thanos-sidecar" Dec 05 22:28:20 crc kubenswrapper[4747]: E1205 22:28:20.409827 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerName="prometheus" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.409832 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerName="prometheus" Dec 05 22:28:20 crc kubenswrapper[4747]: E1205 22:28:20.409848 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerName="init-config-reloader" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.409854 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerName="init-config-reloader" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.410037 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" 
containerName="config-reloader" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.410066 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerName="thanos-sidecar" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.410080 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" containerName="prometheus" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.421538 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.425364 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.427534 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.431351 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-55452" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.432556 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.433850 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.435296 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.441810 4747 scope.go:117] "RemoveContainer" containerID="8d4ea786111b51ab78e9d187490f9926b710a0310e23c562f8413a4d5d76fed7" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.468342 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.498307 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.510755 4747 scope.go:117] "RemoveContainer" containerID="1729b92ef5dba0a9c8a034ee61a4d93ebb8af48284f00f989e96a11e1f689847" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.519903 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.519958 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.520005 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwb4q\" (UniqueName: 
\"kubernetes.io/projected/47e6cc39-27de-45da-92cf-2a13effd0974-kube-api-access-fwb4q\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.520038 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.520116 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-config\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.520162 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/47e6cc39-27de-45da-92cf-2a13effd0974-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.520219 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/47e6cc39-27de-45da-92cf-2a13effd0974-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.520257 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.520287 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.520358 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.520430 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/47e6cc39-27de-45da-92cf-2a13effd0974-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.541757 4747 scope.go:117] "RemoveContainer" containerID="1729b92ef5dba0a9c8a034ee61a4d93ebb8af48284f00f989e96a11e1f689847" Dec 05 22:28:20 crc kubenswrapper[4747]: E1205 22:28:20.544820 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1729b92ef5dba0a9c8a034ee61a4d93ebb8af48284f00f989e96a11e1f689847\": container with ID starting with 1729b92ef5dba0a9c8a034ee61a4d93ebb8af48284f00f989e96a11e1f689847 not found: ID does not exist" containerID="1729b92ef5dba0a9c8a034ee61a4d93ebb8af48284f00f989e96a11e1f689847" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.544859 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1729b92ef5dba0a9c8a034ee61a4d93ebb8af48284f00f989e96a11e1f689847"} err="failed to get container status \"1729b92ef5dba0a9c8a034ee61a4d93ebb8af48284f00f989e96a11e1f689847\": rpc error: code = NotFound desc = could not find container \"1729b92ef5dba0a9c8a034ee61a4d93ebb8af48284f00f989e96a11e1f689847\": container with ID starting with 1729b92ef5dba0a9c8a034ee61a4d93ebb8af48284f00f989e96a11e1f689847 not found: ID does not exist" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.623301 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.623354 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.623402 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwb4q\" (UniqueName: \"kubernetes.io/projected/47e6cc39-27de-45da-92cf-2a13effd0974-kube-api-access-fwb4q\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.623447 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.623509 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-config\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.623557 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/47e6cc39-27de-45da-92cf-2a13effd0974-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.623605 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/47e6cc39-27de-45da-92cf-2a13effd0974-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.623644 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.623678 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.623743 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.623815 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/47e6cc39-27de-45da-92cf-2a13effd0974-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.624998 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/47e6cc39-27de-45da-92cf-2a13effd0974-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.628485 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.630717 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.630849 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.630883 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3fb55531289291714ede682c00a25c49f851403111a2750d6fdb7718b5d72857/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.630997 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-config\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.631499 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.633494 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/47e6cc39-27de-45da-92cf-2a13effd0974-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.634491 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.639478 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47e6cc39-27de-45da-92cf-2a13effd0974-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.639510 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/47e6cc39-27de-45da-92cf-2a13effd0974-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.643950 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fwb4q\" (UniqueName: \"kubernetes.io/projected/47e6cc39-27de-45da-92cf-2a13effd0974-kube-api-access-fwb4q\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.693305 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb17cfa8-8863-4bd5-9003-fca9a831c52d\") pod \"prometheus-metric-storage-0\" (UID: \"47e6cc39-27de-45da-92cf-2a13effd0974\") " pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:20 crc kubenswrapper[4747]: I1205 22:28:20.782682 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:21 crc kubenswrapper[4747]: I1205 22:28:21.300991 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 22:28:21 crc kubenswrapper[4747]: I1205 22:28:21.359213 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"47e6cc39-27de-45da-92cf-2a13effd0974","Type":"ContainerStarted","Data":"3a9a23a482699649d62794447c34cb0818369cf18180a635bd7658069515ca9b"} Dec 05 22:28:21 crc kubenswrapper[4747]: I1205 22:28:21.852204 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="970e936d-dcfb-42c2-82fc-6a996d7f138d" path="/var/lib/kubelet/pods/970e936d-dcfb-42c2-82fc-6a996d7f138d/volumes" Dec 05 22:28:21 crc kubenswrapper[4747]: I1205 22:28:21.853860 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c32daf65-886d-49c4-a8ac-45e3cf6a8999" path="/var/lib/kubelet/pods/c32daf65-886d-49c4-a8ac-45e3cf6a8999/volumes" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.288013 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-p2l24"] Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.289994 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-p2l24" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.300007 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-p2l24"] Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.386932 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-d7a4-account-create-update-khnhr"] Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.388557 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-d7a4-account-create-update-khnhr" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.390900 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.404846 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66cf0233-b5cf-4f1f-abdc-a852279804f4-operator-scripts\") pod \"aodh-db-create-p2l24\" (UID: \"66cf0233-b5cf-4f1f-abdc-a852279804f4\") " pod="openstack/aodh-db-create-p2l24" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.404906 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv8jc\" (UniqueName: \"kubernetes.io/projected/66cf0233-b5cf-4f1f-abdc-a852279804f4-kube-api-access-qv8jc\") pod \"aodh-db-create-p2l24\" (UID: \"66cf0233-b5cf-4f1f-abdc-a852279804f4\") " pod="openstack/aodh-db-create-p2l24" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.406091 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-d7a4-account-create-update-khnhr"] Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.507118 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66cf0233-b5cf-4f1f-abdc-a852279804f4-operator-scripts\") pod \"aodh-db-create-p2l24\" (UID: \"66cf0233-b5cf-4f1f-abdc-a852279804f4\") " pod="openstack/aodh-db-create-p2l24" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.507164 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wzkj\" (UniqueName: \"kubernetes.io/projected/06bcd857-d3bb-4c46-a694-512effef5dd4-kube-api-access-6wzkj\") pod \"aodh-d7a4-account-create-update-khnhr\" (UID: \"06bcd857-d3bb-4c46-a694-512effef5dd4\") " pod="openstack/aodh-d7a4-account-create-update-khnhr" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.507205 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv8jc\" (UniqueName: \"kubernetes.io/projected/66cf0233-b5cf-4f1f-abdc-a852279804f4-kube-api-access-qv8jc\") pod \"aodh-db-create-p2l24\" (UID: \"66cf0233-b5cf-4f1f-abdc-a852279804f4\") " pod="openstack/aodh-db-create-p2l24" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.507224 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06bcd857-d3bb-4c46-a694-512effef5dd4-operator-scripts\") pod \"aodh-d7a4-account-create-update-khnhr\" (UID: \"06bcd857-d3bb-4c46-a694-512effef5dd4\") " pod="openstack/aodh-d7a4-account-create-update-khnhr" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.507905 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66cf0233-b5cf-4f1f-abdc-a852279804f4-operator-scripts\") pod \"aodh-db-create-p2l24\" (UID: \"66cf0233-b5cf-4f1f-abdc-a852279804f4\") " pod="openstack/aodh-db-create-p2l24" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.523233 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv8jc\" (UniqueName: \"kubernetes.io/projected/66cf0233-b5cf-4f1f-abdc-a852279804f4-kube-api-access-qv8jc\") pod \"aodh-db-create-p2l24\" (UID: 
\"66cf0233-b5cf-4f1f-abdc-a852279804f4\") " pod="openstack/aodh-db-create-p2l24" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.608909 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wzkj\" (UniqueName: \"kubernetes.io/projected/06bcd857-d3bb-4c46-a694-512effef5dd4-kube-api-access-6wzkj\") pod \"aodh-d7a4-account-create-update-khnhr\" (UID: \"06bcd857-d3bb-4c46-a694-512effef5dd4\") " pod="openstack/aodh-d7a4-account-create-update-khnhr" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.608972 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06bcd857-d3bb-4c46-a694-512effef5dd4-operator-scripts\") pod \"aodh-d7a4-account-create-update-khnhr\" (UID: \"06bcd857-d3bb-4c46-a694-512effef5dd4\") " pod="openstack/aodh-d7a4-account-create-update-khnhr" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.609876 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06bcd857-d3bb-4c46-a694-512effef5dd4-operator-scripts\") pod \"aodh-d7a4-account-create-update-khnhr\" (UID: \"06bcd857-d3bb-4c46-a694-512effef5dd4\") " pod="openstack/aodh-d7a4-account-create-update-khnhr" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.625810 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-p2l24" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.627228 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wzkj\" (UniqueName: \"kubernetes.io/projected/06bcd857-d3bb-4c46-a694-512effef5dd4-kube-api-access-6wzkj\") pod \"aodh-d7a4-account-create-update-khnhr\" (UID: \"06bcd857-d3bb-4c46-a694-512effef5dd4\") " pod="openstack/aodh-d7a4-account-create-update-khnhr" Dec 05 22:28:24 crc kubenswrapper[4747]: I1205 22:28:24.713977 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-d7a4-account-create-update-khnhr" Dec 05 22:28:25 crc kubenswrapper[4747]: I1205 22:28:25.209923 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-p2l24"] Dec 05 22:28:25 crc kubenswrapper[4747]: W1205 22:28:25.213998 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66cf0233_b5cf_4f1f_abdc_a852279804f4.slice/crio-f7497e2fffbdce628bfa8874a10fcd2dd4fe6b21425e260a8f0e53f8c9795caa WatchSource:0}: Error finding container f7497e2fffbdce628bfa8874a10fcd2dd4fe6b21425e260a8f0e53f8c9795caa: Status 404 returned error can't find the container with id f7497e2fffbdce628bfa8874a10fcd2dd4fe6b21425e260a8f0e53f8c9795caa Dec 05 22:28:25 crc kubenswrapper[4747]: I1205 22:28:25.322539 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-d7a4-account-create-update-khnhr"] Dec 05 22:28:25 crc kubenswrapper[4747]: W1205 22:28:25.331253 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06bcd857_d3bb_4c46_a694_512effef5dd4.slice/crio-2be9b822f354a8774f8c920feb456c2fc8ceab417d862da9a3db7b9c016bfe59 WatchSource:0}: Error finding container 2be9b822f354a8774f8c920feb456c2fc8ceab417d862da9a3db7b9c016bfe59: Status 404 returned error can't find the container with id 2be9b822f354a8774f8c920feb456c2fc8ceab417d862da9a3db7b9c016bfe59 Dec 05 22:28:25 crc kubenswrapper[4747]: I1205 22:28:25.431221 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-d7a4-account-create-update-khnhr" event={"ID":"06bcd857-d3bb-4c46-a694-512effef5dd4","Type":"ContainerStarted","Data":"2be9b822f354a8774f8c920feb456c2fc8ceab417d862da9a3db7b9c016bfe59"} Dec 05 22:28:25 crc kubenswrapper[4747]: I1205 22:28:25.434363 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-p2l24" event={"ID":"66cf0233-b5cf-4f1f-abdc-a852279804f4","Type":"ContainerStarted","Data":"fa5c51acd66ecd9ff96c2e4e735a5c82813ce878804f8de7afc3dca52df932dc"} Dec 05 22:28:25 crc kubenswrapper[4747]: I1205 22:28:25.434391 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-p2l24" event={"ID":"66cf0233-b5cf-4f1f-abdc-a852279804f4","Type":"ContainerStarted","Data":"f7497e2fffbdce628bfa8874a10fcd2dd4fe6b21425e260a8f0e53f8c9795caa"} Dec 05 22:28:25 crc kubenswrapper[4747]: I1205 22:28:25.437254 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"47e6cc39-27de-45da-92cf-2a13effd0974","Type":"ContainerStarted","Data":"cbbf0d6d7e6af556a26855a5f1bed0589352ea367eccebed4e67e38aeb2ebc75"} Dec 05 22:28:25 crc kubenswrapper[4747]: I1205 22:28:25.457466 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-p2l24" podStartSLOduration=1.45743922 podStartE2EDuration="1.45743922s" podCreationTimestamp="2025-12-05 22:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:28:25.447927424 +0000 UTC m=+6375.915234912" watchObservedRunningTime="2025-12-05 22:28:25.45743922 +0000 UTC m=+6375.924746708" Dec 05 22:28:26 crc kubenswrapper[4747]: I1205 22:28:26.446524 4747 generic.go:334] "Generic (PLEG): container finished" podID="66cf0233-b5cf-4f1f-abdc-a852279804f4" containerID="fa5c51acd66ecd9ff96c2e4e735a5c82813ce878804f8de7afc3dca52df932dc" 
exitCode=0 Dec 05 22:28:26 crc kubenswrapper[4747]: I1205 22:28:26.446635 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-p2l24" event={"ID":"66cf0233-b5cf-4f1f-abdc-a852279804f4","Type":"ContainerDied","Data":"fa5c51acd66ecd9ff96c2e4e735a5c82813ce878804f8de7afc3dca52df932dc"} Dec 05 22:28:26 crc kubenswrapper[4747]: I1205 22:28:26.448669 4747 generic.go:334] "Generic (PLEG): container finished" podID="06bcd857-d3bb-4c46-a694-512effef5dd4" containerID="317e3c661021bd1e2e29166b6603e09ead137e6dfe57d94ebb3575e3a8b28b48" exitCode=0 Dec 05 22:28:26 crc kubenswrapper[4747]: I1205 22:28:26.448756 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-d7a4-account-create-update-khnhr" event={"ID":"06bcd857-d3bb-4c46-a694-512effef5dd4","Type":"ContainerDied","Data":"317e3c661021bd1e2e29166b6603e09ead137e6dfe57d94ebb3575e3a8b28b48"} Dec 05 22:28:27 crc kubenswrapper[4747]: I1205 22:28:27.936873 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-d7a4-account-create-update-khnhr" Dec 05 22:28:27 crc kubenswrapper[4747]: I1205 22:28:27.988371 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wzkj\" (UniqueName: \"kubernetes.io/projected/06bcd857-d3bb-4c46-a694-512effef5dd4-kube-api-access-6wzkj\") pod \"06bcd857-d3bb-4c46-a694-512effef5dd4\" (UID: \"06bcd857-d3bb-4c46-a694-512effef5dd4\") " Dec 05 22:28:27 crc kubenswrapper[4747]: I1205 22:28:27.988614 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06bcd857-d3bb-4c46-a694-512effef5dd4-operator-scripts\") pod \"06bcd857-d3bb-4c46-a694-512effef5dd4\" (UID: \"06bcd857-d3bb-4c46-a694-512effef5dd4\") " Dec 05 22:28:27 crc kubenswrapper[4747]: I1205 22:28:27.989046 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06bcd857-d3bb-4c46-a694-512effef5dd4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06bcd857-d3bb-4c46-a694-512effef5dd4" (UID: "06bcd857-d3bb-4c46-a694-512effef5dd4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:28:27 crc kubenswrapper[4747]: I1205 22:28:27.990344 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06bcd857-d3bb-4c46-a694-512effef5dd4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:27 crc kubenswrapper[4747]: I1205 22:28:27.996854 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06bcd857-d3bb-4c46-a694-512effef5dd4-kube-api-access-6wzkj" (OuterVolumeSpecName: "kube-api-access-6wzkj") pod "06bcd857-d3bb-4c46-a694-512effef5dd4" (UID: "06bcd857-d3bb-4c46-a694-512effef5dd4"). InnerVolumeSpecName "kube-api-access-6wzkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:28:28 crc kubenswrapper[4747]: I1205 22:28:28.060735 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-p2l24" Dec 05 22:28:28 crc kubenswrapper[4747]: I1205 22:28:28.091770 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66cf0233-b5cf-4f1f-abdc-a852279804f4-operator-scripts\") pod \"66cf0233-b5cf-4f1f-abdc-a852279804f4\" (UID: \"66cf0233-b5cf-4f1f-abdc-a852279804f4\") " Dec 05 22:28:28 crc kubenswrapper[4747]: I1205 22:28:28.092153 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv8jc\" (UniqueName: \"kubernetes.io/projected/66cf0233-b5cf-4f1f-abdc-a852279804f4-kube-api-access-qv8jc\") pod \"66cf0233-b5cf-4f1f-abdc-a852279804f4\" (UID: \"66cf0233-b5cf-4f1f-abdc-a852279804f4\") " Dec 05 22:28:28 crc kubenswrapper[4747]: I1205 22:28:28.092381 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66cf0233-b5cf-4f1f-abdc-a852279804f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66cf0233-b5cf-4f1f-abdc-a852279804f4" (UID: "66cf0233-b5cf-4f1f-abdc-a852279804f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:28:28 crc kubenswrapper[4747]: I1205 22:28:28.092708 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wzkj\" (UniqueName: \"kubernetes.io/projected/06bcd857-d3bb-4c46-a694-512effef5dd4-kube-api-access-6wzkj\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:28 crc kubenswrapper[4747]: I1205 22:28:28.092726 4747 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66cf0233-b5cf-4f1f-abdc-a852279804f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:28 crc kubenswrapper[4747]: I1205 22:28:28.096006 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66cf0233-b5cf-4f1f-abdc-a852279804f4-kube-api-access-qv8jc" (OuterVolumeSpecName: "kube-api-access-qv8jc") pod "66cf0233-b5cf-4f1f-abdc-a852279804f4" (UID: "66cf0233-b5cf-4f1f-abdc-a852279804f4"). InnerVolumeSpecName "kube-api-access-qv8jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:28:28 crc kubenswrapper[4747]: I1205 22:28:28.195123 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv8jc\" (UniqueName: \"kubernetes.io/projected/66cf0233-b5cf-4f1f-abdc-a852279804f4-kube-api-access-qv8jc\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:28 crc kubenswrapper[4747]: I1205 22:28:28.471730 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-d7a4-account-create-update-khnhr" event={"ID":"06bcd857-d3bb-4c46-a694-512effef5dd4","Type":"ContainerDied","Data":"2be9b822f354a8774f8c920feb456c2fc8ceab417d862da9a3db7b9c016bfe59"} Dec 05 22:28:28 crc kubenswrapper[4747]: I1205 22:28:28.471774 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2be9b822f354a8774f8c920feb456c2fc8ceab417d862da9a3db7b9c016bfe59" Dec 05 22:28:28 crc kubenswrapper[4747]: I1205 22:28:28.472035 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-d7a4-account-create-update-khnhr" Dec 05 22:28:28 crc kubenswrapper[4747]: I1205 22:28:28.473614 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-p2l24" event={"ID":"66cf0233-b5cf-4f1f-abdc-a852279804f4","Type":"ContainerDied","Data":"f7497e2fffbdce628bfa8874a10fcd2dd4fe6b21425e260a8f0e53f8c9795caa"} Dec 05 22:28:28 crc kubenswrapper[4747]: I1205 22:28:28.473640 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7497e2fffbdce628bfa8874a10fcd2dd4fe6b21425e260a8f0e53f8c9795caa" Dec 05 22:28:28 crc kubenswrapper[4747]: I1205 22:28:28.473695 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-p2l24" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.801971 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-qgtp5"] Dec 05 22:28:29 crc kubenswrapper[4747]: E1205 22:28:29.802402 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66cf0233-b5cf-4f1f-abdc-a852279804f4" containerName="mariadb-database-create" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.802416 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="66cf0233-b5cf-4f1f-abdc-a852279804f4" containerName="mariadb-database-create" Dec 05 22:28:29 crc kubenswrapper[4747]: E1205 22:28:29.802432 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06bcd857-d3bb-4c46-a694-512effef5dd4" containerName="mariadb-account-create-update" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.802440 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="06bcd857-d3bb-4c46-a694-512effef5dd4" containerName="mariadb-account-create-update" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.802651 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="66cf0233-b5cf-4f1f-abdc-a852279804f4" containerName="mariadb-database-create" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.802681 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="06bcd857-d3bb-4c46-a694-512effef5dd4" containerName="mariadb-account-create-update" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.803386 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.806042 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.806326 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-l284b" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.806393 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.806578 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.820176 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-qgtp5"] Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.832464 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrnnk\" (UniqueName: \"kubernetes.io/projected/8f270eed-adab-4d18-86c6-6321ad34b6a8-kube-api-access-hrnnk\") pod \"aodh-db-sync-qgtp5\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.832552 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-combined-ca-bundle\") pod \"aodh-db-sync-qgtp5\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.832650 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-config-data\") pod \"aodh-db-sync-qgtp5\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.832929 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-scripts\") pod \"aodh-db-sync-qgtp5\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.934940 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-scripts\") pod \"aodh-db-sync-qgtp5\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.935266 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrnnk\" (UniqueName: \"kubernetes.io/projected/8f270eed-adab-4d18-86c6-6321ad34b6a8-kube-api-access-hrnnk\") pod \"aodh-db-sync-qgtp5\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.935318 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-combined-ca-bundle\") pod \"aodh-db-sync-qgtp5\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:29 crc 
kubenswrapper[4747]: I1205 22:28:29.935354 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-config-data\") pod \"aodh-db-sync-qgtp5\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.939880 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-combined-ca-bundle\") pod \"aodh-db-sync-qgtp5\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.940001 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-config-data\") pod \"aodh-db-sync-qgtp5\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.940659 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-scripts\") pod \"aodh-db-sync-qgtp5\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:29 crc kubenswrapper[4747]: I1205 22:28:29.953166 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrnnk\" (UniqueName: \"kubernetes.io/projected/8f270eed-adab-4d18-86c6-6321ad34b6a8-kube-api-access-hrnnk\") pod \"aodh-db-sync-qgtp5\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:30 crc kubenswrapper[4747]: I1205 22:28:30.124419 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:30 crc kubenswrapper[4747]: I1205 22:28:30.655342 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-qgtp5"] Dec 05 22:28:30 crc kubenswrapper[4747]: W1205 22:28:30.661840 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f270eed_adab_4d18_86c6_6321ad34b6a8.slice/crio-067241ea3281f1418ac5482ccb290ca038c8fa6bc215f94abf1f15e76a923773 WatchSource:0}: Error finding container 067241ea3281f1418ac5482ccb290ca038c8fa6bc215f94abf1f15e76a923773: Status 404 returned error can't find the container with id 067241ea3281f1418ac5482ccb290ca038c8fa6bc215f94abf1f15e76a923773 Dec 05 22:28:31 crc kubenswrapper[4747]: I1205 22:28:31.537291 4747 generic.go:334] "Generic (PLEG): container finished" podID="47e6cc39-27de-45da-92cf-2a13effd0974" containerID="cbbf0d6d7e6af556a26855a5f1bed0589352ea367eccebed4e67e38aeb2ebc75" exitCode=0 Dec 05 22:28:31 crc kubenswrapper[4747]: I1205 22:28:31.537436 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"47e6cc39-27de-45da-92cf-2a13effd0974","Type":"ContainerDied","Data":"cbbf0d6d7e6af556a26855a5f1bed0589352ea367eccebed4e67e38aeb2ebc75"} Dec 05 22:28:31 crc kubenswrapper[4747]: I1205 22:28:31.543862 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qgtp5" event={"ID":"8f270eed-adab-4d18-86c6-6321ad34b6a8","Type":"ContainerStarted","Data":"067241ea3281f1418ac5482ccb290ca038c8fa6bc215f94abf1f15e76a923773"} Dec 05 22:28:32 crc kubenswrapper[4747]: I1205 22:28:32.556503 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"47e6cc39-27de-45da-92cf-2a13effd0974","Type":"ContainerStarted","Data":"f9e8b4a3828d83d8c440b66ab789eef690c4cb8b0953c9cd7279a3e6ec56a973"} Dec 05 22:28:34 crc kubenswrapper[4747]: I1205 22:28:34.234184 4747 scope.go:117] "RemoveContainer" containerID="9ffe59837d62ec8f4aa84d9e2e77ba1a6c99e76f1e1116e3c92c892b26204df1" Dec 05 22:28:34 crc kubenswrapper[4747]: I1205 22:28:34.841814 4747 scope.go:117] "RemoveContainer" containerID="eed5babd274f0abea116e74a4abe9e7044743d3049a79e92554731a28de88584" Dec 05 22:28:35 crc kubenswrapper[4747]: I1205 22:28:35.105556 4747 scope.go:117] "RemoveContainer" containerID="a294e56c662222030f2f29a7c59796902d82ef2a9c4a5fb15e7655f8dfbc548a" Dec 05 22:28:35 crc kubenswrapper[4747]: I1205 22:28:35.289482 4747 scope.go:117] "RemoveContainer" containerID="fef82e6b857efccab3057e2381e13c9cccc7bf0dd1b7846f13cf564007c84274" Dec 05 22:28:35 crc kubenswrapper[4747]: I1205 22:28:35.360026 4747 scope.go:117] "RemoveContainer" containerID="07e7f3cfca0a21fa1397f6d900099523df25a689fec7eb18818130c146287456" Dec 05 22:28:35 crc kubenswrapper[4747]: I1205 22:28:35.391687 4747 scope.go:117] "RemoveContainer" containerID="5780bb4339707268fa0197f40f371dadececeb8c895835934ab6a083469d2fbe" Dec 05 22:28:35 crc kubenswrapper[4747]: I1205 22:28:35.419254 4747 scope.go:117] "RemoveContainer" containerID="6093288aa5dedcd69c55f5aafb9a09e1af0eb72d68aede62c3ada1d066881e24" Dec 05 22:28:35 crc kubenswrapper[4747]: I1205 22:28:35.445551 4747 scope.go:117] "RemoveContainer" containerID="d2aef1d5fbaf49a6cfb48c9312ef3a14a04d8bac9ac9e5add62ab3db2a2f104d" Dec 05 22:28:35 crc kubenswrapper[4747]: I1205 22:28:35.656825 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qgtp5" 
event={"ID":"8f270eed-adab-4d18-86c6-6321ad34b6a8","Type":"ContainerStarted","Data":"17eb2e573028a36c93811fdaa10534b20ccdea666e6cb9a3e938b8d4a895f58f"} Dec 05 22:28:35 crc kubenswrapper[4747]: I1205 22:28:35.691209 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-qgtp5" podStartSLOduration=2.249079641 podStartE2EDuration="6.691185725s" podCreationTimestamp="2025-12-05 22:28:29 +0000 UTC" firstStartedPulling="2025-12-05 22:28:30.664097123 +0000 UTC m=+6381.131404611" lastFinishedPulling="2025-12-05 22:28:35.106203197 +0000 UTC m=+6385.573510695" observedRunningTime="2025-12-05 22:28:35.680048069 +0000 UTC m=+6386.147355577" watchObservedRunningTime="2025-12-05 22:28:35.691185725 +0000 UTC m=+6386.158493233" Dec 05 22:28:36 crc kubenswrapper[4747]: I1205 22:28:36.222201 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:28:36 crc kubenswrapper[4747]: I1205 22:28:36.222271 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:28:36 crc kubenswrapper[4747]: I1205 22:28:36.222332 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 22:28:36 crc kubenswrapper[4747]: I1205 22:28:36.223270 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 22:28:36 crc kubenswrapper[4747]: I1205 22:28:36.223347 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" gracePeriod=600 Dec 05 22:28:36 crc kubenswrapper[4747]: E1205 22:28:36.358481 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:28:36 crc kubenswrapper[4747]: I1205 22:28:36.680128 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" exitCode=0 Dec 05 22:28:36 crc kubenswrapper[4747]: I1205 22:28:36.680178 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" 
event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f"} Dec 05 22:28:36 crc kubenswrapper[4747]: I1205 22:28:36.680263 4747 scope.go:117] "RemoveContainer" containerID="0e963d14d45c9a8e403bf5219ea245778f9eb613a6ce7acf2a23ab111935a921" Dec 05 22:28:36 crc kubenswrapper[4747]: I1205 22:28:36.686111 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:28:36 crc kubenswrapper[4747]: E1205 22:28:36.687297 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:28:37 crc kubenswrapper[4747]: I1205 22:28:37.695165 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"47e6cc39-27de-45da-92cf-2a13effd0974","Type":"ContainerStarted","Data":"319546ee9985a76b448534849e0d154e3d88f3ce8b542f730d9e8b7acc337ec3"} Dec 05 22:28:37 crc kubenswrapper[4747]: I1205 22:28:37.695988 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"47e6cc39-27de-45da-92cf-2a13effd0974","Type":"ContainerStarted","Data":"805b0a7d6d84fc46adfe55fc3d4829faf09d9b1f417a84fc79a74beba05933f3"} Dec 05 22:28:37 crc kubenswrapper[4747]: I1205 22:28:37.697773 4747 generic.go:334] "Generic (PLEG): container finished" podID="8f270eed-adab-4d18-86c6-6321ad34b6a8" containerID="17eb2e573028a36c93811fdaa10534b20ccdea666e6cb9a3e938b8d4a895f58f" exitCode=0 Dec 05 22:28:37 crc kubenswrapper[4747]: I1205 22:28:37.697809 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qgtp5" event={"ID":"8f270eed-adab-4d18-86c6-6321ad34b6a8","Type":"ContainerDied","Data":"17eb2e573028a36c93811fdaa10534b20ccdea666e6cb9a3e938b8d4a895f58f"} Dec 05 22:28:37 crc kubenswrapper[4747]: I1205 22:28:37.748101 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.748077893 podStartE2EDuration="17.748077893s" podCreationTimestamp="2025-12-05 22:28:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:28:37.733078741 +0000 UTC m=+6388.200386259" watchObservedRunningTime="2025-12-05 22:28:37.748077893 +0000 UTC m=+6388.215385401" Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.113378 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.190525 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-combined-ca-bundle\") pod \"8f270eed-adab-4d18-86c6-6321ad34b6a8\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.190761 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-config-data\") pod \"8f270eed-adab-4d18-86c6-6321ad34b6a8\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.190806 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrnnk\" (UniqueName: \"kubernetes.io/projected/8f270eed-adab-4d18-86c6-6321ad34b6a8-kube-api-access-hrnnk\") pod \"8f270eed-adab-4d18-86c6-6321ad34b6a8\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.190836 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-scripts\") pod \"8f270eed-adab-4d18-86c6-6321ad34b6a8\" (UID: \"8f270eed-adab-4d18-86c6-6321ad34b6a8\") " Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.197742 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-scripts" (OuterVolumeSpecName: "scripts") pod "8f270eed-adab-4d18-86c6-6321ad34b6a8" (UID: "8f270eed-adab-4d18-86c6-6321ad34b6a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.203291 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f270eed-adab-4d18-86c6-6321ad34b6a8-kube-api-access-hrnnk" (OuterVolumeSpecName: "kube-api-access-hrnnk") pod "8f270eed-adab-4d18-86c6-6321ad34b6a8" (UID: "8f270eed-adab-4d18-86c6-6321ad34b6a8"). InnerVolumeSpecName "kube-api-access-hrnnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.229137 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-config-data" (OuterVolumeSpecName: "config-data") pod "8f270eed-adab-4d18-86c6-6321ad34b6a8" (UID: "8f270eed-adab-4d18-86c6-6321ad34b6a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.238761 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f270eed-adab-4d18-86c6-6321ad34b6a8" (UID: "8f270eed-adab-4d18-86c6-6321ad34b6a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.293287 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.293540 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.293551 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrnnk\" (UniqueName: \"kubernetes.io/projected/8f270eed-adab-4d18-86c6-6321ad34b6a8-kube-api-access-hrnnk\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.293561 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f270eed-adab-4d18-86c6-6321ad34b6a8-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.721337 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qgtp5" event={"ID":"8f270eed-adab-4d18-86c6-6321ad34b6a8","Type":"ContainerDied","Data":"067241ea3281f1418ac5482ccb290ca038c8fa6bc215f94abf1f15e76a923773"} Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.721375 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="067241ea3281f1418ac5482ccb290ca038c8fa6bc215f94abf1f15e76a923773" Dec 05 22:28:39 crc kubenswrapper[4747]: I1205 22:28:39.721435 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-qgtp5" Dec 05 22:28:40 crc kubenswrapper[4747]: I1205 22:28:40.783237 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.390934 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 05 22:28:44 crc kubenswrapper[4747]: E1205 22:28:44.392080 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f270eed-adab-4d18-86c6-6321ad34b6a8" containerName="aodh-db-sync" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.392098 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f270eed-adab-4d18-86c6-6321ad34b6a8" containerName="aodh-db-sync" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.392279 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f270eed-adab-4d18-86c6-6321ad34b6a8" containerName="aodh-db-sync" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.394178 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.403814 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.404278 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-l284b" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.404352 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.435325 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.441525 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " pod="openstack/aodh-0" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.441596 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-config-data\") pod \"aodh-0\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " pod="openstack/aodh-0" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.441649 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-scripts\") pod \"aodh-0\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " pod="openstack/aodh-0" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.441673 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hpvw\" (UniqueName: \"kubernetes.io/projected/e253f491-ef2a-497f-8ef8-df5430ceed54-kube-api-access-2hpvw\") pod \"aodh-0\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " pod="openstack/aodh-0" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.544144 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " pod="openstack/aodh-0" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.544198 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-config-data\") pod \"aodh-0\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " pod="openstack/aodh-0" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.544241 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-scripts\") pod \"aodh-0\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " pod="openstack/aodh-0" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.544264 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hpvw\" (UniqueName: \"kubernetes.io/projected/e253f491-ef2a-497f-8ef8-df5430ceed54-kube-api-access-2hpvw\") pod \"aodh-0\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " pod="openstack/aodh-0" Dec 05 22:28:44 crc kubenswrapper[4747]: 
I1205 22:28:44.550318 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-scripts\") pod \"aodh-0\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " pod="openstack/aodh-0" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.551106 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " pod="openstack/aodh-0" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.552048 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-config-data\") pod \"aodh-0\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " pod="openstack/aodh-0" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.562937 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hpvw\" (UniqueName: \"kubernetes.io/projected/e253f491-ef2a-497f-8ef8-df5430ceed54-kube-api-access-2hpvw\") pod \"aodh-0\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " pod="openstack/aodh-0" Dec 05 22:28:44 crc kubenswrapper[4747]: I1205 22:28:44.744806 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 05 22:28:45 crc kubenswrapper[4747]: I1205 22:28:45.166055 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 22:28:45 crc kubenswrapper[4747]: I1205 22:28:45.303154 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 05 22:28:45 crc kubenswrapper[4747]: I1205 22:28:45.809660 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e253f491-ef2a-497f-8ef8-df5430ceed54","Type":"ContainerStarted","Data":"9ee5e93ff0e6e89f7abd0e8ad1b69a14b21c505b50a5c11c65635f0ea45708c8"} Dec 05 22:28:46 crc kubenswrapper[4747]: I1205 22:28:46.639691 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 22:28:46 crc kubenswrapper[4747]: I1205 22:28:46.641049 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="ceilometer-central-agent" containerID="cri-o://31ca6bf0c671a3984d051d0102a42ebe11caf70a75588341bce22096847d45cc" gracePeriod=30 Dec 05 22:28:46 crc kubenswrapper[4747]: I1205 22:28:46.641113 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="ceilometer-notification-agent" containerID="cri-o://413b050dfc7e88e02ef9e5ef5a72460cf029183266197b64d42ff005a2497046" gracePeriod=30 Dec 05 22:28:46 crc kubenswrapper[4747]: I1205 22:28:46.641133 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="sg-core" containerID="cri-o://db32fe34165a8705333a1b5f579cd2f05d537b6b6533e095ec31ed6bda61093c" gracePeriod=30 Dec 05 22:28:46 crc kubenswrapper[4747]: I1205 22:28:46.641188 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="proxy-httpd" 
containerID="cri-o://bd6c97deb317300a0e181250eae03e0a92ff995c66ae54c219057de20a9f005e" gracePeriod=30 Dec 05 22:28:46 crc kubenswrapper[4747]: I1205 22:28:46.837033 4747 generic.go:334] "Generic (PLEG): container finished" podID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerID="bd6c97deb317300a0e181250eae03e0a92ff995c66ae54c219057de20a9f005e" exitCode=0 Dec 05 22:28:46 crc kubenswrapper[4747]: I1205 22:28:46.837071 4747 generic.go:334] "Generic (PLEG): container finished" podID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerID="db32fe34165a8705333a1b5f579cd2f05d537b6b6533e095ec31ed6bda61093c" exitCode=2 Dec 05 22:28:46 crc kubenswrapper[4747]: I1205 22:28:46.837080 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1","Type":"ContainerDied","Data":"bd6c97deb317300a0e181250eae03e0a92ff995c66ae54c219057de20a9f005e"} Dec 05 22:28:46 crc kubenswrapper[4747]: I1205 22:28:46.837145 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1","Type":"ContainerDied","Data":"db32fe34165a8705333a1b5f579cd2f05d537b6b6533e095ec31ed6bda61093c"} Dec 05 22:28:46 crc kubenswrapper[4747]: I1205 22:28:46.839422 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e253f491-ef2a-497f-8ef8-df5430ceed54","Type":"ContainerStarted","Data":"ef1340dd16291d83a5c7396f616cac259a1936b1c01156081d5548c42e5ff21e"} Dec 05 22:28:47 crc kubenswrapper[4747]: I1205 22:28:47.194282 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 05 22:28:47 crc kubenswrapper[4747]: I1205 22:28:47.853987 4747 generic.go:334] "Generic (PLEG): container finished" podID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerID="31ca6bf0c671a3984d051d0102a42ebe11caf70a75588341bce22096847d45cc" exitCode=0 Dec 05 22:28:47 crc kubenswrapper[4747]: I1205 22:28:47.854246 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1","Type":"ContainerDied","Data":"31ca6bf0c671a3984d051d0102a42ebe11caf70a75588341bce22096847d45cc"} Dec 05 22:28:47 crc kubenswrapper[4747]: I1205 22:28:47.857235 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e253f491-ef2a-497f-8ef8-df5430ceed54","Type":"ContainerStarted","Data":"285ddf6116465e20b0c7bd82ce6e77ae914f4b65dab5b80da0e3e91378d0ba43"} Dec 05 22:28:48 crc kubenswrapper[4747]: I1205 22:28:48.866992 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e253f491-ef2a-497f-8ef8-df5430ceed54","Type":"ContainerStarted","Data":"1664c855fe070884813f748436aec22a812c5fe17cc81e903b4dbf4af4f00f13"} Dec 05 22:28:50 crc kubenswrapper[4747]: I1205 22:28:50.783463 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:50 crc kubenswrapper[4747]: I1205 22:28:50.808062 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:50 crc kubenswrapper[4747]: I1205 22:28:50.839814 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:28:50 crc kubenswrapper[4747]: E1205 22:28:50.840143 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:28:50 crc kubenswrapper[4747]: I1205 22:28:50.941895 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.615135 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.735885 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-log-httpd\") pod \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.736225 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8qbc\" (UniqueName: \"kubernetes.io/projected/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-kube-api-access-v8qbc\") pod \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.736256 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-sg-core-conf-yaml\") pod \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.736295 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-run-httpd\") pod \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.736318 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" (UID: "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.736450 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-config-data\") pod \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.736476 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-scripts\") pod \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.736504 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-combined-ca-bundle\") pod \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.736843 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" (UID: "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.737077 4747 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.737089 4747 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.741775 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-scripts" (OuterVolumeSpecName: "scripts") pod "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" (UID: "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.749734 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-kube-api-access-v8qbc" (OuterVolumeSpecName: "kube-api-access-v8qbc") pod "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" (UID: "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1"). InnerVolumeSpecName "kube-api-access-v8qbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.780666 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" (UID: "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.837927 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" (UID: "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.847791 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-combined-ca-bundle\") pod \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\" (UID: \"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1\") " Dec 05 22:28:51 crc kubenswrapper[4747]: W1205 22:28:51.847898 4747 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1/volumes/kubernetes.io~secret/combined-ca-bundle Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.847921 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" (UID: "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.848656 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8qbc\" (UniqueName: \"kubernetes.io/projected/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-kube-api-access-v8qbc\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.848681 4747 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.848694 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.848704 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.869133 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-config-data" (OuterVolumeSpecName: "config-data") pod "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" (UID: "d8c94cb0-78e0-4ae9-8cef-46ca05644ca1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.931143 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e253f491-ef2a-497f-8ef8-df5430ceed54","Type":"ContainerStarted","Data":"7fb5a982607debcdc7aef3df15135b63ec5ecf2c8e5fcb4207918a8f162c85e1"} Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.931268 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-api" containerID="cri-o://ef1340dd16291d83a5c7396f616cac259a1936b1c01156081d5548c42e5ff21e" gracePeriod=30 Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.931310 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-listener" containerID="cri-o://7fb5a982607debcdc7aef3df15135b63ec5ecf2c8e5fcb4207918a8f162c85e1" gracePeriod=30 Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.931355 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-notifier" containerID="cri-o://1664c855fe070884813f748436aec22a812c5fe17cc81e903b4dbf4af4f00f13" gracePeriod=30 Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.931485 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-evaluator" containerID="cri-o://285ddf6116465e20b0c7bd82ce6e77ae914f4b65dab5b80da0e3e91378d0ba43" gracePeriod=30 Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.934964 4747 generic.go:334] "Generic (PLEG): container finished" podID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerID="413b050dfc7e88e02ef9e5ef5a72460cf029183266197b64d42ff005a2497046" exitCode=0 Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.936038 4747 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.934964 4747 generic.go:334] "Generic (PLEG): container finished" podID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerID="413b050dfc7e88e02ef9e5ef5a72460cf029183266197b64d42ff005a2497046" exitCode=0
Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.936038 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.936973 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1","Type":"ContainerDied","Data":"413b050dfc7e88e02ef9e5ef5a72460cf029183266197b64d42ff005a2497046"}
Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.937044 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8c94cb0-78e0-4ae9-8cef-46ca05644ca1","Type":"ContainerDied","Data":"f4b949ca6bbff78f3cf4a22d29a6f78384682a73944ae4984ea8ca073ece53cd"}
Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.937201 4747 scope.go:117] "RemoveContainer" containerID="bd6c97deb317300a0e181250eae03e0a92ff995c66ae54c219057de20a9f005e"
Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.955453 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 22:28:51 crc kubenswrapper[4747]: I1205 22:28:51.973837 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.281227511 podStartE2EDuration="7.973810336s" podCreationTimestamp="2025-12-05 22:28:44 +0000 UTC" firstStartedPulling="2025-12-05 22:28:45.290972328 +0000 UTC m=+6395.758279816" lastFinishedPulling="2025-12-05 22:28:50.983555153 +0000 UTC m=+6401.450862641" observedRunningTime="2025-12-05 22:28:51.953894412 +0000 UTC m=+6402.421201890" watchObservedRunningTime="2025-12-05 22:28:51.973810336 +0000 UTC m=+6402.441117854"
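The startup-latency numbers above are internally consistent: podStartSLOduration (2.281227511s) equals the end-to-end duration minus the time spent pulling images. A quick check against the timestamps printed in the entry itself:

```go
package main

import (
	"fmt"
	"time"
)

// Verify the aodh-0 startup-latency arithmetic from the log entry above:
// SLO duration = (watchObservedRunningTime - podCreationTimestamp) - pull time.
func main() {
	parse := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2025-12-05 22:28:44 +0000 UTC")
	firstPull := parse("2025-12-05 22:28:45.290972328 +0000 UTC")
	lastPull := parse("2025-12-05 22:28:50.983555153 +0000 UTC")
	running := parse("2025-12-05 22:28:51.973810336 +0000 UTC")

	e2e := running.Sub(created)     // 7.973810336s, as logged
	pull := lastPull.Sub(firstPull) // 5.692582825s of image pulling
	fmt.Println("E2E:", e2e, "pull:", pull, "SLO:", e2e-pull) // SLO: 2.281227511s
}
```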
Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.012009 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.015300 4747 scope.go:117] "RemoveContainer" containerID="db32fe34165a8705333a1b5f579cd2f05d537b6b6533e095ec31ed6bda61093c"
Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.028225 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.044439 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 05 22:28:52 crc kubenswrapper[4747]: E1205 22:28:52.044956 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="proxy-httpd"
Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.044974 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="proxy-httpd"
Dec 05 22:28:52 crc kubenswrapper[4747]: E1205 22:28:52.044999 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="ceilometer-notification-agent"
Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.045006 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="ceilometer-notification-agent"
Dec 05 22:28:52 crc kubenswrapper[4747]: E1205 22:28:52.045028 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="ceilometer-central-agent"
Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.045033 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="ceilometer-central-agent"
Dec 05 22:28:52 crc kubenswrapper[4747]: E1205 22:28:52.045044 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="sg-core"
Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.045050 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="sg-core"
Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.045234 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="proxy-httpd"
Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.045249 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="ceilometer-notification-agent"
Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.045263 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="ceilometer-central-agent"
Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.045275 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" containerName="sg-core"
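The cpu_manager/state_mem/memory_manager burst above removes per-container resource-manager state left behind by the deleted ceilometer-0 pod before its replacement is admitted. A toy version of such a stale-state sweep (keys and values here are illustrative, not the managers' real state layout):

```go
package main

import "fmt"

// Illustrative stale-state sweep, like the RemoveStaleState entries above:
// assignments keyed by (podUID, container) are dropped when the pod is no
// longer in the active set.
type key struct{ podUID, container string }

func removeStaleState(state map[key]string, active map[string]bool) {
	for k := range state {
		if !active[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %q of pod %s\n", k.container, k.podUID)
			delete(state, k) // deleting during range is safe in Go
		}
	}
}

func main() {
	state := map[key]string{
		{"d8c94cb0", "proxy-httpd"}: "cpuset 0-1",
		{"e253f491", "aodh-api"}:    "cpuset 2-3",
	}
	removeStaleState(state, map[string]bool{"e253f491": true})
	fmt.Println("remaining:", state)
}
```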
containerID="db32fe34165a8705333a1b5f579cd2f05d537b6b6533e095ec31ed6bda61093c" Dec 05 22:28:52 crc kubenswrapper[4747]: E1205 22:28:52.144797 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db32fe34165a8705333a1b5f579cd2f05d537b6b6533e095ec31ed6bda61093c\": container with ID starting with db32fe34165a8705333a1b5f579cd2f05d537b6b6533e095ec31ed6bda61093c not found: ID does not exist" containerID="db32fe34165a8705333a1b5f579cd2f05d537b6b6533e095ec31ed6bda61093c" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.144816 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db32fe34165a8705333a1b5f579cd2f05d537b6b6533e095ec31ed6bda61093c"} err="failed to get container status \"db32fe34165a8705333a1b5f579cd2f05d537b6b6533e095ec31ed6bda61093c\": rpc error: code = NotFound desc = could not find container \"db32fe34165a8705333a1b5f579cd2f05d537b6b6533e095ec31ed6bda61093c\": container with ID starting with db32fe34165a8705333a1b5f579cd2f05d537b6b6533e095ec31ed6bda61093c not found: ID does not exist" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.144828 4747 scope.go:117] "RemoveContainer" containerID="413b050dfc7e88e02ef9e5ef5a72460cf029183266197b64d42ff005a2497046" Dec 05 22:28:52 crc kubenswrapper[4747]: E1205 22:28:52.144978 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"413b050dfc7e88e02ef9e5ef5a72460cf029183266197b64d42ff005a2497046\": container with ID starting with 413b050dfc7e88e02ef9e5ef5a72460cf029183266197b64d42ff005a2497046 not found: ID does not exist" containerID="413b050dfc7e88e02ef9e5ef5a72460cf029183266197b64d42ff005a2497046" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.144997 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"413b050dfc7e88e02ef9e5ef5a72460cf029183266197b64d42ff005a2497046"} err="failed to get container status \"413b050dfc7e88e02ef9e5ef5a72460cf029183266197b64d42ff005a2497046\": rpc error: code = NotFound desc = could not find container \"413b050dfc7e88e02ef9e5ef5a72460cf029183266197b64d42ff005a2497046\": container with ID starting with 413b050dfc7e88e02ef9e5ef5a72460cf029183266197b64d42ff005a2497046 not found: ID does not exist" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.145009 4747 scope.go:117] "RemoveContainer" containerID="31ca6bf0c671a3984d051d0102a42ebe11caf70a75588341bce22096847d45cc" Dec 05 22:28:52 crc kubenswrapper[4747]: E1205 22:28:52.145145 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31ca6bf0c671a3984d051d0102a42ebe11caf70a75588341bce22096847d45cc\": container with ID starting with 31ca6bf0c671a3984d051d0102a42ebe11caf70a75588341bce22096847d45cc not found: ID does not exist" containerID="31ca6bf0c671a3984d051d0102a42ebe11caf70a75588341bce22096847d45cc" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.145163 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ca6bf0c671a3984d051d0102a42ebe11caf70a75588341bce22096847d45cc"} err="failed to get container status \"31ca6bf0c671a3984d051d0102a42ebe11caf70a75588341bce22096847d45cc\": rpc error: code = NotFound desc = could not find container \"31ca6bf0c671a3984d051d0102a42ebe11caf70a75588341bce22096847d45cc\": container with ID starting with 
31ca6bf0c671a3984d051d0102a42ebe11caf70a75588341bce22096847d45cc not found: ID does not exist" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.158425 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-config-data\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.158493 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa00771-6f36-432d-b2a7-566f34c88db1-run-httpd\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.158706 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.158833 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa00771-6f36-432d-b2a7-566f34c88db1-log-httpd\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.158887 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trrx2\" (UniqueName: \"kubernetes.io/projected/1aa00771-6f36-432d-b2a7-566f34c88db1-kube-api-access-trrx2\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.158998 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.159077 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-scripts\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.260627 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.260694 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-scripts\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.260747 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
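The RemoveContainer / NotFound pairs above are benign: the containers were already purged along with the pod, and a deletion that finds nothing to delete can be treated as having succeeded. The usual idempotent-delete pattern, sketched with a stand-in error rather than the real gRPC NotFound code:

```go
package main

import (
	"errors"
	"fmt"
)

// Stand-in for the "rpc error: code = NotFound" seen in the log.
var errNotFound = errors.New("container not found")

func removeContainer(id string, containers map[string]bool) error {
	if !containers[id] {
		return fmt.Errorf("could not find container %q: %w", id, errNotFound)
	}
	delete(containers, id)
	return nil
}

func main() {
	containers := map[string]bool{} // already empty: container was purged
	err := removeContainer("bd6c97deb317300a0e181250eae03e0a92ff995c66ae54c219057de20a9f005e", containers)
	if errors.Is(err, errNotFound) {
		fmt.Println("already removed, treating as success:", err)
		err = nil
	}
	fmt.Println("final err:", err)
}
```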
\"config-data\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-config-data\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.260775 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa00771-6f36-432d-b2a7-566f34c88db1-run-httpd\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.260816 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.260853 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa00771-6f36-432d-b2a7-566f34c88db1-log-httpd\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.260878 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trrx2\" (UniqueName: \"kubernetes.io/projected/1aa00771-6f36-432d-b2a7-566f34c88db1-kube-api-access-trrx2\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.261993 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa00771-6f36-432d-b2a7-566f34c88db1-run-httpd\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.262220 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa00771-6f36-432d-b2a7-566f34c88db1-log-httpd\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.267424 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-scripts\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.267821 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-config-data\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.270455 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.270524 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.278690 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trrx2\" (UniqueName: \"kubernetes.io/projected/1aa00771-6f36-432d-b2a7-566f34c88db1-kube-api-access-trrx2\") pod \"ceilometer-0\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") " pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.399444 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.894475 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 22:28:52 crc kubenswrapper[4747]: W1205 22:28:52.906641 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aa00771_6f36_432d_b2a7_566f34c88db1.slice/crio-5ba1374ccb1257aed3acde11ab13cc086485a962a76d651487a9361df8d5f7cb WatchSource:0}: Error finding container 5ba1374ccb1257aed3acde11ab13cc086485a962a76d651487a9361df8d5f7cb: Status 404 returned error can't find the container with id 5ba1374ccb1257aed3acde11ab13cc086485a962a76d651487a9361df8d5f7cb Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.947112 4747 generic.go:334] "Generic (PLEG): container finished" podID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerID="285ddf6116465e20b0c7bd82ce6e77ae914f4b65dab5b80da0e3e91378d0ba43" exitCode=0 Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.947139 4747 generic.go:334] "Generic (PLEG): container finished" podID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerID="ef1340dd16291d83a5c7396f616cac259a1936b1c01156081d5548c42e5ff21e" exitCode=0 Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.947190 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e253f491-ef2a-497f-8ef8-df5430ceed54","Type":"ContainerDied","Data":"285ddf6116465e20b0c7bd82ce6e77ae914f4b65dab5b80da0e3e91378d0ba43"} Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.947235 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e253f491-ef2a-497f-8ef8-df5430ceed54","Type":"ContainerDied","Data":"ef1340dd16291d83a5c7396f616cac259a1936b1c01156081d5548c42e5ff21e"} Dec 05 22:28:52 crc kubenswrapper[4747]: I1205 22:28:52.948877 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1aa00771-6f36-432d-b2a7-566f34c88db1","Type":"ContainerStarted","Data":"5ba1374ccb1257aed3acde11ab13cc086485a962a76d651487a9361df8d5f7cb"} Dec 05 22:28:53 crc kubenswrapper[4747]: I1205 22:28:53.859958 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c94cb0-78e0-4ae9-8cef-46ca05644ca1" path="/var/lib/kubelet/pods/d8c94cb0-78e0-4ae9-8cef-46ca05644ca1/volumes" Dec 05 22:28:53 crc kubenswrapper[4747]: I1205 22:28:53.959781 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1aa00771-6f36-432d-b2a7-566f34c88db1","Type":"ContainerStarted","Data":"dd6314d9517deddf7d71af999e2495692da40e001959986b81b3f015204406fe"} Dec 05 22:28:54 crc kubenswrapper[4747]: I1205 22:28:54.970344 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1aa00771-6f36-432d-b2a7-566f34c88db1","Type":"ContainerStarted","Data":"71f3bdd72d23dad50170dd572c1782b90f030dfae3837d1bce41b732810ac219"} Dec 05 22:28:54 crc kubenswrapper[4747]: I1205 22:28:54.971495 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1aa00771-6f36-432d-b2a7-566f34c88db1","Type":"ContainerStarted","Data":"4b4dc86680826ff0d34c3ade7badd6745fb8b15ecb715300df999b0a48e186c7"} Dec 05 22:28:55 crc kubenswrapper[4747]: I1205 22:28:55.089937 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 22:28:55 crc kubenswrapper[4747]: I1205 22:28:55.090149 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c2cc595b-3fc7-4d0b-9faa-399d45d73efc" containerName="kube-state-metrics" containerID="cri-o://2bf7fc491e68005617f8aa12f650d4a9e207d0e2e187f8682adf44a3e729a61a" gracePeriod=30 Dec 05 22:28:55 crc kubenswrapper[4747]: I1205 22:28:55.641285 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 22:28:55 crc kubenswrapper[4747]: I1205 22:28:55.732636 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll2pp\" (UniqueName: \"kubernetes.io/projected/c2cc595b-3fc7-4d0b-9faa-399d45d73efc-kube-api-access-ll2pp\") pod \"c2cc595b-3fc7-4d0b-9faa-399d45d73efc\" (UID: \"c2cc595b-3fc7-4d0b-9faa-399d45d73efc\") " Dec 05 22:28:55 crc kubenswrapper[4747]: I1205 22:28:55.751887 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2cc595b-3fc7-4d0b-9faa-399d45d73efc-kube-api-access-ll2pp" (OuterVolumeSpecName: "kube-api-access-ll2pp") pod "c2cc595b-3fc7-4d0b-9faa-399d45d73efc" (UID: "c2cc595b-3fc7-4d0b-9faa-399d45d73efc"). InnerVolumeSpecName "kube-api-access-ll2pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:28:55 crc kubenswrapper[4747]: I1205 22:28:55.835568 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll2pp\" (UniqueName: \"kubernetes.io/projected/c2cc595b-3fc7-4d0b-9faa-399d45d73efc-kube-api-access-ll2pp\") on node \"crc\" DevicePath \"\"" Dec 05 22:28:55 crc kubenswrapper[4747]: I1205 22:28:55.981027 4747 generic.go:334] "Generic (PLEG): container finished" podID="c2cc595b-3fc7-4d0b-9faa-399d45d73efc" containerID="2bf7fc491e68005617f8aa12f650d4a9e207d0e2e187f8682adf44a3e729a61a" exitCode=2 Dec 05 22:28:55 crc kubenswrapper[4747]: I1205 22:28:55.981118 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c2cc595b-3fc7-4d0b-9faa-399d45d73efc","Type":"ContainerDied","Data":"2bf7fc491e68005617f8aa12f650d4a9e207d0e2e187f8682adf44a3e729a61a"} Dec 05 22:28:55 crc kubenswrapper[4747]: I1205 22:28:55.981144 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c2cc595b-3fc7-4d0b-9faa-399d45d73efc","Type":"ContainerDied","Data":"63d8875157247f63774b7863338001e6cf89025e02b948d94090f2b73499536b"} Dec 05 22:28:55 crc kubenswrapper[4747]: I1205 22:28:55.981177 4747 scope.go:117] "RemoveContainer" containerID="2bf7fc491e68005617f8aa12f650d4a9e207d0e2e187f8682adf44a3e729a61a" Dec 05 22:28:55 crc kubenswrapper[4747]: I1205 22:28:55.981562 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.044936 4747 scope.go:117] "RemoveContainer" containerID="2bf7fc491e68005617f8aa12f650d4a9e207d0e2e187f8682adf44a3e729a61a" Dec 05 22:28:56 crc kubenswrapper[4747]: E1205 22:28:56.045973 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bf7fc491e68005617f8aa12f650d4a9e207d0e2e187f8682adf44a3e729a61a\": container with ID starting with 2bf7fc491e68005617f8aa12f650d4a9e207d0e2e187f8682adf44a3e729a61a not found: ID does not exist" containerID="2bf7fc491e68005617f8aa12f650d4a9e207d0e2e187f8682adf44a3e729a61a" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.046025 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bf7fc491e68005617f8aa12f650d4a9e207d0e2e187f8682adf44a3e729a61a"} err="failed to get container status \"2bf7fc491e68005617f8aa12f650d4a9e207d0e2e187f8682adf44a3e729a61a\": rpc error: code = NotFound desc = could not find container \"2bf7fc491e68005617f8aa12f650d4a9e207d0e2e187f8682adf44a3e729a61a\": container with ID starting with 2bf7fc491e68005617f8aa12f650d4a9e207d0e2e187f8682adf44a3e729a61a not found: ID does not exist" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.051056 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.067526 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.085382 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 22:28:56 crc kubenswrapper[4747]: E1205 22:28:56.119956 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cc595b-3fc7-4d0b-9faa-399d45d73efc" containerName="kube-state-metrics" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.120221 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cc595b-3fc7-4d0b-9faa-399d45d73efc" containerName="kube-state-metrics" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.120976 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2cc595b-3fc7-4d0b-9faa-399d45d73efc" containerName="kube-state-metrics" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.122054 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.124272 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.124861 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.158381 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.247812 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62wfp\" (UniqueName: \"kubernetes.io/projected/1bb68bb7-3d25-45fd-bed9-8abf52f38a1f-kube-api-access-62wfp\") pod \"kube-state-metrics-0\" (UID: \"1bb68bb7-3d25-45fd-bed9-8abf52f38a1f\") " pod="openstack/kube-state-metrics-0" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.247868 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1bb68bb7-3d25-45fd-bed9-8abf52f38a1f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1bb68bb7-3d25-45fd-bed9-8abf52f38a1f\") " pod="openstack/kube-state-metrics-0" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.247898 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb68bb7-3d25-45fd-bed9-8abf52f38a1f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1bb68bb7-3d25-45fd-bed9-8abf52f38a1f\") " pod="openstack/kube-state-metrics-0" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.248020 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb68bb7-3d25-45fd-bed9-8abf52f38a1f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1bb68bb7-3d25-45fd-bed9-8abf52f38a1f\") " pod="openstack/kube-state-metrics-0" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.349820 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62wfp\" (UniqueName: \"kubernetes.io/projected/1bb68bb7-3d25-45fd-bed9-8abf52f38a1f-kube-api-access-62wfp\") pod \"kube-state-metrics-0\" (UID: \"1bb68bb7-3d25-45fd-bed9-8abf52f38a1f\") " pod="openstack/kube-state-metrics-0" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.349868 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1bb68bb7-3d25-45fd-bed9-8abf52f38a1f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1bb68bb7-3d25-45fd-bed9-8abf52f38a1f\") " pod="openstack/kube-state-metrics-0" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.349894 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb68bb7-3d25-45fd-bed9-8abf52f38a1f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1bb68bb7-3d25-45fd-bed9-8abf52f38a1f\") " pod="openstack/kube-state-metrics-0" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.349973 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
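The reflector.go "Caches populated" entries above mark the point where the kubelet's local copy of each Secret is ready to serve the new pod's volume mounts. The underlying list-then-watch idea, as a toy (this is not client-go):

```go
package main

import "fmt"

// A reflector lists all objects once to fill a local cache, then applies
// watch deltas to keep it current. Toy event stream below.
type event struct {
	verb string // "ADDED", "MODIFIED", "DELETED"
	name string
	data string
}

func populate(list map[string]string, watch []event) map[string]string {
	cache := map[string]string{}
	for k, v := range list { // initial LIST
		cache[k] = v
	}
	for _, e := range watch { // subsequent WATCH stream
		switch e.verb {
		case "ADDED", "MODIFIED":
			cache[e.name] = e.data
		case "DELETED":
			delete(cache, e.name)
		}
	}
	return cache
}

func main() {
	cache := populate(
		map[string]string{"cert-kube-state-metrics-svc": "v1", "kube-state-metrics-tls-config": "v1"},
		[]event{{"MODIFIED", "kube-state-metrics-tls-config", "v2"}},
	)
	fmt.Println(cache) // caches populated
}
```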
(UniqueName: \"kubernetes.io/secret/1bb68bb7-3d25-45fd-bed9-8abf52f38a1f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1bb68bb7-3d25-45fd-bed9-8abf52f38a1f\") " pod="openstack/kube-state-metrics-0" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.353968 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1bb68bb7-3d25-45fd-bed9-8abf52f38a1f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1bb68bb7-3d25-45fd-bed9-8abf52f38a1f\") " pod="openstack/kube-state-metrics-0" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.354152 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb68bb7-3d25-45fd-bed9-8abf52f38a1f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1bb68bb7-3d25-45fd-bed9-8abf52f38a1f\") " pod="openstack/kube-state-metrics-0" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.360134 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb68bb7-3d25-45fd-bed9-8abf52f38a1f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1bb68bb7-3d25-45fd-bed9-8abf52f38a1f\") " pod="openstack/kube-state-metrics-0" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.366231 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62wfp\" (UniqueName: \"kubernetes.io/projected/1bb68bb7-3d25-45fd-bed9-8abf52f38a1f-kube-api-access-62wfp\") pod \"kube-state-metrics-0\" (UID: \"1bb68bb7-3d25-45fd-bed9-8abf52f38a1f\") " pod="openstack/kube-state-metrics-0" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.452251 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.942075 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 22:28:56 crc kubenswrapper[4747]: I1205 22:28:56.997973 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1aa00771-6f36-432d-b2a7-566f34c88db1","Type":"ContainerStarted","Data":"8a8372ce7923a101a7752ab696f027cd812edfc6613b21d29eaf0dfc285a0554"} Dec 05 22:28:57 crc kubenswrapper[4747]: I1205 22:28:57.000989 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1bb68bb7-3d25-45fd-bed9-8abf52f38a1f","Type":"ContainerStarted","Data":"ebbb569f2d6ba6f92f010abb6024489555dbc6f29559f4a4bb1adc3c51c32a77"} Dec 05 22:28:57 crc kubenswrapper[4747]: I1205 22:28:57.027194 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.979711472 podStartE2EDuration="6.0271781s" podCreationTimestamp="2025-12-05 22:28:51 +0000 UTC" firstStartedPulling="2025-12-05 22:28:52.909795314 +0000 UTC m=+6403.377102802" lastFinishedPulling="2025-12-05 22:28:55.957261942 +0000 UTC m=+6406.424569430" observedRunningTime="2025-12-05 22:28:57.022941745 +0000 UTC m=+6407.490249243" watchObservedRunningTime="2025-12-05 22:28:57.0271781 +0000 UTC m=+6407.494485598" Dec 05 22:28:57 crc kubenswrapper[4747]: I1205 22:28:57.107489 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 22:28:57 crc kubenswrapper[4747]: I1205 22:28:57.854752 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2cc595b-3fc7-4d0b-9faa-399d45d73efc" path="/var/lib/kubelet/pods/c2cc595b-3fc7-4d0b-9faa-399d45d73efc/volumes" Dec 05 22:28:58 crc kubenswrapper[4747]: I1205 22:28:58.010992 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1bb68bb7-3d25-45fd-bed9-8abf52f38a1f","Type":"ContainerStarted","Data":"70a1482b9dfa3a6d691d5661d55e3d091c4eae56fd2503bcc85a8f441b34efba"} Dec 05 22:28:58 crc kubenswrapper[4747]: I1205 22:28:58.012279 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 22:28:58 crc kubenswrapper[4747]: I1205 22:28:58.032411 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.641994408 podStartE2EDuration="2.032392512s" podCreationTimestamp="2025-12-05 22:28:56 +0000 UTC" firstStartedPulling="2025-12-05 22:28:56.95293361 +0000 UTC m=+6407.420241098" lastFinishedPulling="2025-12-05 22:28:57.343331724 +0000 UTC m=+6407.810639202" observedRunningTime="2025-12-05 22:28:58.026918927 +0000 UTC m=+6408.494226405" watchObservedRunningTime="2025-12-05 22:28:58.032392512 +0000 UTC m=+6408.499700000" Dec 05 22:28:59 crc kubenswrapper[4747]: I1205 22:28:59.018661 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 22:28:59 crc kubenswrapper[4747]: I1205 22:28:59.019141 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="ceilometer-central-agent" containerID="cri-o://dd6314d9517deddf7d71af999e2495692da40e001959986b81b3f015204406fe" gracePeriod=30 Dec 05 22:28:59 crc kubenswrapper[4747]: I1205 22:28:59.019216 4747 kuberuntime_container.go:808] "Killing container 
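Both readiness-probe entries above log status="": probe results start out unknown until the first probe of a replacement pod completes, and only then flip to ready or unhealthy. A toy three-state result, with the empty string standing for the unknown state as rendered in the log:

```go
package main

import "fmt"

// Toy model of a probe result: unknown until the first probe runs,
// then ready or unhealthy. The kubelet logs the unknown state as status="".
type probeResult int

const (
	unknown probeResult = iota
	ready
	unready
)

func (r probeResult) String() string {
	return [...]string{"", "ready", "unhealthy"}[r]
}

func main() {
	status := unknown
	fmt.Printf("probe=%q status=%q\n", "readiness", status) // status=""

	// first successful probe observed
	status = ready
	fmt.Printf("probe=%q status=%q\n", "readiness", status) // status="ready"
}
```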
with a grace period" pod="openstack/ceilometer-0" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="sg-core" containerID="cri-o://71f3bdd72d23dad50170dd572c1782b90f030dfae3837d1bce41b732810ac219" gracePeriod=30 Dec 05 22:28:59 crc kubenswrapper[4747]: I1205 22:28:59.019300 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="ceilometer-notification-agent" containerID="cri-o://4b4dc86680826ff0d34c3ade7badd6745fb8b15ecb715300df999b0a48e186c7" gracePeriod=30 Dec 05 22:28:59 crc kubenswrapper[4747]: I1205 22:28:59.019319 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="proxy-httpd" containerID="cri-o://8a8372ce7923a101a7752ab696f027cd812edfc6613b21d29eaf0dfc285a0554" gracePeriod=30 Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.033914 4747 generic.go:334] "Generic (PLEG): container finished" podID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerID="8a8372ce7923a101a7752ab696f027cd812edfc6613b21d29eaf0dfc285a0554" exitCode=0 Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.033961 4747 generic.go:334] "Generic (PLEG): container finished" podID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerID="71f3bdd72d23dad50170dd572c1782b90f030dfae3837d1bce41b732810ac219" exitCode=2 Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.033978 4747 generic.go:334] "Generic (PLEG): container finished" podID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerID="4b4dc86680826ff0d34c3ade7badd6745fb8b15ecb715300df999b0a48e186c7" exitCode=0 Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.033992 4747 generic.go:334] "Generic (PLEG): container finished" podID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerID="dd6314d9517deddf7d71af999e2495692da40e001959986b81b3f015204406fe" exitCode=0 Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.035292 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1aa00771-6f36-432d-b2a7-566f34c88db1","Type":"ContainerDied","Data":"8a8372ce7923a101a7752ab696f027cd812edfc6613b21d29eaf0dfc285a0554"} Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.035345 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1aa00771-6f36-432d-b2a7-566f34c88db1","Type":"ContainerDied","Data":"71f3bdd72d23dad50170dd572c1782b90f030dfae3837d1bce41b732810ac219"} Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.035369 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1aa00771-6f36-432d-b2a7-566f34c88db1","Type":"ContainerDied","Data":"4b4dc86680826ff0d34c3ade7badd6745fb8b15ecb715300df999b0a48e186c7"} Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.035389 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1aa00771-6f36-432d-b2a7-566f34c88db1","Type":"ContainerDied","Data":"dd6314d9517deddf7d71af999e2495692da40e001959986b81b3f015204406fe"} Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.122423 4747 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.122423 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.231153 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-scripts\") pod \"1aa00771-6f36-432d-b2a7-566f34c88db1\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") "
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.231290 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-sg-core-conf-yaml\") pod \"1aa00771-6f36-432d-b2a7-566f34c88db1\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") "
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.231369 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-combined-ca-bundle\") pod \"1aa00771-6f36-432d-b2a7-566f34c88db1\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") "
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.231520 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa00771-6f36-432d-b2a7-566f34c88db1-log-httpd\") pod \"1aa00771-6f36-432d-b2a7-566f34c88db1\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") "
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.231564 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-config-data\") pod \"1aa00771-6f36-432d-b2a7-566f34c88db1\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") "
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.231630 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa00771-6f36-432d-b2a7-566f34c88db1-run-httpd\") pod \"1aa00771-6f36-432d-b2a7-566f34c88db1\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") "
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.231651 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trrx2\" (UniqueName: \"kubernetes.io/projected/1aa00771-6f36-432d-b2a7-566f34c88db1-kube-api-access-trrx2\") pod \"1aa00771-6f36-432d-b2a7-566f34c88db1\" (UID: \"1aa00771-6f36-432d-b2a7-566f34c88db1\") "
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.232683 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa00771-6f36-432d-b2a7-566f34c88db1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1aa00771-6f36-432d-b2a7-566f34c88db1" (UID: "1aa00771-6f36-432d-b2a7-566f34c88db1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.232894 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa00771-6f36-432d-b2a7-566f34c88db1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1aa00771-6f36-432d-b2a7-566f34c88db1" (UID: "1aa00771-6f36-432d-b2a7-566f34c88db1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.236848 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa00771-6f36-432d-b2a7-566f34c88db1-kube-api-access-trrx2" (OuterVolumeSpecName: "kube-api-access-trrx2") pod "1aa00771-6f36-432d-b2a7-566f34c88db1" (UID: "1aa00771-6f36-432d-b2a7-566f34c88db1"). InnerVolumeSpecName "kube-api-access-trrx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.237388 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-scripts" (OuterVolumeSpecName: "scripts") pod "1aa00771-6f36-432d-b2a7-566f34c88db1" (UID: "1aa00771-6f36-432d-b2a7-566f34c88db1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.259890 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1aa00771-6f36-432d-b2a7-566f34c88db1" (UID: "1aa00771-6f36-432d-b2a7-566f34c88db1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.315773 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1aa00771-6f36-432d-b2a7-566f34c88db1" (UID: "1aa00771-6f36-432d-b2a7-566f34c88db1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.335386 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.335423 4747 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.335436 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.335451 4747 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa00771-6f36-432d-b2a7-566f34c88db1-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.335462 4747 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa00771-6f36-432d-b2a7-566f34c88db1-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.335475 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trrx2\" (UniqueName: \"kubernetes.io/projected/1aa00771-6f36-432d-b2a7-566f34c88db1-kube-api-access-trrx2\") on node \"crc\" DevicePath \"\""
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.343389 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-config-data" (OuterVolumeSpecName: "config-data") pod "1aa00771-6f36-432d-b2a7-566f34c88db1" (UID: "1aa00771-6f36-432d-b2a7-566f34c88db1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 22:29:00 crc kubenswrapper[4747]: I1205 22:29:00.437556 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa00771-6f36-432d-b2a7-566f34c88db1-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.048441 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1aa00771-6f36-432d-b2a7-566f34c88db1","Type":"ContainerDied","Data":"5ba1374ccb1257aed3acde11ab13cc086485a962a76d651487a9361df8d5f7cb"}
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.048489 4747 scope.go:117] "RemoveContainer" containerID="8a8372ce7923a101a7752ab696f027cd812edfc6613b21d29eaf0dfc285a0554"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.048535 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.071697 4747 scope.go:117] "RemoveContainer" containerID="71f3bdd72d23dad50170dd572c1782b90f030dfae3837d1bce41b732810ac219"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.094478 4747 scope.go:117] "RemoveContainer" containerID="4b4dc86680826ff0d34c3ade7badd6745fb8b15ecb715300df999b0a48e186c7"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.096958 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.107888 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.120732 4747 scope.go:117] "RemoveContainer" containerID="dd6314d9517deddf7d71af999e2495692da40e001959986b81b3f015204406fe"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.129537 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 05 22:29:01 crc kubenswrapper[4747]: E1205 22:29:01.130121 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="sg-core"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.130139 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="sg-core"
Dec 05 22:29:01 crc kubenswrapper[4747]: E1205 22:29:01.130158 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="proxy-httpd"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.130166 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="proxy-httpd"
Dec 05 22:29:01 crc kubenswrapper[4747]: E1205 22:29:01.130191 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="ceilometer-notification-agent"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.130197 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="ceilometer-notification-agent"
Dec 05 22:29:01 crc kubenswrapper[4747]: E1205 22:29:01.130210 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="ceilometer-central-agent"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.130216 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="ceilometer-central-agent"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.130393 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="ceilometer-notification-agent"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.130410 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="ceilometer-central-agent"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.130421 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="proxy-httpd"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.130430 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" containerName="sg-core"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.132416 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.137482 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.137813 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.137934 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.143030 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.262007 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.262131 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj6sg\" (UniqueName: \"kubernetes.io/projected/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-kube-api-access-vj6sg\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.262160 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-run-httpd\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.262185 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-log-httpd\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0"
Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.262228 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.262259 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.262281 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-scripts\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.262325 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-config-data\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.364470 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj6sg\" (UniqueName: \"kubernetes.io/projected/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-kube-api-access-vj6sg\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.364625 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-run-httpd\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.364688 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-log-httpd\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.364779 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.364845 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.364907 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-scripts\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.365044 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-config-data\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.365175 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.371096 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.373802 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.374391 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-run-httpd\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.374795 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-log-httpd\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.375196 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.380292 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-config-data\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.387549 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-scripts\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.391785 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj6sg\" (UniqueName: \"kubernetes.io/projected/412f1967-e1ed-44a1-84b4-2cdd04e9cb10-kube-api-access-vj6sg\") pod \"ceilometer-0\" (UID: \"412f1967-e1ed-44a1-84b4-2cdd04e9cb10\") " pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.467206 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.840194 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:29:01 crc kubenswrapper[4747]: E1205 22:29:01.840775 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.868224 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa00771-6f36-432d-b2a7-566f34c88db1" path="/var/lib/kubelet/pods/1aa00771-6f36-432d-b2a7-566f34c88db1/volumes" Dec 05 22:29:01 crc kubenswrapper[4747]: I1205 22:29:01.985906 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 22:29:01 crc kubenswrapper[4747]: W1205 22:29:01.987038 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod412f1967_e1ed_44a1_84b4_2cdd04e9cb10.slice/crio-b8c0602034a66202ca81d31e40911c0cbef932576080f417f51b541189f7d139 WatchSource:0}: Error finding container b8c0602034a66202ca81d31e40911c0cbef932576080f417f51b541189f7d139: Status 404 returned error can't find the container with id b8c0602034a66202ca81d31e40911c0cbef932576080f417f51b541189f7d139 Dec 05 22:29:02 crc kubenswrapper[4747]: I1205 22:29:02.060260 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"412f1967-e1ed-44a1-84b4-2cdd04e9cb10","Type":"ContainerStarted","Data":"b8c0602034a66202ca81d31e40911c0cbef932576080f417f51b541189f7d139"} Dec 05 22:29:03 crc kubenswrapper[4747]: I1205 22:29:03.077270 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"412f1967-e1ed-44a1-84b4-2cdd04e9cb10","Type":"ContainerStarted","Data":"1692f37fd5743defeb06b47e37b679ae599a560d1ee6c5e2f9d2641e2981151b"} Dec 05 22:29:04 crc kubenswrapper[4747]: I1205 22:29:04.090019 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"412f1967-e1ed-44a1-84b4-2cdd04e9cb10","Type":"ContainerStarted","Data":"9cf4aa2013e1460328f3903579050538f363a135a062b9e00a6453ffe23453d5"} Dec 05 22:29:04 crc kubenswrapper[4747]: I1205 22:29:04.090560 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"412f1967-e1ed-44a1-84b4-2cdd04e9cb10","Type":"ContainerStarted","Data":"384ba345de7e9022d2d5b35cdf342340cd7df5a17c091247d7ff07e8622da9c9"} Dec 05 22:29:05 crc kubenswrapper[4747]: I1205 22:29:05.110949 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 22:29:05 crc kubenswrapper[4747]: I1205 22:29:05.144125 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.346016042 podStartE2EDuration="4.14410298s" podCreationTimestamp="2025-12-05 22:29:01 +0000 UTC" firstStartedPulling="2025-12-05 22:29:01.989341652 +0000 UTC m=+6412.456649140" lastFinishedPulling="2025-12-05 22:29:04.78742855 +0000 UTC m=+6415.254736078" observedRunningTime="2025-12-05 22:29:05.134227735 +0000 UTC m=+6415.601535263" 
watchObservedRunningTime="2025-12-05 22:29:05.14410298 +0000 UTC m=+6415.611410478" Dec 05 22:29:06 crc kubenswrapper[4747]: I1205 22:29:06.125935 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"412f1967-e1ed-44a1-84b4-2cdd04e9cb10","Type":"ContainerStarted","Data":"9eb0720537505fedc646719badca0407d95da40d8c7c8e818b5b368a007c47c2"} Dec 05 22:29:06 crc kubenswrapper[4747]: I1205 22:29:06.463328 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 22:29:12 crc kubenswrapper[4747]: I1205 22:29:12.840291 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:29:12 crc kubenswrapper[4747]: E1205 22:29:12.841061 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:29:16 crc kubenswrapper[4747]: I1205 22:29:16.062985 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-15d6-account-create-update-9wmxq"] Dec 05 22:29:16 crc kubenswrapper[4747]: I1205 22:29:16.076634 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-31a9-account-create-update-dwhfh"] Dec 05 22:29:16 crc kubenswrapper[4747]: I1205 22:29:16.092879 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-v48zv"] Dec 05 22:29:16 crc kubenswrapper[4747]: I1205 22:29:16.105037 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-edba-account-create-update-v69dv"] Dec 05 22:29:16 crc kubenswrapper[4747]: I1205 22:29:16.116296 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-s8w5s"] Dec 05 22:29:16 crc kubenswrapper[4747]: I1205 22:29:16.126457 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-kffrg"] Dec 05 22:29:16 crc kubenswrapper[4747]: I1205 22:29:16.135875 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-v48zv"] Dec 05 22:29:16 crc kubenswrapper[4747]: I1205 22:29:16.146376 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-31a9-account-create-update-dwhfh"] Dec 05 22:29:16 crc kubenswrapper[4747]: I1205 22:29:16.155702 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-15d6-account-create-update-9wmxq"] Dec 05 22:29:16 crc kubenswrapper[4747]: I1205 22:29:16.164869 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-kffrg"] Dec 05 22:29:16 crc kubenswrapper[4747]: I1205 22:29:16.173266 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-edba-account-create-update-v69dv"] Dec 05 22:29:16 crc kubenswrapper[4747]: I1205 22:29:16.182119 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-s8w5s"] Dec 05 22:29:17 crc kubenswrapper[4747]: I1205 22:29:17.858362 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24104846-9490-409f-b1be-cb3b84eb2d11" path="/var/lib/kubelet/pods/24104846-9490-409f-b1be-cb3b84eb2d11/volumes" Dec 05 22:29:17 crc 
kubenswrapper[4747]: I1205 22:29:17.859778 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d2a3210-8646-450a-9f16-cb6471ca857f" path="/var/lib/kubelet/pods/6d2a3210-8646-450a-9f16-cb6471ca857f/volumes" Dec 05 22:29:17 crc kubenswrapper[4747]: I1205 22:29:17.861092 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d77f7ec-5822-4fd0-b935-09f0f77ce1ee" path="/var/lib/kubelet/pods/8d77f7ec-5822-4fd0-b935-09f0f77ce1ee/volumes" Dec 05 22:29:17 crc kubenswrapper[4747]: I1205 22:29:17.861918 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adf83e06-10f7-44aa-b33b-36c305016e03" path="/var/lib/kubelet/pods/adf83e06-10f7-44aa-b33b-36c305016e03/volumes" Dec 05 22:29:17 crc kubenswrapper[4747]: I1205 22:29:17.863625 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb1de861-ab51-437a-a5f7-a6d5454f0360" path="/var/lib/kubelet/pods/bb1de861-ab51-437a-a5f7-a6d5454f0360/volumes" Dec 05 22:29:17 crc kubenswrapper[4747]: I1205 22:29:17.864248 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2309ee7-7430-4873-b34e-0428cb08f00c" path="/var/lib/kubelet/pods/f2309ee7-7430-4873-b34e-0428cb08f00c/volumes" Dec 05 22:29:22 crc kubenswrapper[4747]: I1205 22:29:22.291066 4747 generic.go:334] "Generic (PLEG): container finished" podID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerID="7fb5a982607debcdc7aef3df15135b63ec5ecf2c8e5fcb4207918a8f162c85e1" exitCode=137 Dec 05 22:29:22 crc kubenswrapper[4747]: I1205 22:29:22.291606 4747 generic.go:334] "Generic (PLEG): container finished" podID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerID="1664c855fe070884813f748436aec22a812c5fe17cc81e903b4dbf4af4f00f13" exitCode=137 Dec 05 22:29:22 crc kubenswrapper[4747]: I1205 22:29:22.291141 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e253f491-ef2a-497f-8ef8-df5430ceed54","Type":"ContainerDied","Data":"7fb5a982607debcdc7aef3df15135b63ec5ecf2c8e5fcb4207918a8f162c85e1"} Dec 05 22:29:22 crc kubenswrapper[4747]: I1205 22:29:22.291648 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e253f491-ef2a-497f-8ef8-df5430ceed54","Type":"ContainerDied","Data":"1664c855fe070884813f748436aec22a812c5fe17cc81e903b4dbf4af4f00f13"} Dec 05 22:29:22 crc kubenswrapper[4747]: I1205 22:29:22.939335 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.053899 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hpvw\" (UniqueName: \"kubernetes.io/projected/e253f491-ef2a-497f-8ef8-df5430ceed54-kube-api-access-2hpvw\") pod \"e253f491-ef2a-497f-8ef8-df5430ceed54\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.053973 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-scripts\") pod \"e253f491-ef2a-497f-8ef8-df5430ceed54\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.054009 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-config-data\") pod \"e253f491-ef2a-497f-8ef8-df5430ceed54\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.054209 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-combined-ca-bundle\") pod \"e253f491-ef2a-497f-8ef8-df5430ceed54\" (UID: \"e253f491-ef2a-497f-8ef8-df5430ceed54\") " Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.064343 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-scripts" (OuterVolumeSpecName: "scripts") pod "e253f491-ef2a-497f-8ef8-df5430ceed54" (UID: "e253f491-ef2a-497f-8ef8-df5430ceed54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.068787 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e253f491-ef2a-497f-8ef8-df5430ceed54-kube-api-access-2hpvw" (OuterVolumeSpecName: "kube-api-access-2hpvw") pod "e253f491-ef2a-497f-8ef8-df5430ceed54" (UID: "e253f491-ef2a-497f-8ef8-df5430ceed54"). InnerVolumeSpecName "kube-api-access-2hpvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.157624 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hpvw\" (UniqueName: \"kubernetes.io/projected/e253f491-ef2a-497f-8ef8-df5430ceed54-kube-api-access-2hpvw\") on node \"crc\" DevicePath \"\"" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.157871 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.233972 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e253f491-ef2a-497f-8ef8-df5430ceed54" (UID: "e253f491-ef2a-497f-8ef8-df5430ceed54"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.241925 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-config-data" (OuterVolumeSpecName: "config-data") pod "e253f491-ef2a-497f-8ef8-df5430ceed54" (UID: "e253f491-ef2a-497f-8ef8-df5430ceed54"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.260008 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.260158 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253f491-ef2a-497f-8ef8-df5430ceed54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.378329 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e253f491-ef2a-497f-8ef8-df5430ceed54","Type":"ContainerDied","Data":"9ee5e93ff0e6e89f7abd0e8ad1b69a14b21c505b50a5c11c65635f0ea45708c8"} Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.378638 4747 scope.go:117] "RemoveContainer" containerID="7fb5a982607debcdc7aef3df15135b63ec5ecf2c8e5fcb4207918a8f162c85e1" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.378758 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.416909 4747 scope.go:117] "RemoveContainer" containerID="1664c855fe070884813f748436aec22a812c5fe17cc81e903b4dbf4af4f00f13" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.544002 4747 scope.go:117] "RemoveContainer" containerID="285ddf6116465e20b0c7bd82ce6e77ae914f4b65dab5b80da0e3e91378d0ba43" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.549048 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.578548 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.579941 4747 scope.go:117] "RemoveContainer" containerID="ef1340dd16291d83a5c7396f616cac259a1936b1c01156081d5548c42e5ff21e" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.602831 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 05 22:29:23 crc kubenswrapper[4747]: E1205 22:29:23.603253 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-notifier" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.603265 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-notifier" Dec 05 22:29:23 crc kubenswrapper[4747]: E1205 22:29:23.603280 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-api" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.603285 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-api" Dec 05 22:29:23 crc kubenswrapper[4747]: E1205 22:29:23.603296 4747 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-evaluator" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.603302 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-evaluator" Dec 05 22:29:23 crc kubenswrapper[4747]: E1205 22:29:23.603328 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-listener" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.603334 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-listener" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.603543 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-evaluator" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.603552 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-notifier" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.603565 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-api" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.603594 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" containerName="aodh-listener" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.605587 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.609228 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.609384 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-l284b" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.609522 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.609746 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.609873 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.617473 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 05 22:29:23 crc kubenswrapper[4747]: E1205 22:29:23.682013 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode253f491_ef2a_497f_8ef8_df5430ceed54.slice/crio-9ee5e93ff0e6e89f7abd0e8ad1b69a14b21c505b50a5c11c65635f0ea45708c8\": RecentStats: unable to find data in memory cache]" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.770873 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-scripts\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.770914 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-config-data\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.771043 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-public-tls-certs\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.771123 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.771159 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-555ll\" (UniqueName: \"kubernetes.io/projected/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-kube-api-access-555ll\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.771185 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-internal-tls-certs\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.853385 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e253f491-ef2a-497f-8ef8-df5430ceed54" path="/var/lib/kubelet/pods/e253f491-ef2a-497f-8ef8-df5430ceed54/volumes" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.896227 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-555ll\" (UniqueName: \"kubernetes.io/projected/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-kube-api-access-555ll\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.896451 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-internal-tls-certs\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.896561 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-scripts\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.896578 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-config-data\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.896670 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-public-tls-certs\") pod \"aodh-0\" (UID: 
\"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.896725 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.900741 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.900845 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-scripts\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.901904 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-config-data\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.903259 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-internal-tls-certs\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.903981 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-public-tls-certs\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:23 crc kubenswrapper[4747]: I1205 22:29:23.912539 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-555ll\" (UniqueName: \"kubernetes.io/projected/dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6-kube-api-access-555ll\") pod \"aodh-0\" (UID: \"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6\") " pod="openstack/aodh-0" Dec 05 22:29:24 crc kubenswrapper[4747]: I1205 22:29:24.455036 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 05 22:29:25 crc kubenswrapper[4747]: I1205 22:29:25.052491 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 05 22:29:25 crc kubenswrapper[4747]: W1205 22:29:25.071285 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcd4b799_52f7_41ea_a1aa_aefcc25bb7b6.slice/crio-dd4edbc9193dbd984a69459123c395750373d2f3294bbf1ca30fa4a15b6e4c42 WatchSource:0}: Error finding container dd4edbc9193dbd984a69459123c395750373d2f3294bbf1ca30fa4a15b6e4c42: Status 404 returned error can't find the container with id dd4edbc9193dbd984a69459123c395750373d2f3294bbf1ca30fa4a15b6e4c42 Dec 05 22:29:25 crc kubenswrapper[4747]: I1205 22:29:25.518574 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6","Type":"ContainerStarted","Data":"dd4edbc9193dbd984a69459123c395750373d2f3294bbf1ca30fa4a15b6e4c42"} Dec 05 22:29:25 crc kubenswrapper[4747]: I1205 22:29:25.841119 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:29:25 crc kubenswrapper[4747]: E1205 22:29:25.841658 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:29:26 crc kubenswrapper[4747]: I1205 22:29:26.068467 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-trx7x"] Dec 05 22:29:26 crc kubenswrapper[4747]: I1205 22:29:26.092435 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-trx7x"] Dec 05 22:29:26 crc kubenswrapper[4747]: I1205 22:29:26.531276 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6","Type":"ContainerStarted","Data":"7d613e67b00ec1d81ce5778ec61901bd21033b105d7a5152603480c740f7e48f"} Dec 05 22:29:27 crc kubenswrapper[4747]: I1205 22:29:27.543962 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6","Type":"ContainerStarted","Data":"53407d59770ea52be8df52c7924e242abb5a941e4c84989615bedf42852ce633"} Dec 05 22:29:27 crc kubenswrapper[4747]: I1205 22:29:27.544298 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6","Type":"ContainerStarted","Data":"a7c12bc80ec8543f0d7233a8a0e295b97c08160eaf47f202e8d724e36dbac279"} Dec 05 22:29:27 crc kubenswrapper[4747]: I1205 22:29:27.851752 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0726b010-6142-410a-a353-d9c89ea8f724" path="/var/lib/kubelet/pods/0726b010-6142-410a-a353-d9c89ea8f724/volumes" Dec 05 22:29:28 crc kubenswrapper[4747]: I1205 22:29:28.557532 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6","Type":"ContainerStarted","Data":"e20a9a2491c741f7596018265097b808845206844b084236651d4b79e9249cca"} Dec 05 22:29:28 crc kubenswrapper[4747]: I1205 22:29:28.583331 4747 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.000705133 podStartE2EDuration="5.583312911s" podCreationTimestamp="2025-12-05 22:29:23 +0000 UTC" firstStartedPulling="2025-12-05 22:29:25.076327783 +0000 UTC m=+6435.543635291" lastFinishedPulling="2025-12-05 22:29:27.658935591 +0000 UTC m=+6438.126243069" observedRunningTime="2025-12-05 22:29:28.575119608 +0000 UTC m=+6439.042427086" watchObservedRunningTime="2025-12-05 22:29:28.583312911 +0000 UTC m=+6439.050620399" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.122055 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv"] Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.123942 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.126371 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.137478 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv"] Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.200549 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-openstack-cell1\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.201070 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-ovsdbserver-sb\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.201234 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh2jt\" (UniqueName: \"kubernetes.io/projected/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-kube-api-access-xh2jt\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.201343 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-config\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.201368 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-ovsdbserver-nb\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.201580 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-dns-svc\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: 
\"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.303250 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh2jt\" (UniqueName: \"kubernetes.io/projected/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-kube-api-access-xh2jt\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.303309 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-config\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.303328 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-ovsdbserver-nb\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.303384 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-dns-svc\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.303403 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-openstack-cell1\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.303504 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-ovsdbserver-sb\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.304516 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-ovsdbserver-sb\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.304778 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-config\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.304870 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-openstack-cell1\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 
crc kubenswrapper[4747]: I1205 22:29:29.304980 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-dns-svc\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.305506 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-ovsdbserver-nb\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.330831 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh2jt\" (UniqueName: \"kubernetes.io/projected/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-kube-api-access-xh2jt\") pod \"dnsmasq-dns-6f7cd9cbd5-9vqfv\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.441098 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:29 crc kubenswrapper[4747]: I1205 22:29:29.936087 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv"] Dec 05 22:29:29 crc kubenswrapper[4747]: W1205 22:29:29.938908 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf2b4415_019c_4161_ae84_fbba3c4dc2ad.slice/crio-a061e37c34f6726fb1d7a2e37e0f72ffc0cd12d5517c1fe0c464950cd3c4ffa9 WatchSource:0}: Error finding container a061e37c34f6726fb1d7a2e37e0f72ffc0cd12d5517c1fe0c464950cd3c4ffa9: Status 404 returned error can't find the container with id a061e37c34f6726fb1d7a2e37e0f72ffc0cd12d5517c1fe0c464950cd3c4ffa9 Dec 05 22:29:30 crc kubenswrapper[4747]: I1205 22:29:30.586437 4747 generic.go:334] "Generic (PLEG): container finished" podID="cf2b4415-019c-4161-ae84-fbba3c4dc2ad" containerID="9aa3ed7ab8901d887337079052ccf492610dbd63cb34d934e369beb2d96763e0" exitCode=0 Dec 05 22:29:30 crc kubenswrapper[4747]: I1205 22:29:30.588379 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" event={"ID":"cf2b4415-019c-4161-ae84-fbba3c4dc2ad","Type":"ContainerDied","Data":"9aa3ed7ab8901d887337079052ccf492610dbd63cb34d934e369beb2d96763e0"} Dec 05 22:29:30 crc kubenswrapper[4747]: I1205 22:29:30.588418 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" event={"ID":"cf2b4415-019c-4161-ae84-fbba3c4dc2ad","Type":"ContainerStarted","Data":"a061e37c34f6726fb1d7a2e37e0f72ffc0cd12d5517c1fe0c464950cd3c4ffa9"} Dec 05 22:29:31 crc kubenswrapper[4747]: I1205 22:29:31.477340 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 22:29:31 crc kubenswrapper[4747]: I1205 22:29:31.599482 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" event={"ID":"cf2b4415-019c-4161-ae84-fbba3c4dc2ad","Type":"ContainerStarted","Data":"45fc492ccb6021fa147184f4a4c3fa61aa18b3237f009988927b1152235fcce7"} Dec 05 22:29:31 crc kubenswrapper[4747]: I1205 22:29:31.599892 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:31 crc kubenswrapper[4747]: I1205 22:29:31.626724 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" podStartSLOduration=2.6267044090000002 podStartE2EDuration="2.626704409s" podCreationTimestamp="2025-12-05 22:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:29:31.614190128 +0000 UTC m=+6442.081497626" watchObservedRunningTime="2025-12-05 22:29:31.626704409 +0000 UTC m=+6442.094011897" Dec 05 22:29:35 crc kubenswrapper[4747]: I1205 22:29:35.760568 4747 scope.go:117] "RemoveContainer" containerID="0b7baed7145b33c3ef0a520cea9743fded9c1145020746ee40fcdd3cf12db16c" Dec 05 22:29:35 crc kubenswrapper[4747]: I1205 22:29:35.812991 4747 scope.go:117] "RemoveContainer" containerID="d4325bffe5e4c1c2f3c0f79ef3441e66c5359506f83cb0cca4da869ebcb0621a" Dec 05 22:29:35 crc kubenswrapper[4747]: I1205 22:29:35.888115 4747 scope.go:117] "RemoveContainer" containerID="367a0c30a026f2df077f5996790acc7826c2158a42090f03d5d853e2af58d572" Dec 05 22:29:35 crc kubenswrapper[4747]: I1205 22:29:35.934517 4747 scope.go:117] "RemoveContainer" containerID="186bf1dbc0ef085f067b7f9bd83bab2c3ecaaee73075b570ee00ffc38865ad6c" Dec 05 22:29:35 crc kubenswrapper[4747]: I1205 22:29:35.984456 4747 scope.go:117] "RemoveContainer" containerID="d9c78024e04894812a9542f17ff9904234e2f413595c7030ffdfdaaf47107b9e" Dec 05 22:29:36 crc kubenswrapper[4747]: I1205 22:29:36.031009 4747 scope.go:117] "RemoveContainer" containerID="4b2064791b36d4533cf9e76e370f1a3b342bfe727189bb06b13926554f127987" Dec 05 22:29:36 crc kubenswrapper[4747]: I1205 22:29:36.093548 4747 scope.go:117] "RemoveContainer" containerID="ae7f4b9e9e876686b8ae22e6ee55cb869b9a39e8ced944c20a6a58eb168c7d14" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.443247 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.555543 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"] Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.556927 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k" podUID="b514da94-b08d-4ba3-b22e-f8c5e3729bd1" containerName="dnsmasq-dns" containerID="cri-o://ab5f9a33d9b202593398e59d04d27bb004433b2c9aa1584b8d9e79fbd713b26b" gracePeriod=10 Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.724163 4747 generic.go:334] "Generic (PLEG): container finished" podID="b514da94-b08d-4ba3-b22e-f8c5e3729bd1" containerID="ab5f9a33d9b202593398e59d04d27bb004433b2c9aa1584b8d9e79fbd713b26b" exitCode=0 Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.724364 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k" event={"ID":"b514da94-b08d-4ba3-b22e-f8c5e3729bd1","Type":"ContainerDied","Data":"ab5f9a33d9b202593398e59d04d27bb004433b2c9aa1584b8d9e79fbd713b26b"} Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.766455 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d69b66569-m2hjf"] Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.769054 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.780402 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d69b66569-m2hjf"] Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.848601 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.849414 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8af80a5a-f044-4431-8c68-aba1507de5d1-ovsdbserver-sb\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: E1205 22:29:39.849449 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.849474 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8af80a5a-f044-4431-8c68-aba1507de5d1-openstack-cell1\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.849517 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8af80a5a-f044-4431-8c68-aba1507de5d1-config\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.849636 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg8gc\" (UniqueName: \"kubernetes.io/projected/8af80a5a-f044-4431-8c68-aba1507de5d1-kube-api-access-lg8gc\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.849686 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8af80a5a-f044-4431-8c68-aba1507de5d1-ovsdbserver-nb\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.849873 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8af80a5a-f044-4431-8c68-aba1507de5d1-dns-svc\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.952004 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8af80a5a-f044-4431-8c68-aba1507de5d1-dns-svc\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.952084 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8af80a5a-f044-4431-8c68-aba1507de5d1-ovsdbserver-sb\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.952107 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8af80a5a-f044-4431-8c68-aba1507de5d1-openstack-cell1\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.952147 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8af80a5a-f044-4431-8c68-aba1507de5d1-config\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.952204 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg8gc\" (UniqueName: \"kubernetes.io/projected/8af80a5a-f044-4431-8c68-aba1507de5d1-kube-api-access-lg8gc\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.952235 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8af80a5a-f044-4431-8c68-aba1507de5d1-ovsdbserver-nb\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.953140 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8af80a5a-f044-4431-8c68-aba1507de5d1-openstack-cell1\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.954253 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8af80a5a-f044-4431-8c68-aba1507de5d1-config\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.954459 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8af80a5a-f044-4431-8c68-aba1507de5d1-dns-svc\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.954787 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8af80a5a-f044-4431-8c68-aba1507de5d1-ovsdbserver-nb\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" 
(UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.955233 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8af80a5a-f044-4431-8c68-aba1507de5d1-ovsdbserver-sb\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:39 crc kubenswrapper[4747]: I1205 22:29:39.974289 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg8gc\" (UniqueName: \"kubernetes.io/projected/8af80a5a-f044-4431-8c68-aba1507de5d1-kube-api-access-lg8gc\") pod \"dnsmasq-dns-5d69b66569-m2hjf\" (UID: \"8af80a5a-f044-4431-8c68-aba1507de5d1\") " pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.156985 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.271750 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.365448 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-config\") pod \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.365518 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqjpj\" (UniqueName: \"kubernetes.io/projected/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-kube-api-access-mqjpj\") pod \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.365593 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-ovsdbserver-sb\") pod \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.365694 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-ovsdbserver-nb\") pod \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.365715 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-dns-svc\") pod \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\" (UID: \"b514da94-b08d-4ba3-b22e-f8c5e3729bd1\") " Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.413954 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-kube-api-access-mqjpj" (OuterVolumeSpecName: "kube-api-access-mqjpj") pod "b514da94-b08d-4ba3-b22e-f8c5e3729bd1" (UID: "b514da94-b08d-4ba3-b22e-f8c5e3729bd1"). InnerVolumeSpecName "kube-api-access-mqjpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.445653 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b514da94-b08d-4ba3-b22e-f8c5e3729bd1" (UID: "b514da94-b08d-4ba3-b22e-f8c5e3729bd1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.457131 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-config" (OuterVolumeSpecName: "config") pod "b514da94-b08d-4ba3-b22e-f8c5e3729bd1" (UID: "b514da94-b08d-4ba3-b22e-f8c5e3729bd1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.459032 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b514da94-b08d-4ba3-b22e-f8c5e3729bd1" (UID: "b514da94-b08d-4ba3-b22e-f8c5e3729bd1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.468864 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.468908 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.468920 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.468931 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqjpj\" (UniqueName: \"kubernetes.io/projected/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-kube-api-access-mqjpj\") on node \"crc\" DevicePath \"\"" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.482265 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b514da94-b08d-4ba3-b22e-f8c5e3729bd1" (UID: "b514da94-b08d-4ba3-b22e-f8c5e3729bd1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.571349 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b514da94-b08d-4ba3-b22e-f8c5e3729bd1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.628918 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d69b66569-m2hjf"] Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.741011 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k" event={"ID":"b514da94-b08d-4ba3-b22e-f8c5e3729bd1","Type":"ContainerDied","Data":"5897077cfbcacc815c50b0461a683bfa43663082ca2059d460127ca878ed2007"} Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.741053 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f4b5ff85f-vnc8k" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.741063 4747 scope.go:117] "RemoveContainer" containerID="ab5f9a33d9b202593398e59d04d27bb004433b2c9aa1584b8d9e79fbd713b26b" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.745687 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" event={"ID":"8af80a5a-f044-4431-8c68-aba1507de5d1","Type":"ContainerStarted","Data":"744be1171735823f331c4ff949d0bd5e767278a67c3299488ebd3423683dde94"} Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.897168 4747 scope.go:117] "RemoveContainer" containerID="75e6d73cf1bcb20b598c2bb7fa1119dba57c4758d89eb1a7c8dcf99b203b9140" Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.923661 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"] Dec 05 22:29:40 crc kubenswrapper[4747]: I1205 22:29:40.939898 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f4b5ff85f-vnc8k"] Dec 05 22:29:41 crc kubenswrapper[4747]: I1205 22:29:41.752901 4747 generic.go:334] "Generic (PLEG): container finished" podID="8af80a5a-f044-4431-8c68-aba1507de5d1" containerID="f739770b242417d1212272f43dc1311a637d97257e3548442a5618a782de15a6" exitCode=0 Dec 05 22:29:41 crc kubenswrapper[4747]: I1205 22:29:41.753006 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" event={"ID":"8af80a5a-f044-4431-8c68-aba1507de5d1","Type":"ContainerDied","Data":"f739770b242417d1212272f43dc1311a637d97257e3548442a5618a782de15a6"} Dec 05 22:29:41 crc kubenswrapper[4747]: I1205 22:29:41.867691 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b514da94-b08d-4ba3-b22e-f8c5e3729bd1" path="/var/lib/kubelet/pods/b514da94-b08d-4ba3-b22e-f8c5e3729bd1/volumes" Dec 05 22:29:42 crc kubenswrapper[4747]: I1205 22:29:42.770993 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" event={"ID":"8af80a5a-f044-4431-8c68-aba1507de5d1","Type":"ContainerStarted","Data":"27583abf633f07726e24054d0bde479152c17641d5e0cb8ce0c79d246cfbbcfa"} Dec 05 22:29:42 crc kubenswrapper[4747]: I1205 22:29:42.771422 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.044642 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" podStartSLOduration=6.04461874 podStartE2EDuration="6.04461874s" 
podCreationTimestamp="2025-12-05 22:29:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:29:42.797920068 +0000 UTC m=+6453.265227566" watchObservedRunningTime="2025-12-05 22:29:45.04461874 +0000 UTC m=+6455.511926228" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.049115 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-85bb5"] Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.062904 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-85bb5"] Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.501882 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7"] Dec 05 22:29:45 crc kubenswrapper[4747]: E1205 22:29:45.502355 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b514da94-b08d-4ba3-b22e-f8c5e3729bd1" containerName="dnsmasq-dns" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.502374 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b514da94-b08d-4ba3-b22e-f8c5e3729bd1" containerName="dnsmasq-dns" Dec 05 22:29:45 crc kubenswrapper[4747]: E1205 22:29:45.502402 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b514da94-b08d-4ba3-b22e-f8c5e3729bd1" containerName="init" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.502409 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b514da94-b08d-4ba3-b22e-f8c5e3729bd1" containerName="init" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.502716 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b514da94-b08d-4ba3-b22e-f8c5e3729bd1" containerName="dnsmasq-dns" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.503446 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.505797 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.506035 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.506278 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.507345 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.519021 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7"] Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.680417 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.680698 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.680851 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.681030 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b76gw\" (UniqueName: \"kubernetes.io/projected/5958f892-c6a2-4203-b7ef-f20a17c75771-kube-api-access-b76gw\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.782862 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.782965 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.783050 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.783111 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b76gw\" (UniqueName: \"kubernetes.io/projected/5958f892-c6a2-4203-b7ef-f20a17c75771-kube-api-access-b76gw\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.791412 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.791626 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.795338 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.814159 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b76gw\" (UniqueName: \"kubernetes.io/projected/5958f892-c6a2-4203-b7ef-f20a17c75771-kube-api-access-b76gw\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.822853 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:29:45 crc kubenswrapper[4747]: I1205 22:29:45.859205 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a" path="/var/lib/kubelet/pods/75a7287d-6c76-4bf6-b8b6-e2f9b88ccb2a/volumes" Dec 05 22:29:46 crc kubenswrapper[4747]: I1205 22:29:46.046309 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-k8mwz"] Dec 05 22:29:46 crc kubenswrapper[4747]: I1205 22:29:46.054423 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-k8mwz"] Dec 05 22:29:46 crc kubenswrapper[4747]: I1205 22:29:46.587701 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7"] Dec 05 22:29:46 crc kubenswrapper[4747]: I1205 22:29:46.834163 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" event={"ID":"5958f892-c6a2-4203-b7ef-f20a17c75771","Type":"ContainerStarted","Data":"e4aa33e52b21fde8690dafbfeb185e6ce9ca7f8ae1a87b0696a94e85d5d420b7"} Dec 05 22:29:47 crc kubenswrapper[4747]: I1205 22:29:47.856034 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8688ad0-be6e-4597-9557-5c9405b2c2a8" path="/var/lib/kubelet/pods/d8688ad0-be6e-4597-9557-5c9405b2c2a8/volumes" Dec 05 22:29:50 crc kubenswrapper[4747]: I1205 22:29:50.161249 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d69b66569-m2hjf" Dec 05 22:29:50 crc kubenswrapper[4747]: I1205 22:29:50.304200 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv"] Dec 05 22:29:50 crc kubenswrapper[4747]: I1205 22:29:50.304442 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" podUID="cf2b4415-019c-4161-ae84-fbba3c4dc2ad" containerName="dnsmasq-dns" containerID="cri-o://45fc492ccb6021fa147184f4a4c3fa61aa18b3237f009988927b1152235fcce7" gracePeriod=10 Dec 05 22:29:50 crc kubenswrapper[4747]: I1205 22:29:50.883044 4747 generic.go:334] "Generic (PLEG): container finished" podID="cf2b4415-019c-4161-ae84-fbba3c4dc2ad" containerID="45fc492ccb6021fa147184f4a4c3fa61aa18b3237f009988927b1152235fcce7" exitCode=0 Dec 05 22:29:50 crc kubenswrapper[4747]: I1205 22:29:50.883140 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" event={"ID":"cf2b4415-019c-4161-ae84-fbba3c4dc2ad","Type":"ContainerDied","Data":"45fc492ccb6021fa147184f4a4c3fa61aa18b3237f009988927b1152235fcce7"} Dec 05 22:29:53 crc kubenswrapper[4747]: I1205 22:29:53.847634 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:29:53 crc kubenswrapper[4747]: E1205 22:29:53.848449 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:29:56 crc kubenswrapper[4747]: I1205 22:29:56.957139 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" event={"ID":"cf2b4415-019c-4161-ae84-fbba3c4dc2ad","Type":"ContainerDied","Data":"a061e37c34f6726fb1d7a2e37e0f72ffc0cd12d5517c1fe0c464950cd3c4ffa9"} Dec 05 22:29:56 crc kubenswrapper[4747]: I1205 22:29:56.957898 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a061e37c34f6726fb1d7a2e37e0f72ffc0cd12d5517c1fe0c464950cd3c4ffa9" Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.070551 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.135274 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-config\") pod \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.135365 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-openstack-cell1\") pod \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.135424 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-ovsdbserver-sb\") pod \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.135557 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-dns-svc\") pod \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.135616 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-ovsdbserver-nb\") pod \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.135688 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh2jt\" (UniqueName: \"kubernetes.io/projected/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-kube-api-access-xh2jt\") pod \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\" (UID: \"cf2b4415-019c-4161-ae84-fbba3c4dc2ad\") " Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.140545 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-kube-api-access-xh2jt" (OuterVolumeSpecName: "kube-api-access-xh2jt") pod "cf2b4415-019c-4161-ae84-fbba3c4dc2ad" (UID: "cf2b4415-019c-4161-ae84-fbba3c4dc2ad"). InnerVolumeSpecName "kube-api-access-xh2jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.200072 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf2b4415-019c-4161-ae84-fbba3c4dc2ad" (UID: "cf2b4415-019c-4161-ae84-fbba3c4dc2ad"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.207114 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf2b4415-019c-4161-ae84-fbba3c4dc2ad" (UID: "cf2b4415-019c-4161-ae84-fbba3c4dc2ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.211497 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-config" (OuterVolumeSpecName: "config") pod "cf2b4415-019c-4161-ae84-fbba3c4dc2ad" (UID: "cf2b4415-019c-4161-ae84-fbba3c4dc2ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.218526 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "cf2b4415-019c-4161-ae84-fbba3c4dc2ad" (UID: "cf2b4415-019c-4161-ae84-fbba3c4dc2ad"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.219494 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf2b4415-019c-4161-ae84-fbba3c4dc2ad" (UID: "cf2b4415-019c-4161-ae84-fbba3c4dc2ad"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.245158 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh2jt\" (UniqueName: \"kubernetes.io/projected/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-kube-api-access-xh2jt\") on node \"crc\" DevicePath \"\"" Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.245190 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-config\") on node \"crc\" DevicePath \"\"" Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.245200 4747 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.245208 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.245217 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.245226 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf2b4415-019c-4161-ae84-fbba3c4dc2ad-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.968561 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" event={"ID":"5958f892-c6a2-4203-b7ef-f20a17c75771","Type":"ContainerStarted","Data":"8c852c6d49feae97a5b6e960eac62106ea499f0c335cdad775d651401253c845"} Dec 05 22:29:57 crc kubenswrapper[4747]: I1205 22:29:57.968627 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" Dec 05 22:29:58 crc kubenswrapper[4747]: I1205 22:29:58.002462 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" podStartSLOduration=2.6965677770000003 podStartE2EDuration="13.002442209s" podCreationTimestamp="2025-12-05 22:29:45 +0000 UTC" firstStartedPulling="2025-12-05 22:29:46.593831366 +0000 UTC m=+6457.061138854" lastFinishedPulling="2025-12-05 22:29:56.899705798 +0000 UTC m=+6467.367013286" observedRunningTime="2025-12-05 22:29:57.997477196 +0000 UTC m=+6468.464784694" watchObservedRunningTime="2025-12-05 22:29:58.002442209 +0000 UTC m=+6468.469749707" Dec 05 22:29:58 crc kubenswrapper[4747]: I1205 22:29:58.026152 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv"] Dec 05 22:29:58 crc kubenswrapper[4747]: I1205 22:29:58.036096 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv"] Dec 05 22:29:59 crc kubenswrapper[4747]: I1205 22:29:59.441897 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6f7cd9cbd5-9vqfv" podUID="cf2b4415-019c-4161-ae84-fbba3c4dc2ad" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.155:5353: i/o timeout" Dec 05 22:29:59 crc kubenswrapper[4747]: I1205 22:29:59.870834 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf2b4415-019c-4161-ae84-fbba3c4dc2ad" path="/var/lib/kubelet/pods/cf2b4415-019c-4161-ae84-fbba3c4dc2ad/volumes" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.148704 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd"] Dec 05 22:30:00 crc kubenswrapper[4747]: E1205 22:30:00.149240 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf2b4415-019c-4161-ae84-fbba3c4dc2ad" containerName="dnsmasq-dns" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.149256 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf2b4415-019c-4161-ae84-fbba3c4dc2ad" containerName="dnsmasq-dns" Dec 05 22:30:00 crc kubenswrapper[4747]: E1205 22:30:00.149304 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf2b4415-019c-4161-ae84-fbba3c4dc2ad" containerName="init" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.149332 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf2b4415-019c-4161-ae84-fbba3c4dc2ad" containerName="init" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.149684 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf2b4415-019c-4161-ae84-fbba3c4dc2ad" containerName="dnsmasq-dns" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.150686 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.153099 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.153419 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.165962 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd"] Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.209287 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9fgr\" (UniqueName: \"kubernetes.io/projected/3033d749-826d-47fc-ace8-c06f765415a7-kube-api-access-w9fgr\") pod \"collect-profiles-29416230-2kqqd\" (UID: \"3033d749-826d-47fc-ace8-c06f765415a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.209403 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3033d749-826d-47fc-ace8-c06f765415a7-secret-volume\") pod \"collect-profiles-29416230-2kqqd\" (UID: \"3033d749-826d-47fc-ace8-c06f765415a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.209444 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3033d749-826d-47fc-ace8-c06f765415a7-config-volume\") pod \"collect-profiles-29416230-2kqqd\" (UID: \"3033d749-826d-47fc-ace8-c06f765415a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.311258 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3033d749-826d-47fc-ace8-c06f765415a7-secret-volume\") pod \"collect-profiles-29416230-2kqqd\" (UID: \"3033d749-826d-47fc-ace8-c06f765415a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.311354 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3033d749-826d-47fc-ace8-c06f765415a7-config-volume\") pod \"collect-profiles-29416230-2kqqd\" (UID: \"3033d749-826d-47fc-ace8-c06f765415a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.311524 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9fgr\" (UniqueName: \"kubernetes.io/projected/3033d749-826d-47fc-ace8-c06f765415a7-kube-api-access-w9fgr\") pod \"collect-profiles-29416230-2kqqd\" (UID: \"3033d749-826d-47fc-ace8-c06f765415a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.312549 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3033d749-826d-47fc-ace8-c06f765415a7-config-volume\") pod 
\"collect-profiles-29416230-2kqqd\" (UID: \"3033d749-826d-47fc-ace8-c06f765415a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.318989 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3033d749-826d-47fc-ace8-c06f765415a7-secret-volume\") pod \"collect-profiles-29416230-2kqqd\" (UID: \"3033d749-826d-47fc-ace8-c06f765415a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.334732 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9fgr\" (UniqueName: \"kubernetes.io/projected/3033d749-826d-47fc-ace8-c06f765415a7-kube-api-access-w9fgr\") pod \"collect-profiles-29416230-2kqqd\" (UID: \"3033d749-826d-47fc-ace8-c06f765415a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" Dec 05 22:30:00 crc kubenswrapper[4747]: I1205 22:30:00.482574 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" Dec 05 22:30:01 crc kubenswrapper[4747]: I1205 22:30:01.007470 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd"] Dec 05 22:30:02 crc kubenswrapper[4747]: I1205 22:30:02.012019 4747 generic.go:334] "Generic (PLEG): container finished" podID="3033d749-826d-47fc-ace8-c06f765415a7" containerID="defc4b565a59578fd9e755a9b56a1fae2fc79e6c9f2cd377d7a9c44523202eff" exitCode=0 Dec 05 22:30:02 crc kubenswrapper[4747]: I1205 22:30:02.012347 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" event={"ID":"3033d749-826d-47fc-ace8-c06f765415a7","Type":"ContainerDied","Data":"defc4b565a59578fd9e755a9b56a1fae2fc79e6c9f2cd377d7a9c44523202eff"} Dec 05 22:30:02 crc kubenswrapper[4747]: I1205 22:30:02.012375 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" event={"ID":"3033d749-826d-47fc-ace8-c06f765415a7","Type":"ContainerStarted","Data":"53d840152ce71381b5bb953c22299b758cc5db4fda84c03ec2b0ddeab7be5ec4"} Dec 05 22:30:03 crc kubenswrapper[4747]: I1205 22:30:03.050270 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-5f8t4"] Dec 05 22:30:03 crc kubenswrapper[4747]: I1205 22:30:03.062744 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-5f8t4"] Dec 05 22:30:03 crc kubenswrapper[4747]: I1205 22:30:03.506007 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" Dec 05 22:30:03 crc kubenswrapper[4747]: I1205 22:30:03.583483 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9fgr\" (UniqueName: \"kubernetes.io/projected/3033d749-826d-47fc-ace8-c06f765415a7-kube-api-access-w9fgr\") pod \"3033d749-826d-47fc-ace8-c06f765415a7\" (UID: \"3033d749-826d-47fc-ace8-c06f765415a7\") " Dec 05 22:30:03 crc kubenswrapper[4747]: I1205 22:30:03.583652 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3033d749-826d-47fc-ace8-c06f765415a7-config-volume\") pod \"3033d749-826d-47fc-ace8-c06f765415a7\" (UID: \"3033d749-826d-47fc-ace8-c06f765415a7\") " Dec 05 22:30:03 crc kubenswrapper[4747]: I1205 22:30:03.583727 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3033d749-826d-47fc-ace8-c06f765415a7-secret-volume\") pod \"3033d749-826d-47fc-ace8-c06f765415a7\" (UID: \"3033d749-826d-47fc-ace8-c06f765415a7\") " Dec 05 22:30:03 crc kubenswrapper[4747]: I1205 22:30:03.584451 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3033d749-826d-47fc-ace8-c06f765415a7-config-volume" (OuterVolumeSpecName: "config-volume") pod "3033d749-826d-47fc-ace8-c06f765415a7" (UID: "3033d749-826d-47fc-ace8-c06f765415a7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:30:03 crc kubenswrapper[4747]: I1205 22:30:03.589272 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3033d749-826d-47fc-ace8-c06f765415a7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3033d749-826d-47fc-ace8-c06f765415a7" (UID: "3033d749-826d-47fc-ace8-c06f765415a7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:30:03 crc kubenswrapper[4747]: I1205 22:30:03.589881 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3033d749-826d-47fc-ace8-c06f765415a7-kube-api-access-w9fgr" (OuterVolumeSpecName: "kube-api-access-w9fgr") pod "3033d749-826d-47fc-ace8-c06f765415a7" (UID: "3033d749-826d-47fc-ace8-c06f765415a7"). InnerVolumeSpecName "kube-api-access-w9fgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:30:03 crc kubenswrapper[4747]: I1205 22:30:03.686400 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3033d749-826d-47fc-ace8-c06f765415a7-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 22:30:03 crc kubenswrapper[4747]: I1205 22:30:03.686438 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3033d749-826d-47fc-ace8-c06f765415a7-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 22:30:03 crc kubenswrapper[4747]: I1205 22:30:03.686448 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9fgr\" (UniqueName: \"kubernetes.io/projected/3033d749-826d-47fc-ace8-c06f765415a7-kube-api-access-w9fgr\") on node \"crc\" DevicePath \"\"" Dec 05 22:30:03 crc kubenswrapper[4747]: I1205 22:30:03.851328 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b55883-d37f-4373-9dd6-bed4ff2b322f" path="/var/lib/kubelet/pods/a3b55883-d37f-4373-9dd6-bed4ff2b322f/volumes" Dec 05 22:30:04 crc kubenswrapper[4747]: I1205 22:30:04.035712 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" event={"ID":"3033d749-826d-47fc-ace8-c06f765415a7","Type":"ContainerDied","Data":"53d840152ce71381b5bb953c22299b758cc5db4fda84c03ec2b0ddeab7be5ec4"} Dec 05 22:30:04 crc kubenswrapper[4747]: I1205 22:30:04.035765 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53d840152ce71381b5bb953c22299b758cc5db4fda84c03ec2b0ddeab7be5ec4" Dec 05 22:30:04 crc kubenswrapper[4747]: I1205 22:30:04.035770 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd" Dec 05 22:30:04 crc kubenswrapper[4747]: I1205 22:30:04.566124 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"] Dec 05 22:30:04 crc kubenswrapper[4747]: I1205 22:30:04.576107 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416185-hpg46"] Dec 05 22:30:05 crc kubenswrapper[4747]: I1205 22:30:05.842445 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:30:05 crc kubenswrapper[4747]: E1205 22:30:05.843011 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:30:05 crc kubenswrapper[4747]: I1205 22:30:05.861166 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40576beb-7745-4b88-9260-d1a3ba62e574" path="/var/lib/kubelet/pods/40576beb-7745-4b88-9260-d1a3ba62e574/volumes" Dec 05 22:30:12 crc kubenswrapper[4747]: I1205 22:30:12.132479 4747 generic.go:334] "Generic (PLEG): container finished" podID="5958f892-c6a2-4203-b7ef-f20a17c75771" containerID="8c852c6d49feae97a5b6e960eac62106ea499f0c335cdad775d651401253c845" exitCode=0 Dec 05 22:30:12 crc kubenswrapper[4747]: I1205 22:30:12.132568 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" event={"ID":"5958f892-c6a2-4203-b7ef-f20a17c75771","Type":"ContainerDied","Data":"8c852c6d49feae97a5b6e960eac62106ea499f0c335cdad775d651401253c845"} Dec 05 22:30:13 crc kubenswrapper[4747]: I1205 22:30:13.672643 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:30:13 crc kubenswrapper[4747]: I1205 22:30:13.815283 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-ssh-key\") pod \"5958f892-c6a2-4203-b7ef-f20a17c75771\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " Dec 05 22:30:13 crc kubenswrapper[4747]: I1205 22:30:13.815361 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-pre-adoption-validation-combined-ca-bundle\") pod \"5958f892-c6a2-4203-b7ef-f20a17c75771\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " Dec 05 22:30:13 crc kubenswrapper[4747]: I1205 22:30:13.815660 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-inventory\") pod \"5958f892-c6a2-4203-b7ef-f20a17c75771\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " Dec 05 22:30:13 crc kubenswrapper[4747]: I1205 22:30:13.815768 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b76gw\" (UniqueName: \"kubernetes.io/projected/5958f892-c6a2-4203-b7ef-f20a17c75771-kube-api-access-b76gw\") pod \"5958f892-c6a2-4203-b7ef-f20a17c75771\" (UID: \"5958f892-c6a2-4203-b7ef-f20a17c75771\") " Dec 05 22:30:13 crc kubenswrapper[4747]: I1205 22:30:13.822100 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "5958f892-c6a2-4203-b7ef-f20a17c75771" (UID: "5958f892-c6a2-4203-b7ef-f20a17c75771"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:30:13 crc kubenswrapper[4747]: I1205 22:30:13.822235 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5958f892-c6a2-4203-b7ef-f20a17c75771-kube-api-access-b76gw" (OuterVolumeSpecName: "kube-api-access-b76gw") pod "5958f892-c6a2-4203-b7ef-f20a17c75771" (UID: "5958f892-c6a2-4203-b7ef-f20a17c75771"). InnerVolumeSpecName "kube-api-access-b76gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:30:13 crc kubenswrapper[4747]: I1205 22:30:13.846887 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-inventory" (OuterVolumeSpecName: "inventory") pod "5958f892-c6a2-4203-b7ef-f20a17c75771" (UID: "5958f892-c6a2-4203-b7ef-f20a17c75771"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:30:13 crc kubenswrapper[4747]: I1205 22:30:13.848114 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5958f892-c6a2-4203-b7ef-f20a17c75771" (UID: "5958f892-c6a2-4203-b7ef-f20a17c75771"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:30:13 crc kubenswrapper[4747]: I1205 22:30:13.919079 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:30:13 crc kubenswrapper[4747]: I1205 22:30:13.919126 4747 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:30:13 crc kubenswrapper[4747]: I1205 22:30:13.919143 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5958f892-c6a2-4203-b7ef-f20a17c75771-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 22:30:13 crc kubenswrapper[4747]: I1205 22:30:13.919159 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b76gw\" (UniqueName: \"kubernetes.io/projected/5958f892-c6a2-4203-b7ef-f20a17c75771-kube-api-access-b76gw\") on node \"crc\" DevicePath \"\"" Dec 05 22:30:14 crc kubenswrapper[4747]: I1205 22:30:14.155855 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" event={"ID":"5958f892-c6a2-4203-b7ef-f20a17c75771","Type":"ContainerDied","Data":"e4aa33e52b21fde8690dafbfeb185e6ce9ca7f8ae1a87b0696a94e85d5d420b7"} Dec 05 22:30:14 crc kubenswrapper[4747]: I1205 22:30:14.156196 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4aa33e52b21fde8690dafbfeb185e6ce9ca7f8ae1a87b0696a94e85d5d420b7" Dec 05 22:30:14 crc kubenswrapper[4747]: I1205 22:30:14.155913 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7" Dec 05 22:30:17 crc kubenswrapper[4747]: I1205 22:30:17.840429 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:30:17 crc kubenswrapper[4747]: E1205 22:30:17.841225 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:30:22 crc kubenswrapper[4747]: I1205 22:30:22.958990 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77"] Dec 05 22:30:22 crc kubenswrapper[4747]: E1205 22:30:22.961039 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3033d749-826d-47fc-ace8-c06f765415a7" containerName="collect-profiles" Dec 05 22:30:22 crc kubenswrapper[4747]: I1205 22:30:22.961149 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="3033d749-826d-47fc-ace8-c06f765415a7" containerName="collect-profiles" Dec 05 22:30:22 crc kubenswrapper[4747]: E1205 22:30:22.961253 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5958f892-c6a2-4203-b7ef-f20a17c75771" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 05 22:30:22 crc kubenswrapper[4747]: I1205 22:30:22.961327 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5958f892-c6a2-4203-b7ef-f20a17c75771" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 05 22:30:22 crc kubenswrapper[4747]: I1205 22:30:22.961828 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="3033d749-826d-47fc-ace8-c06f765415a7" containerName="collect-profiles" Dec 05 22:30:22 crc kubenswrapper[4747]: I1205 22:30:22.961973 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5958f892-c6a2-4203-b7ef-f20a17c75771" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 05 22:30:22 crc kubenswrapper[4747]: I1205 22:30:22.963053 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:30:22 crc kubenswrapper[4747]: I1205 22:30:22.966295 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:30:22 crc kubenswrapper[4747]: I1205 22:30:22.966772 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:30:22 crc kubenswrapper[4747]: I1205 22:30:22.967007 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:30:22 crc kubenswrapper[4747]: I1205 22:30:22.967190 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:30:22 crc kubenswrapper[4747]: I1205 22:30:22.992663 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77"] Dec 05 22:30:23 crc kubenswrapper[4747]: I1205 22:30:23.049758 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb8sn\" (UniqueName: \"kubernetes.io/projected/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-kube-api-access-sb8sn\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:30:23 crc kubenswrapper[4747]: I1205 22:30:23.049832 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:30:23 crc kubenswrapper[4747]: I1205 22:30:23.049961 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:30:23 crc kubenswrapper[4747]: I1205 22:30:23.050540 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:30:23 crc kubenswrapper[4747]: I1205 22:30:23.152568 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb8sn\" (UniqueName: \"kubernetes.io/projected/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-kube-api-access-sb8sn\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:30:23 crc kubenswrapper[4747]: I1205 22:30:23.152920 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-inventory\") pod 
\"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:30:23 crc kubenswrapper[4747]: I1205 22:30:23.152945 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:30:23 crc kubenswrapper[4747]: I1205 22:30:23.153024 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:30:23 crc kubenswrapper[4747]: I1205 22:30:23.159164 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:30:23 crc kubenswrapper[4747]: I1205 22:30:23.159501 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:30:23 crc kubenswrapper[4747]: I1205 22:30:23.161656 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:30:23 crc kubenswrapper[4747]: I1205 22:30:23.167669 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb8sn\" (UniqueName: \"kubernetes.io/projected/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-kube-api-access-sb8sn\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:30:23 crc kubenswrapper[4747]: I1205 22:30:23.286896 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:30:23 crc kubenswrapper[4747]: W1205 22:30:23.849239 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4a0930b_16ca_4e49_a32c_d28b4a9fac9f.slice/crio-8aadb4fb51161b18ef990fd7eb281f815a263cb250beaf42bf37497fec04782d WatchSource:0}: Error finding container 8aadb4fb51161b18ef990fd7eb281f815a263cb250beaf42bf37497fec04782d: Status 404 returned error can't find the container with id 8aadb4fb51161b18ef990fd7eb281f815a263cb250beaf42bf37497fec04782d Dec 05 22:30:23 crc kubenswrapper[4747]: I1205 22:30:23.855958 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77"] Dec 05 22:30:24 crc kubenswrapper[4747]: I1205 22:30:24.286248 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" event={"ID":"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f","Type":"ContainerStarted","Data":"8aadb4fb51161b18ef990fd7eb281f815a263cb250beaf42bf37497fec04782d"} Dec 05 22:30:25 crc kubenswrapper[4747]: I1205 22:30:25.309922 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" event={"ID":"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f","Type":"ContainerStarted","Data":"cb8eaaedf103d8dc7a4edf5a9944cb1d49007b9b12f17cbe307f8de43a9bff66"} Dec 05 22:30:25 crc kubenswrapper[4747]: I1205 22:30:25.341567 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" podStartSLOduration=2.857222111 podStartE2EDuration="3.341538434s" podCreationTimestamp="2025-12-05 22:30:22 +0000 UTC" firstStartedPulling="2025-12-05 22:30:23.852087749 +0000 UTC m=+6494.319395237" lastFinishedPulling="2025-12-05 22:30:24.336404072 +0000 UTC m=+6494.803711560" observedRunningTime="2025-12-05 22:30:25.333233738 +0000 UTC m=+6495.800541226" watchObservedRunningTime="2025-12-05 22:30:25.341538434 +0000 UTC m=+6495.808845932" Dec 05 22:30:30 crc kubenswrapper[4747]: I1205 22:30:30.840230 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:30:30 crc kubenswrapper[4747]: E1205 22:30:30.841479 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:30:36 crc kubenswrapper[4747]: I1205 22:30:36.332810 4747 scope.go:117] "RemoveContainer" containerID="5905b7f242b923e7681a4a4241ce1548d4a97f588ac07a78b632068fe5ca50c1" Dec 05 22:30:36 crc kubenswrapper[4747]: I1205 22:30:36.362051 4747 scope.go:117] "RemoveContainer" containerID="48128a43c59fe40eab9735127a94de1219f7001fb84a63e45f5dfbf8e66ad266" Dec 05 22:30:36 crc kubenswrapper[4747]: I1205 22:30:36.441739 4747 scope.go:117] "RemoveContainer" containerID="efed805f4af9af8371b7974366e301514b073a1718293a542a547dcee1aa9c7d" Dec 05 22:30:36 crc kubenswrapper[4747]: I1205 22:30:36.517626 4747 scope.go:117] "RemoveContainer" containerID="14af5fea711bf2cb847f9d0ce29f78277e6069d1808621cbdd0445faaed4b373" Dec 05 
22:30:44 crc kubenswrapper[4747]: I1205 22:30:44.841058 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:30:44 crc kubenswrapper[4747]: E1205 22:30:44.842232 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:30:59 crc kubenswrapper[4747]: I1205 22:30:59.850898 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:30:59 crc kubenswrapper[4747]: E1205 22:30:59.852562 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:31:11 crc kubenswrapper[4747]: I1205 22:31:11.841823 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:31:11 crc kubenswrapper[4747]: E1205 22:31:11.843274 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:31:23 crc kubenswrapper[4747]: I1205 22:31:23.055138 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-4gwlj"] Dec 05 22:31:23 crc kubenswrapper[4747]: I1205 22:31:23.068903 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-4gwlj"] Dec 05 22:31:23 crc kubenswrapper[4747]: I1205 22:31:23.855785 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f287a82-c82e-4c5f-a265-951d168c897e" path="/var/lib/kubelet/pods/8f287a82-c82e-4c5f-a265-951d168c897e/volumes" Dec 05 22:31:24 crc kubenswrapper[4747]: I1205 22:31:24.045053 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-0412-account-create-update-db9nh"] Dec 05 22:31:24 crc kubenswrapper[4747]: I1205 22:31:24.061880 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-0412-account-create-update-db9nh"] Dec 05 22:31:25 crc kubenswrapper[4747]: I1205 22:31:25.841022 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:31:25 crc kubenswrapper[4747]: E1205 22:31:25.841990 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" 
podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:31:25 crc kubenswrapper[4747]: I1205 22:31:25.864210 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c60cb8d4-27a8-4c76-a378-5df868826990" path="/var/lib/kubelet/pods/c60cb8d4-27a8-4c76-a378-5df868826990/volumes" Dec 05 22:31:30 crc kubenswrapper[4747]: I1205 22:31:30.045071 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-wtgcp"] Dec 05 22:31:30 crc kubenswrapper[4747]: I1205 22:31:30.060508 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-wtgcp"] Dec 05 22:31:31 crc kubenswrapper[4747]: I1205 22:31:31.042114 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-29e1-account-create-update-dvvqp"] Dec 05 22:31:31 crc kubenswrapper[4747]: I1205 22:31:31.058923 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-29e1-account-create-update-dvvqp"] Dec 05 22:31:31 crc kubenswrapper[4747]: I1205 22:31:31.852922 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0380903-b6cc-4d0a-8119-ad6579a98860" path="/var/lib/kubelet/pods/c0380903-b6cc-4d0a-8119-ad6579a98860/volumes" Dec 05 22:31:31 crc kubenswrapper[4747]: I1205 22:31:31.854314 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efeb221c-61d1-4bbf-8ca2-fc3580444bf6" path="/var/lib/kubelet/pods/efeb221c-61d1-4bbf-8ca2-fc3580444bf6/volumes" Dec 05 22:31:36 crc kubenswrapper[4747]: I1205 22:31:36.672293 4747 scope.go:117] "RemoveContainer" containerID="c9da9bcf9364f31691ed59c8f688fa8ef2168f318a25c952937fed5d48365a5f" Dec 05 22:31:36 crc kubenswrapper[4747]: I1205 22:31:36.699289 4747 scope.go:117] "RemoveContainer" containerID="6029c0b7e282df250bbe4edadc6756c1a9bcd960b80f34d693eb406ac19c0104" Dec 05 22:31:36 crc kubenswrapper[4747]: I1205 22:31:36.752407 4747 scope.go:117] "RemoveContainer" containerID="1fb5ccd6e72466bb36e14143f02a488b1e2483bb05445806abd50baa18259e70" Dec 05 22:31:36 crc kubenswrapper[4747]: I1205 22:31:36.803218 4747 scope.go:117] "RemoveContainer" containerID="b506ff932fd84664bbed4a96b32926250aa79910c577ee60b756651632f1377d" Dec 05 22:31:39 crc kubenswrapper[4747]: I1205 22:31:39.853920 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:31:39 crc kubenswrapper[4747]: E1205 22:31:39.855513 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:31:50 crc kubenswrapper[4747]: I1205 22:31:50.840037 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:31:50 crc kubenswrapper[4747]: E1205 22:31:50.840935 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 
05 22:32:02 crc kubenswrapper[4747]: I1205 22:32:02.840387 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:32:02 crc kubenswrapper[4747]: E1205 22:32:02.841259 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:32:13 crc kubenswrapper[4747]: I1205 22:32:13.839484 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:32:13 crc kubenswrapper[4747]: E1205 22:32:13.840311 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:32:15 crc kubenswrapper[4747]: I1205 22:32:15.053746 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-62hzt"] Dec 05 22:32:15 crc kubenswrapper[4747]: I1205 22:32:15.065900 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-62hzt"] Dec 05 22:32:15 crc kubenswrapper[4747]: I1205 22:32:15.868246 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff68c28-2112-446e-a942-d101afc19a5d" path="/var/lib/kubelet/pods/dff68c28-2112-446e-a942-d101afc19a5d/volumes" Dec 05 22:32:26 crc kubenswrapper[4747]: I1205 22:32:26.840352 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:32:26 crc kubenswrapper[4747]: E1205 22:32:26.842378 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:32:36 crc kubenswrapper[4747]: I1205 22:32:36.981271 4747 scope.go:117] "RemoveContainer" containerID="6b85761d555398543baa07a7bb86ed1f1ce161d1a478a8087bda0dc20e15c6f3" Dec 05 22:32:37 crc kubenswrapper[4747]: I1205 22:32:37.011637 4747 scope.go:117] "RemoveContainer" containerID="60e2eef4e7bd86021778eaf92f04fb73bc28126d400b83e391c83ec83fa7ba69" Dec 05 22:32:40 crc kubenswrapper[4747]: I1205 22:32:40.839684 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:32:40 crc kubenswrapper[4747]: E1205 22:32:40.840343 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:32:51 crc kubenswrapper[4747]: I1205 22:32:51.841569 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:32:51 crc kubenswrapper[4747]: E1205 22:32:51.843767 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:33:02 crc kubenswrapper[4747]: I1205 22:33:02.840889 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:33:02 crc kubenswrapper[4747]: E1205 22:33:02.841817 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:33:13 crc kubenswrapper[4747]: I1205 22:33:13.840314 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:33:13 crc kubenswrapper[4747]: E1205 22:33:13.841613 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:33:27 crc kubenswrapper[4747]: I1205 22:33:27.839328 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:33:27 crc kubenswrapper[4747]: E1205 22:33:27.840095 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:33:39 crc kubenswrapper[4747]: I1205 22:33:39.840570 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:33:40 crc kubenswrapper[4747]: I1205 22:33:40.837965 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"85ea30172b876f873a38bd02c8558a9198fdf1587fb33568420bb1faa5553f9a"} Dec 05 22:33:49 crc kubenswrapper[4747]: I1205 22:33:49.237869 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kd2sr"] Dec 05 22:33:49 crc kubenswrapper[4747]: I1205 22:33:49.243810 
4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:33:49 crc kubenswrapper[4747]: I1205 22:33:49.300942 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kd2sr"] Dec 05 22:33:49 crc kubenswrapper[4747]: I1205 22:33:49.327843 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-utilities\") pod \"community-operators-kd2sr\" (UID: \"c7e66d22-3666-4e7e-beb5-5dbc349b99a4\") " pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:33:49 crc kubenswrapper[4747]: I1205 22:33:49.327982 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-catalog-content\") pod \"community-operators-kd2sr\" (UID: \"c7e66d22-3666-4e7e-beb5-5dbc349b99a4\") " pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:33:49 crc kubenswrapper[4747]: I1205 22:33:49.328058 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z7tx\" (UniqueName: \"kubernetes.io/projected/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-kube-api-access-2z7tx\") pod \"community-operators-kd2sr\" (UID: \"c7e66d22-3666-4e7e-beb5-5dbc349b99a4\") " pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:33:49 crc kubenswrapper[4747]: I1205 22:33:49.430506 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-utilities\") pod \"community-operators-kd2sr\" (UID: \"c7e66d22-3666-4e7e-beb5-5dbc349b99a4\") " pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:33:49 crc kubenswrapper[4747]: I1205 22:33:49.431093 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-catalog-content\") pod \"community-operators-kd2sr\" (UID: \"c7e66d22-3666-4e7e-beb5-5dbc349b99a4\") " pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:33:49 crc kubenswrapper[4747]: I1205 22:33:49.431221 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z7tx\" (UniqueName: \"kubernetes.io/projected/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-kube-api-access-2z7tx\") pod \"community-operators-kd2sr\" (UID: \"c7e66d22-3666-4e7e-beb5-5dbc349b99a4\") " pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:33:49 crc kubenswrapper[4747]: I1205 22:33:49.431441 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-catalog-content\") pod \"community-operators-kd2sr\" (UID: \"c7e66d22-3666-4e7e-beb5-5dbc349b99a4\") " pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:33:49 crc kubenswrapper[4747]: I1205 22:33:49.431503 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-utilities\") pod \"community-operators-kd2sr\" (UID: \"c7e66d22-3666-4e7e-beb5-5dbc349b99a4\") " pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:33:49 crc kubenswrapper[4747]: I1205 
22:33:49.454692 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z7tx\" (UniqueName: \"kubernetes.io/projected/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-kube-api-access-2z7tx\") pod \"community-operators-kd2sr\" (UID: \"c7e66d22-3666-4e7e-beb5-5dbc349b99a4\") " pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:33:49 crc kubenswrapper[4747]: I1205 22:33:49.600914 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:33:50 crc kubenswrapper[4747]: W1205 22:33:50.106127 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7e66d22_3666_4e7e_beb5_5dbc349b99a4.slice/crio-c864a9563532ae0bdd1df66fba06c8f6454c130eb4b941819fab62e2a83da670 WatchSource:0}: Error finding container c864a9563532ae0bdd1df66fba06c8f6454c130eb4b941819fab62e2a83da670: Status 404 returned error can't find the container with id c864a9563532ae0bdd1df66fba06c8f6454c130eb4b941819fab62e2a83da670 Dec 05 22:33:50 crc kubenswrapper[4747]: I1205 22:33:50.107883 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kd2sr"] Dec 05 22:33:50 crc kubenswrapper[4747]: I1205 22:33:50.956647 4747 generic.go:334] "Generic (PLEG): container finished" podID="c7e66d22-3666-4e7e-beb5-5dbc349b99a4" containerID="4078a0a138b124c5c340558bfce08e703f6c48d70080c07bf3be3c14194edbdd" exitCode=0 Dec 05 22:33:50 crc kubenswrapper[4747]: I1205 22:33:50.956744 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kd2sr" event={"ID":"c7e66d22-3666-4e7e-beb5-5dbc349b99a4","Type":"ContainerDied","Data":"4078a0a138b124c5c340558bfce08e703f6c48d70080c07bf3be3c14194edbdd"} Dec 05 22:33:50 crc kubenswrapper[4747]: I1205 22:33:50.957039 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kd2sr" event={"ID":"c7e66d22-3666-4e7e-beb5-5dbc349b99a4","Type":"ContainerStarted","Data":"c864a9563532ae0bdd1df66fba06c8f6454c130eb4b941819fab62e2a83da670"} Dec 05 22:33:50 crc kubenswrapper[4747]: I1205 22:33:50.960726 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 22:33:51 crc kubenswrapper[4747]: I1205 22:33:51.967059 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kd2sr" event={"ID":"c7e66d22-3666-4e7e-beb5-5dbc349b99a4","Type":"ContainerStarted","Data":"c6f44c16d15418e0336c595f03e29294f1ddc0dbf5b143d1d2f6dea513b6e52e"} Dec 05 22:33:53 crc kubenswrapper[4747]: I1205 22:33:53.993942 4747 generic.go:334] "Generic (PLEG): container finished" podID="c7e66d22-3666-4e7e-beb5-5dbc349b99a4" containerID="c6f44c16d15418e0336c595f03e29294f1ddc0dbf5b143d1d2f6dea513b6e52e" exitCode=0 Dec 05 22:33:53 crc kubenswrapper[4747]: I1205 22:33:53.994094 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kd2sr" event={"ID":"c7e66d22-3666-4e7e-beb5-5dbc349b99a4","Type":"ContainerDied","Data":"c6f44c16d15418e0336c595f03e29294f1ddc0dbf5b143d1d2f6dea513b6e52e"} Dec 05 22:33:55 crc kubenswrapper[4747]: I1205 22:33:55.010365 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kd2sr" event={"ID":"c7e66d22-3666-4e7e-beb5-5dbc349b99a4","Type":"ContainerStarted","Data":"647ed0875c6d70a7c8455116fbe096e3e574a18b173a4150c32cac842b3562b2"} Dec 
05 22:33:55 crc kubenswrapper[4747]: I1205 22:33:55.042167 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kd2sr" podStartSLOduration=2.587125067 podStartE2EDuration="6.042149822s" podCreationTimestamp="2025-12-05 22:33:49 +0000 UTC" firstStartedPulling="2025-12-05 22:33:50.960367301 +0000 UTC m=+6701.427674799" lastFinishedPulling="2025-12-05 22:33:54.415392016 +0000 UTC m=+6704.882699554" observedRunningTime="2025-12-05 22:33:55.03639202 +0000 UTC m=+6705.503699508" watchObservedRunningTime="2025-12-05 22:33:55.042149822 +0000 UTC m=+6705.509457310" Dec 05 22:33:59 crc kubenswrapper[4747]: I1205 22:33:59.601770 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:33:59 crc kubenswrapper[4747]: I1205 22:33:59.602412 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:33:59 crc kubenswrapper[4747]: I1205 22:33:59.666840 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:34:00 crc kubenswrapper[4747]: I1205 22:34:00.169836 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:34:00 crc kubenswrapper[4747]: I1205 22:34:00.245730 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kd2sr"] Dec 05 22:34:02 crc kubenswrapper[4747]: I1205 22:34:02.099943 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kd2sr" podUID="c7e66d22-3666-4e7e-beb5-5dbc349b99a4" containerName="registry-server" containerID="cri-o://647ed0875c6d70a7c8455116fbe096e3e574a18b173a4150c32cac842b3562b2" gracePeriod=2 Dec 05 22:34:02 crc kubenswrapper[4747]: I1205 22:34:02.702622 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:34:02 crc kubenswrapper[4747]: I1205 22:34:02.855230 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-catalog-content\") pod \"c7e66d22-3666-4e7e-beb5-5dbc349b99a4\" (UID: \"c7e66d22-3666-4e7e-beb5-5dbc349b99a4\") " Dec 05 22:34:02 crc kubenswrapper[4747]: I1205 22:34:02.855530 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-utilities\") pod \"c7e66d22-3666-4e7e-beb5-5dbc349b99a4\" (UID: \"c7e66d22-3666-4e7e-beb5-5dbc349b99a4\") " Dec 05 22:34:02 crc kubenswrapper[4747]: I1205 22:34:02.855690 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z7tx\" (UniqueName: \"kubernetes.io/projected/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-kube-api-access-2z7tx\") pod \"c7e66d22-3666-4e7e-beb5-5dbc349b99a4\" (UID: \"c7e66d22-3666-4e7e-beb5-5dbc349b99a4\") " Dec 05 22:34:02 crc kubenswrapper[4747]: I1205 22:34:02.856443 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-utilities" (OuterVolumeSpecName: "utilities") pod "c7e66d22-3666-4e7e-beb5-5dbc349b99a4" (UID: "c7e66d22-3666-4e7e-beb5-5dbc349b99a4"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:34:02 crc kubenswrapper[4747]: I1205 22:34:02.866570 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-kube-api-access-2z7tx" (OuterVolumeSpecName: "kube-api-access-2z7tx") pod "c7e66d22-3666-4e7e-beb5-5dbc349b99a4" (UID: "c7e66d22-3666-4e7e-beb5-5dbc349b99a4"). InnerVolumeSpecName "kube-api-access-2z7tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:34:02 crc kubenswrapper[4747]: I1205 22:34:02.902842 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7e66d22-3666-4e7e-beb5-5dbc349b99a4" (UID: "c7e66d22-3666-4e7e-beb5-5dbc349b99a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:34:02 crc kubenswrapper[4747]: I1205 22:34:02.957959 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:34:02 crc kubenswrapper[4747]: I1205 22:34:02.957992 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z7tx\" (UniqueName: \"kubernetes.io/projected/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-kube-api-access-2z7tx\") on node \"crc\" DevicePath \"\"" Dec 05 22:34:02 crc kubenswrapper[4747]: I1205 22:34:02.958003 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7e66d22-3666-4e7e-beb5-5dbc349b99a4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.120052 4747 generic.go:334] "Generic (PLEG): container finished" podID="c7e66d22-3666-4e7e-beb5-5dbc349b99a4" containerID="647ed0875c6d70a7c8455116fbe096e3e574a18b173a4150c32cac842b3562b2" exitCode=0 Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.120129 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kd2sr" event={"ID":"c7e66d22-3666-4e7e-beb5-5dbc349b99a4","Type":"ContainerDied","Data":"647ed0875c6d70a7c8455116fbe096e3e574a18b173a4150c32cac842b3562b2"} Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.120152 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kd2sr" Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.120205 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kd2sr" event={"ID":"c7e66d22-3666-4e7e-beb5-5dbc349b99a4","Type":"ContainerDied","Data":"c864a9563532ae0bdd1df66fba06c8f6454c130eb4b941819fab62e2a83da670"} Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.120246 4747 scope.go:117] "RemoveContainer" containerID="647ed0875c6d70a7c8455116fbe096e3e574a18b173a4150c32cac842b3562b2" Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.164296 4747 scope.go:117] "RemoveContainer" containerID="c6f44c16d15418e0336c595f03e29294f1ddc0dbf5b143d1d2f6dea513b6e52e" Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.190246 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kd2sr"] Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.199906 4747 scope.go:117] "RemoveContainer" containerID="4078a0a138b124c5c340558bfce08e703f6c48d70080c07bf3be3c14194edbdd" Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.204244 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kd2sr"] Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.263775 4747 scope.go:117] "RemoveContainer" containerID="647ed0875c6d70a7c8455116fbe096e3e574a18b173a4150c32cac842b3562b2" Dec 05 22:34:03 crc kubenswrapper[4747]: E1205 22:34:03.264424 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"647ed0875c6d70a7c8455116fbe096e3e574a18b173a4150c32cac842b3562b2\": container with ID starting with 647ed0875c6d70a7c8455116fbe096e3e574a18b173a4150c32cac842b3562b2 not found: ID does not exist" containerID="647ed0875c6d70a7c8455116fbe096e3e574a18b173a4150c32cac842b3562b2" Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.264480 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"647ed0875c6d70a7c8455116fbe096e3e574a18b173a4150c32cac842b3562b2"} err="failed to get container status \"647ed0875c6d70a7c8455116fbe096e3e574a18b173a4150c32cac842b3562b2\": rpc error: code = NotFound desc = could not find container \"647ed0875c6d70a7c8455116fbe096e3e574a18b173a4150c32cac842b3562b2\": container with ID starting with 647ed0875c6d70a7c8455116fbe096e3e574a18b173a4150c32cac842b3562b2 not found: ID does not exist" Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.264510 4747 scope.go:117] "RemoveContainer" containerID="c6f44c16d15418e0336c595f03e29294f1ddc0dbf5b143d1d2f6dea513b6e52e" Dec 05 22:34:03 crc kubenswrapper[4747]: E1205 22:34:03.265170 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6f44c16d15418e0336c595f03e29294f1ddc0dbf5b143d1d2f6dea513b6e52e\": container with ID starting with c6f44c16d15418e0336c595f03e29294f1ddc0dbf5b143d1d2f6dea513b6e52e not found: ID does not exist" containerID="c6f44c16d15418e0336c595f03e29294f1ddc0dbf5b143d1d2f6dea513b6e52e" Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.265222 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6f44c16d15418e0336c595f03e29294f1ddc0dbf5b143d1d2f6dea513b6e52e"} err="failed to get container status \"c6f44c16d15418e0336c595f03e29294f1ddc0dbf5b143d1d2f6dea513b6e52e\": rpc error: code = NotFound desc = could not find 
container \"c6f44c16d15418e0336c595f03e29294f1ddc0dbf5b143d1d2f6dea513b6e52e\": container with ID starting with c6f44c16d15418e0336c595f03e29294f1ddc0dbf5b143d1d2f6dea513b6e52e not found: ID does not exist" Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.265255 4747 scope.go:117] "RemoveContainer" containerID="4078a0a138b124c5c340558bfce08e703f6c48d70080c07bf3be3c14194edbdd" Dec 05 22:34:03 crc kubenswrapper[4747]: E1205 22:34:03.265653 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4078a0a138b124c5c340558bfce08e703f6c48d70080c07bf3be3c14194edbdd\": container with ID starting with 4078a0a138b124c5c340558bfce08e703f6c48d70080c07bf3be3c14194edbdd not found: ID does not exist" containerID="4078a0a138b124c5c340558bfce08e703f6c48d70080c07bf3be3c14194edbdd" Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.265685 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4078a0a138b124c5c340558bfce08e703f6c48d70080c07bf3be3c14194edbdd"} err="failed to get container status \"4078a0a138b124c5c340558bfce08e703f6c48d70080c07bf3be3c14194edbdd\": rpc error: code = NotFound desc = could not find container \"4078a0a138b124c5c340558bfce08e703f6c48d70080c07bf3be3c14194edbdd\": container with ID starting with 4078a0a138b124c5c340558bfce08e703f6c48d70080c07bf3be3c14194edbdd not found: ID does not exist" Dec 05 22:34:03 crc kubenswrapper[4747]: I1205 22:34:03.854256 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7e66d22-3666-4e7e-beb5-5dbc349b99a4" path="/var/lib/kubelet/pods/c7e66d22-3666-4e7e-beb5-5dbc349b99a4/volumes" Dec 05 22:35:37 crc kubenswrapper[4747]: I1205 22:35:37.179135 4747 scope.go:117] "RemoveContainer" containerID="45fc492ccb6021fa147184f4a4c3fa61aa18b3237f009988927b1152235fcce7" Dec 05 22:35:37 crc kubenswrapper[4747]: I1205 22:35:37.203062 4747 scope.go:117] "RemoveContainer" containerID="9aa3ed7ab8901d887337079052ccf492610dbd63cb34d934e369beb2d96763e0" Dec 05 22:35:42 crc kubenswrapper[4747]: I1205 22:35:42.044403 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-tcjmq"] Dec 05 22:35:42 crc kubenswrapper[4747]: I1205 22:35:42.057043 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-e619-account-create-update-zth4w"] Dec 05 22:35:42 crc kubenswrapper[4747]: I1205 22:35:42.069069 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-tcjmq"] Dec 05 22:35:42 crc kubenswrapper[4747]: I1205 22:35:42.080450 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-e619-account-create-update-zth4w"] Dec 05 22:35:43 crc kubenswrapper[4747]: I1205 22:35:43.851372 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ec1e051-f8a5-463d-8f9c-bb428934e615" path="/var/lib/kubelet/pods/6ec1e051-f8a5-463d-8f9c-bb428934e615/volumes" Dec 05 22:35:43 crc kubenswrapper[4747]: I1205 22:35:43.852613 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f053ea-4de8-467e-8d7e-2cb34c061443" path="/var/lib/kubelet/pods/b5f053ea-4de8-467e-8d7e-2cb34c061443/volumes" Dec 05 22:35:54 crc kubenswrapper[4747]: I1205 22:35:54.066053 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-f9rfh"] Dec 05 22:35:54 crc kubenswrapper[4747]: I1205 22:35:54.076538 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-f9rfh"] Dec 05 22:35:55 crc 
kubenswrapper[4747]: I1205 22:35:55.852668 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b659051b-3448-44d2-a6f0-d8723d0ffab1" path="/var/lib/kubelet/pods/b659051b-3448-44d2-a6f0-d8723d0ffab1/volumes" Dec 05 22:36:06 crc kubenswrapper[4747]: I1205 22:36:06.222142 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:36:06 crc kubenswrapper[4747]: I1205 22:36:06.222998 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:36:36 crc kubenswrapper[4747]: I1205 22:36:36.221715 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:36:36 crc kubenswrapper[4747]: I1205 22:36:36.223892 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:36:37 crc kubenswrapper[4747]: I1205 22:36:37.272942 4747 scope.go:117] "RemoveContainer" containerID="c195241d2c901cea5f872e40f3971452903544c99e4fdea40d639d216227c84d" Dec 05 22:36:37 crc kubenswrapper[4747]: I1205 22:36:37.309193 4747 scope.go:117] "RemoveContainer" containerID="c237a1be16b8675c2ac2e3bbc7fa4204d153e681fe976968de65c60802c10cc6" Dec 05 22:36:37 crc kubenswrapper[4747]: I1205 22:36:37.389627 4747 scope.go:117] "RemoveContainer" containerID="24c679f1f2219adb3e37e4ef622bd53700f155fb937f310e515062fb34eab772" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.322508 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kp4hq"] Dec 05 22:37:01 crc kubenswrapper[4747]: E1205 22:37:01.323840 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e66d22-3666-4e7e-beb5-5dbc349b99a4" containerName="extract-utilities" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.323862 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e66d22-3666-4e7e-beb5-5dbc349b99a4" containerName="extract-utilities" Dec 05 22:37:01 crc kubenswrapper[4747]: E1205 22:37:01.323906 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e66d22-3666-4e7e-beb5-5dbc349b99a4" containerName="extract-content" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.323914 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e66d22-3666-4e7e-beb5-5dbc349b99a4" containerName="extract-content" Dec 05 22:37:01 crc kubenswrapper[4747]: E1205 22:37:01.323944 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e66d22-3666-4e7e-beb5-5dbc349b99a4" containerName="registry-server" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.323952 4747 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c7e66d22-3666-4e7e-beb5-5dbc349b99a4" containerName="registry-server" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.324169 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e66d22-3666-4e7e-beb5-5dbc349b99a4" containerName="registry-server" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.326176 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.364490 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kp4hq"] Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.515896 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-catalog-content\") pod \"certified-operators-kp4hq\" (UID: \"9bcb76b0-9c9c-47ae-8d40-682339fcae4b\") " pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.515947 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9xbc\" (UniqueName: \"kubernetes.io/projected/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-kube-api-access-g9xbc\") pod \"certified-operators-kp4hq\" (UID: \"9bcb76b0-9c9c-47ae-8d40-682339fcae4b\") " pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.516018 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-utilities\") pod \"certified-operators-kp4hq\" (UID: \"9bcb76b0-9c9c-47ae-8d40-682339fcae4b\") " pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.617531 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-catalog-content\") pod \"certified-operators-kp4hq\" (UID: \"9bcb76b0-9c9c-47ae-8d40-682339fcae4b\") " pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.617627 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9xbc\" (UniqueName: \"kubernetes.io/projected/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-kube-api-access-g9xbc\") pod \"certified-operators-kp4hq\" (UID: \"9bcb76b0-9c9c-47ae-8d40-682339fcae4b\") " pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.617721 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-utilities\") pod \"certified-operators-kp4hq\" (UID: \"9bcb76b0-9c9c-47ae-8d40-682339fcae4b\") " pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.618639 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-catalog-content\") pod \"certified-operators-kp4hq\" (UID: \"9bcb76b0-9c9c-47ae-8d40-682339fcae4b\") " pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.618683 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-utilities\") pod \"certified-operators-kp4hq\" (UID: \"9bcb76b0-9c9c-47ae-8d40-682339fcae4b\") " pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.649766 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9xbc\" (UniqueName: \"kubernetes.io/projected/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-kube-api-access-g9xbc\") pod \"certified-operators-kp4hq\" (UID: \"9bcb76b0-9c9c-47ae-8d40-682339fcae4b\") " pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:01 crc kubenswrapper[4747]: I1205 22:37:01.673397 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:02 crc kubenswrapper[4747]: I1205 22:37:02.216696 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kp4hq"] Dec 05 22:37:02 crc kubenswrapper[4747]: I1205 22:37:02.223626 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kp4hq" event={"ID":"9bcb76b0-9c9c-47ae-8d40-682339fcae4b","Type":"ContainerStarted","Data":"8ea5dec124817fd2a6a90abb6cdf1b8625bc11867d9cb9b0a9480d04db472861"} Dec 05 22:37:03 crc kubenswrapper[4747]: I1205 22:37:03.237893 4747 generic.go:334] "Generic (PLEG): container finished" podID="9bcb76b0-9c9c-47ae-8d40-682339fcae4b" containerID="b6ff309534e2cdd7371e672f54c010f2c3da301785943a78ef8b29df2a7e3fc7" exitCode=0 Dec 05 22:37:03 crc kubenswrapper[4747]: I1205 22:37:03.238164 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kp4hq" event={"ID":"9bcb76b0-9c9c-47ae-8d40-682339fcae4b","Type":"ContainerDied","Data":"b6ff309534e2cdd7371e672f54c010f2c3da301785943a78ef8b29df2a7e3fc7"} Dec 05 22:37:04 crc kubenswrapper[4747]: I1205 22:37:04.246752 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kp4hq" event={"ID":"9bcb76b0-9c9c-47ae-8d40-682339fcae4b","Type":"ContainerStarted","Data":"06b361e1250fc60ade73242acafa460baff6281fd0e1a1dfe9300dfef517363e"} Dec 05 22:37:04 crc kubenswrapper[4747]: I1205 22:37:04.469557 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bkzk9"] Dec 05 22:37:04 crc kubenswrapper[4747]: I1205 22:37:04.471966 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:04 crc kubenswrapper[4747]: I1205 22:37:04.485943 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkzk9"] Dec 05 22:37:04 crc kubenswrapper[4747]: I1205 22:37:04.495775 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18c190ea-a37d-4185-909e-0f75ac8773d7-catalog-content\") pod \"redhat-marketplace-bkzk9\" (UID: \"18c190ea-a37d-4185-909e-0f75ac8773d7\") " pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:04 crc kubenswrapper[4747]: I1205 22:37:04.496100 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shmrl\" (UniqueName: \"kubernetes.io/projected/18c190ea-a37d-4185-909e-0f75ac8773d7-kube-api-access-shmrl\") pod \"redhat-marketplace-bkzk9\" (UID: \"18c190ea-a37d-4185-909e-0f75ac8773d7\") " pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:04 crc kubenswrapper[4747]: I1205 22:37:04.496412 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18c190ea-a37d-4185-909e-0f75ac8773d7-utilities\") pod \"redhat-marketplace-bkzk9\" (UID: \"18c190ea-a37d-4185-909e-0f75ac8773d7\") " pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:04 crc kubenswrapper[4747]: I1205 22:37:04.599401 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18c190ea-a37d-4185-909e-0f75ac8773d7-utilities\") pod \"redhat-marketplace-bkzk9\" (UID: \"18c190ea-a37d-4185-909e-0f75ac8773d7\") " pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:04 crc kubenswrapper[4747]: I1205 22:37:04.599650 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18c190ea-a37d-4185-909e-0f75ac8773d7-catalog-content\") pod \"redhat-marketplace-bkzk9\" (UID: \"18c190ea-a37d-4185-909e-0f75ac8773d7\") " pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:04 crc kubenswrapper[4747]: I1205 22:37:04.599701 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shmrl\" (UniqueName: \"kubernetes.io/projected/18c190ea-a37d-4185-909e-0f75ac8773d7-kube-api-access-shmrl\") pod \"redhat-marketplace-bkzk9\" (UID: \"18c190ea-a37d-4185-909e-0f75ac8773d7\") " pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:04 crc kubenswrapper[4747]: I1205 22:37:04.600220 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18c190ea-a37d-4185-909e-0f75ac8773d7-utilities\") pod \"redhat-marketplace-bkzk9\" (UID: \"18c190ea-a37d-4185-909e-0f75ac8773d7\") " pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:04 crc kubenswrapper[4747]: I1205 22:37:04.600496 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18c190ea-a37d-4185-909e-0f75ac8773d7-catalog-content\") pod \"redhat-marketplace-bkzk9\" (UID: \"18c190ea-a37d-4185-909e-0f75ac8773d7\") " pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:04 crc kubenswrapper[4747]: I1205 22:37:04.617918 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-shmrl\" (UniqueName: \"kubernetes.io/projected/18c190ea-a37d-4185-909e-0f75ac8773d7-kube-api-access-shmrl\") pod \"redhat-marketplace-bkzk9\" (UID: \"18c190ea-a37d-4185-909e-0f75ac8773d7\") " pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:04 crc kubenswrapper[4747]: I1205 22:37:04.796197 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:05 crc kubenswrapper[4747]: I1205 22:37:05.077112 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z4q8v"] Dec 05 22:37:05 crc kubenswrapper[4747]: I1205 22:37:05.079982 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:05 crc kubenswrapper[4747]: I1205 22:37:05.112041 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z4q8v"] Dec 05 22:37:05 crc kubenswrapper[4747]: I1205 22:37:05.131370 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-catalog-content\") pod \"redhat-operators-z4q8v\" (UID: \"a68c520b-ad13-4a92-bfe9-e55bd58ae22d\") " pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:05 crc kubenswrapper[4747]: I1205 22:37:05.131679 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nndp\" (UniqueName: \"kubernetes.io/projected/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-kube-api-access-9nndp\") pod \"redhat-operators-z4q8v\" (UID: \"a68c520b-ad13-4a92-bfe9-e55bd58ae22d\") " pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:05 crc kubenswrapper[4747]: I1205 22:37:05.131799 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-utilities\") pod \"redhat-operators-z4q8v\" (UID: \"a68c520b-ad13-4a92-bfe9-e55bd58ae22d\") " pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:05 crc kubenswrapper[4747]: I1205 22:37:05.233955 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-catalog-content\") pod \"redhat-operators-z4q8v\" (UID: \"a68c520b-ad13-4a92-bfe9-e55bd58ae22d\") " pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:05 crc kubenswrapper[4747]: I1205 22:37:05.234070 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nndp\" (UniqueName: \"kubernetes.io/projected/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-kube-api-access-9nndp\") pod \"redhat-operators-z4q8v\" (UID: \"a68c520b-ad13-4a92-bfe9-e55bd58ae22d\") " pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:05 crc kubenswrapper[4747]: I1205 22:37:05.234109 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-utilities\") pod \"redhat-operators-z4q8v\" (UID: \"a68c520b-ad13-4a92-bfe9-e55bd58ae22d\") " pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:05 crc kubenswrapper[4747]: I1205 22:37:05.234502 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-catalog-content\") pod \"redhat-operators-z4q8v\" (UID: \"a68c520b-ad13-4a92-bfe9-e55bd58ae22d\") " pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:05 crc kubenswrapper[4747]: I1205 22:37:05.236843 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-utilities\") pod \"redhat-operators-z4q8v\" (UID: \"a68c520b-ad13-4a92-bfe9-e55bd58ae22d\") " pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:05 crc kubenswrapper[4747]: I1205 22:37:05.254260 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nndp\" (UniqueName: \"kubernetes.io/projected/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-kube-api-access-9nndp\") pod \"redhat-operators-z4q8v\" (UID: \"a68c520b-ad13-4a92-bfe9-e55bd58ae22d\") " pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:05 crc kubenswrapper[4747]: I1205 22:37:05.352385 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkzk9"] Dec 05 22:37:05 crc kubenswrapper[4747]: W1205 22:37:05.362208 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18c190ea_a37d_4185_909e_0f75ac8773d7.slice/crio-c416ec325a2fbd99921f98d963c6e543c96584a64ca24ce6f3a95efcb894cdaf WatchSource:0}: Error finding container c416ec325a2fbd99921f98d963c6e543c96584a64ca24ce6f3a95efcb894cdaf: Status 404 returned error can't find the container with id c416ec325a2fbd99921f98d963c6e543c96584a64ca24ce6f3a95efcb894cdaf Dec 05 22:37:05 crc kubenswrapper[4747]: I1205 22:37:05.407446 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:05 crc kubenswrapper[4747]: I1205 22:37:05.900419 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z4q8v"] Dec 05 22:37:06 crc kubenswrapper[4747]: I1205 22:37:06.222350 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:37:06 crc kubenswrapper[4747]: I1205 22:37:06.222882 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:37:06 crc kubenswrapper[4747]: I1205 22:37:06.222948 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 22:37:06 crc kubenswrapper[4747]: I1205 22:37:06.223977 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85ea30172b876f873a38bd02c8558a9198fdf1587fb33568420bb1faa5553f9a"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 22:37:06 crc kubenswrapper[4747]: I1205 22:37:06.224053 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://85ea30172b876f873a38bd02c8558a9198fdf1587fb33568420bb1faa5553f9a" gracePeriod=600 Dec 05 22:37:06 crc kubenswrapper[4747]: I1205 22:37:06.267496 4747 generic.go:334] "Generic (PLEG): container finished" podID="18c190ea-a37d-4185-909e-0f75ac8773d7" containerID="c2bd09253408d56cb24b430224d4a8902ddf5c4fff2eddb71ef6e0b08eff47bf" exitCode=0 Dec 05 22:37:06 crc kubenswrapper[4747]: I1205 22:37:06.267548 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkzk9" event={"ID":"18c190ea-a37d-4185-909e-0f75ac8773d7","Type":"ContainerDied","Data":"c2bd09253408d56cb24b430224d4a8902ddf5c4fff2eddb71ef6e0b08eff47bf"} Dec 05 22:37:06 crc kubenswrapper[4747]: I1205 22:37:06.267636 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkzk9" event={"ID":"18c190ea-a37d-4185-909e-0f75ac8773d7","Type":"ContainerStarted","Data":"c416ec325a2fbd99921f98d963c6e543c96584a64ca24ce6f3a95efcb894cdaf"} Dec 05 22:37:06 crc kubenswrapper[4747]: I1205 22:37:06.269877 4747 generic.go:334] "Generic (PLEG): container finished" podID="9bcb76b0-9c9c-47ae-8d40-682339fcae4b" containerID="06b361e1250fc60ade73242acafa460baff6281fd0e1a1dfe9300dfef517363e" exitCode=0 Dec 05 22:37:06 crc kubenswrapper[4747]: I1205 22:37:06.269937 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kp4hq" event={"ID":"9bcb76b0-9c9c-47ae-8d40-682339fcae4b","Type":"ContainerDied","Data":"06b361e1250fc60ade73242acafa460baff6281fd0e1a1dfe9300dfef517363e"} Dec 05 22:37:06 crc kubenswrapper[4747]: I1205 
22:37:06.272404 4747 generic.go:334] "Generic (PLEG): container finished" podID="a68c520b-ad13-4a92-bfe9-e55bd58ae22d" containerID="9c1b41a9dcfb2e00bd76b30158397195e6c960417c6bedd76c6405db2160c5d3" exitCode=0 Dec 05 22:37:06 crc kubenswrapper[4747]: I1205 22:37:06.272453 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4q8v" event={"ID":"a68c520b-ad13-4a92-bfe9-e55bd58ae22d","Type":"ContainerDied","Data":"9c1b41a9dcfb2e00bd76b30158397195e6c960417c6bedd76c6405db2160c5d3"} Dec 05 22:37:06 crc kubenswrapper[4747]: I1205 22:37:06.272485 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4q8v" event={"ID":"a68c520b-ad13-4a92-bfe9-e55bd58ae22d","Type":"ContainerStarted","Data":"a30f84b379b5cd94a9ca3c27ae371f4120bba8be298ecf28e7a1e6c22c8c7e67"} Dec 05 22:37:07 crc kubenswrapper[4747]: I1205 22:37:07.284957 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="85ea30172b876f873a38bd02c8558a9198fdf1587fb33568420bb1faa5553f9a" exitCode=0 Dec 05 22:37:07 crc kubenswrapper[4747]: I1205 22:37:07.285029 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"85ea30172b876f873a38bd02c8558a9198fdf1587fb33568420bb1faa5553f9a"} Dec 05 22:37:07 crc kubenswrapper[4747]: I1205 22:37:07.285637 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57"} Dec 05 22:37:07 crc kubenswrapper[4747]: I1205 22:37:07.285669 4747 scope.go:117] "RemoveContainer" containerID="2af74fa358ed045c01a7725660a5f8d23f920ae3ce5aa5d2fdb7bfb9ded64a5f" Dec 05 22:37:07 crc kubenswrapper[4747]: I1205 22:37:07.289523 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kp4hq" event={"ID":"9bcb76b0-9c9c-47ae-8d40-682339fcae4b","Type":"ContainerStarted","Data":"a6a489d0c6079a089f9a8b993a725ff77209dec2ae595b1a89e3f413fb931dab"} Dec 05 22:37:07 crc kubenswrapper[4747]: I1205 22:37:07.307935 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4q8v" event={"ID":"a68c520b-ad13-4a92-bfe9-e55bd58ae22d","Type":"ContainerStarted","Data":"70d2d8d27ba26314dbd9b0cb57c3dafb1957eb44c7b1960ede40c8013418bb4a"} Dec 05 22:37:07 crc kubenswrapper[4747]: I1205 22:37:07.313957 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkzk9" event={"ID":"18c190ea-a37d-4185-909e-0f75ac8773d7","Type":"ContainerStarted","Data":"764f0ad873f31b292bcba7be078a5c6b0d1b198271d6cecf7fcfef31b237fa53"} Dec 05 22:37:07 crc kubenswrapper[4747]: I1205 22:37:07.335110 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kp4hq" podStartSLOduration=2.870653333 podStartE2EDuration="6.335087439s" podCreationTimestamp="2025-12-05 22:37:01 +0000 UTC" firstStartedPulling="2025-12-05 22:37:03.240238087 +0000 UTC m=+6893.707545575" lastFinishedPulling="2025-12-05 22:37:06.704672193 +0000 UTC m=+6897.171979681" observedRunningTime="2025-12-05 22:37:07.322945301 +0000 UTC m=+6897.790252779" watchObservedRunningTime="2025-12-05 22:37:07.335087439 +0000 UTC 
m=+6897.802394927" Dec 05 22:37:09 crc kubenswrapper[4747]: I1205 22:37:09.342238 4747 generic.go:334] "Generic (PLEG): container finished" podID="18c190ea-a37d-4185-909e-0f75ac8773d7" containerID="764f0ad873f31b292bcba7be078a5c6b0d1b198271d6cecf7fcfef31b237fa53" exitCode=0 Dec 05 22:37:09 crc kubenswrapper[4747]: I1205 22:37:09.342753 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkzk9" event={"ID":"18c190ea-a37d-4185-909e-0f75ac8773d7","Type":"ContainerDied","Data":"764f0ad873f31b292bcba7be078a5c6b0d1b198271d6cecf7fcfef31b237fa53"} Dec 05 22:37:10 crc kubenswrapper[4747]: I1205 22:37:10.357706 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkzk9" event={"ID":"18c190ea-a37d-4185-909e-0f75ac8773d7","Type":"ContainerStarted","Data":"38c8817e757d6b466b33473089663ac3d55bcb7e2578ffdfa22a0e097b507195"} Dec 05 22:37:10 crc kubenswrapper[4747]: I1205 22:37:10.379802 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bkzk9" podStartSLOduration=2.7361135450000003 podStartE2EDuration="6.379785281s" podCreationTimestamp="2025-12-05 22:37:04 +0000 UTC" firstStartedPulling="2025-12-05 22:37:06.269395238 +0000 UTC m=+6896.736702726" lastFinishedPulling="2025-12-05 22:37:09.913066964 +0000 UTC m=+6900.380374462" observedRunningTime="2025-12-05 22:37:10.377854584 +0000 UTC m=+6900.845162072" watchObservedRunningTime="2025-12-05 22:37:10.379785281 +0000 UTC m=+6900.847092769" Dec 05 22:37:11 crc kubenswrapper[4747]: I1205 22:37:11.370831 4747 generic.go:334] "Generic (PLEG): container finished" podID="a68c520b-ad13-4a92-bfe9-e55bd58ae22d" containerID="70d2d8d27ba26314dbd9b0cb57c3dafb1957eb44c7b1960ede40c8013418bb4a" exitCode=0 Dec 05 22:37:11 crc kubenswrapper[4747]: I1205 22:37:11.371015 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4q8v" event={"ID":"a68c520b-ad13-4a92-bfe9-e55bd58ae22d","Type":"ContainerDied","Data":"70d2d8d27ba26314dbd9b0cb57c3dafb1957eb44c7b1960ede40c8013418bb4a"} Dec 05 22:37:11 crc kubenswrapper[4747]: I1205 22:37:11.674107 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:11 crc kubenswrapper[4747]: I1205 22:37:11.674244 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:12 crc kubenswrapper[4747]: I1205 22:37:12.381095 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4q8v" event={"ID":"a68c520b-ad13-4a92-bfe9-e55bd58ae22d","Type":"ContainerStarted","Data":"04bb2c898b9393aae45e06ba727747edb8dddcec4ba030e3dda04b184f7418b4"} Dec 05 22:37:12 crc kubenswrapper[4747]: I1205 22:37:12.408395 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z4q8v" podStartSLOduration=1.926641582 podStartE2EDuration="7.408374669s" podCreationTimestamp="2025-12-05 22:37:05 +0000 UTC" firstStartedPulling="2025-12-05 22:37:06.275060897 +0000 UTC m=+6896.742368385" lastFinishedPulling="2025-12-05 22:37:11.756793984 +0000 UTC m=+6902.224101472" observedRunningTime="2025-12-05 22:37:12.398453716 +0000 UTC m=+6902.865761204" watchObservedRunningTime="2025-12-05 22:37:12.408374669 +0000 UTC m=+6902.875682157" Dec 05 22:37:12 crc kubenswrapper[4747]: I1205 22:37:12.726913 4747 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-marketplace/certified-operators-kp4hq" podUID="9bcb76b0-9c9c-47ae-8d40-682339fcae4b" containerName="registry-server" probeResult="failure" output=< Dec 05 22:37:12 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Dec 05 22:37:12 crc kubenswrapper[4747]: > Dec 05 22:37:14 crc kubenswrapper[4747]: I1205 22:37:14.797149 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:14 crc kubenswrapper[4747]: I1205 22:37:14.797898 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:14 crc kubenswrapper[4747]: I1205 22:37:14.905859 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:15 crc kubenswrapper[4747]: I1205 22:37:15.408394 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:15 crc kubenswrapper[4747]: I1205 22:37:15.408832 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:15 crc kubenswrapper[4747]: I1205 22:37:15.505054 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:16 crc kubenswrapper[4747]: I1205 22:37:16.541625 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z4q8v" podUID="a68c520b-ad13-4a92-bfe9-e55bd58ae22d" containerName="registry-server" probeResult="failure" output=< Dec 05 22:37:16 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Dec 05 22:37:16 crc kubenswrapper[4747]: > Dec 05 22:37:18 crc kubenswrapper[4747]: I1205 22:37:18.264873 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkzk9"] Dec 05 22:37:18 crc kubenswrapper[4747]: I1205 22:37:18.265486 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bkzk9" podUID="18c190ea-a37d-4185-909e-0f75ac8773d7" containerName="registry-server" containerID="cri-o://38c8817e757d6b466b33473089663ac3d55bcb7e2578ffdfa22a0e097b507195" gracePeriod=2 Dec 05 22:37:18 crc kubenswrapper[4747]: I1205 22:37:18.440302 4747 generic.go:334] "Generic (PLEG): container finished" podID="18c190ea-a37d-4185-909e-0f75ac8773d7" containerID="38c8817e757d6b466b33473089663ac3d55bcb7e2578ffdfa22a0e097b507195" exitCode=0 Dec 05 22:37:18 crc kubenswrapper[4747]: I1205 22:37:18.440404 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkzk9" event={"ID":"18c190ea-a37d-4185-909e-0f75ac8773d7","Type":"ContainerDied","Data":"38c8817e757d6b466b33473089663ac3d55bcb7e2578ffdfa22a0e097b507195"} Dec 05 22:37:18 crc kubenswrapper[4747]: I1205 22:37:18.821089 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:18 crc kubenswrapper[4747]: I1205 22:37:18.962752 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18c190ea-a37d-4185-909e-0f75ac8773d7-catalog-content\") pod \"18c190ea-a37d-4185-909e-0f75ac8773d7\" (UID: \"18c190ea-a37d-4185-909e-0f75ac8773d7\") " Dec 05 22:37:18 crc kubenswrapper[4747]: I1205 22:37:18.963010 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18c190ea-a37d-4185-909e-0f75ac8773d7-utilities\") pod \"18c190ea-a37d-4185-909e-0f75ac8773d7\" (UID: \"18c190ea-a37d-4185-909e-0f75ac8773d7\") " Dec 05 22:37:18 crc kubenswrapper[4747]: I1205 22:37:18.963300 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shmrl\" (UniqueName: \"kubernetes.io/projected/18c190ea-a37d-4185-909e-0f75ac8773d7-kube-api-access-shmrl\") pod \"18c190ea-a37d-4185-909e-0f75ac8773d7\" (UID: \"18c190ea-a37d-4185-909e-0f75ac8773d7\") " Dec 05 22:37:18 crc kubenswrapper[4747]: I1205 22:37:18.964698 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18c190ea-a37d-4185-909e-0f75ac8773d7-utilities" (OuterVolumeSpecName: "utilities") pod "18c190ea-a37d-4185-909e-0f75ac8773d7" (UID: "18c190ea-a37d-4185-909e-0f75ac8773d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:37:18 crc kubenswrapper[4747]: I1205 22:37:18.965446 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18c190ea-a37d-4185-909e-0f75ac8773d7-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:37:18 crc kubenswrapper[4747]: I1205 22:37:18.982990 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c190ea-a37d-4185-909e-0f75ac8773d7-kube-api-access-shmrl" (OuterVolumeSpecName: "kube-api-access-shmrl") pod "18c190ea-a37d-4185-909e-0f75ac8773d7" (UID: "18c190ea-a37d-4185-909e-0f75ac8773d7"). InnerVolumeSpecName "kube-api-access-shmrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:37:19 crc kubenswrapper[4747]: I1205 22:37:19.003428 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18c190ea-a37d-4185-909e-0f75ac8773d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18c190ea-a37d-4185-909e-0f75ac8773d7" (UID: "18c190ea-a37d-4185-909e-0f75ac8773d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:37:19 crc kubenswrapper[4747]: I1205 22:37:19.068323 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shmrl\" (UniqueName: \"kubernetes.io/projected/18c190ea-a37d-4185-909e-0f75ac8773d7-kube-api-access-shmrl\") on node \"crc\" DevicePath \"\"" Dec 05 22:37:19 crc kubenswrapper[4747]: I1205 22:37:19.068367 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18c190ea-a37d-4185-909e-0f75ac8773d7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:37:19 crc kubenswrapper[4747]: I1205 22:37:19.451531 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkzk9" event={"ID":"18c190ea-a37d-4185-909e-0f75ac8773d7","Type":"ContainerDied","Data":"c416ec325a2fbd99921f98d963c6e543c96584a64ca24ce6f3a95efcb894cdaf"} Dec 05 22:37:19 crc kubenswrapper[4747]: I1205 22:37:19.451601 4747 scope.go:117] "RemoveContainer" containerID="38c8817e757d6b466b33473089663ac3d55bcb7e2578ffdfa22a0e097b507195" Dec 05 22:37:19 crc kubenswrapper[4747]: I1205 22:37:19.451666 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkzk9" Dec 05 22:37:19 crc kubenswrapper[4747]: I1205 22:37:19.490063 4747 scope.go:117] "RemoveContainer" containerID="764f0ad873f31b292bcba7be078a5c6b0d1b198271d6cecf7fcfef31b237fa53" Dec 05 22:37:19 crc kubenswrapper[4747]: I1205 22:37:19.498883 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkzk9"] Dec 05 22:37:19 crc kubenswrapper[4747]: I1205 22:37:19.511948 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkzk9"] Dec 05 22:37:19 crc kubenswrapper[4747]: I1205 22:37:19.525497 4747 scope.go:117] "RemoveContainer" containerID="c2bd09253408d56cb24b430224d4a8902ddf5c4fff2eddb71ef6e0b08eff47bf" Dec 05 22:37:19 crc kubenswrapper[4747]: I1205 22:37:19.856624 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18c190ea-a37d-4185-909e-0f75ac8773d7" path="/var/lib/kubelet/pods/18c190ea-a37d-4185-909e-0f75ac8773d7/volumes" Dec 05 22:37:21 crc kubenswrapper[4747]: I1205 22:37:21.732189 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:21 crc kubenswrapper[4747]: I1205 22:37:21.901286 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:25 crc kubenswrapper[4747]: I1205 22:37:25.456661 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:25 crc kubenswrapper[4747]: I1205 22:37:25.513575 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:26 crc kubenswrapper[4747]: I1205 22:37:26.468077 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kp4hq"] Dec 05 22:37:26 crc kubenswrapper[4747]: I1205 22:37:26.468448 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kp4hq" podUID="9bcb76b0-9c9c-47ae-8d40-682339fcae4b" containerName="registry-server" containerID="cri-o://a6a489d0c6079a089f9a8b993a725ff77209dec2ae595b1a89e3f413fb931dab" 
gracePeriod=2 Dec 05 22:37:26 crc kubenswrapper[4747]: I1205 22:37:26.995834 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.156684 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-catalog-content\") pod \"9bcb76b0-9c9c-47ae-8d40-682339fcae4b\" (UID: \"9bcb76b0-9c9c-47ae-8d40-682339fcae4b\") " Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.156929 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-utilities\") pod \"9bcb76b0-9c9c-47ae-8d40-682339fcae4b\" (UID: \"9bcb76b0-9c9c-47ae-8d40-682339fcae4b\") " Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.157011 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9xbc\" (UniqueName: \"kubernetes.io/projected/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-kube-api-access-g9xbc\") pod \"9bcb76b0-9c9c-47ae-8d40-682339fcae4b\" (UID: \"9bcb76b0-9c9c-47ae-8d40-682339fcae4b\") " Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.157632 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-utilities" (OuterVolumeSpecName: "utilities") pod "9bcb76b0-9c9c-47ae-8d40-682339fcae4b" (UID: "9bcb76b0-9c9c-47ae-8d40-682339fcae4b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.158609 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.167834 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-kube-api-access-g9xbc" (OuterVolumeSpecName: "kube-api-access-g9xbc") pod "9bcb76b0-9c9c-47ae-8d40-682339fcae4b" (UID: "9bcb76b0-9c9c-47ae-8d40-682339fcae4b"). InnerVolumeSpecName "kube-api-access-g9xbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.225849 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9bcb76b0-9c9c-47ae-8d40-682339fcae4b" (UID: "9bcb76b0-9c9c-47ae-8d40-682339fcae4b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.261273 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.261332 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9xbc\" (UniqueName: \"kubernetes.io/projected/9bcb76b0-9c9c-47ae-8d40-682339fcae4b-kube-api-access-g9xbc\") on node \"crc\" DevicePath \"\"" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.545391 4747 generic.go:334] "Generic (PLEG): container finished" podID="9bcb76b0-9c9c-47ae-8d40-682339fcae4b" containerID="a6a489d0c6079a089f9a8b993a725ff77209dec2ae595b1a89e3f413fb931dab" exitCode=0 Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.545532 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kp4hq" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.546871 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kp4hq" event={"ID":"9bcb76b0-9c9c-47ae-8d40-682339fcae4b","Type":"ContainerDied","Data":"a6a489d0c6079a089f9a8b993a725ff77209dec2ae595b1a89e3f413fb931dab"} Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.547053 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kp4hq" event={"ID":"9bcb76b0-9c9c-47ae-8d40-682339fcae4b","Type":"ContainerDied","Data":"8ea5dec124817fd2a6a90abb6cdf1b8625bc11867d9cb9b0a9480d04db472861"} Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.547099 4747 scope.go:117] "RemoveContainer" containerID="a6a489d0c6079a089f9a8b993a725ff77209dec2ae595b1a89e3f413fb931dab" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.583668 4747 scope.go:117] "RemoveContainer" containerID="06b361e1250fc60ade73242acafa460baff6281fd0e1a1dfe9300dfef517363e" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.613118 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kp4hq"] Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.629314 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kp4hq"] Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.647306 4747 scope.go:117] "RemoveContainer" containerID="b6ff309534e2cdd7371e672f54c010f2c3da301785943a78ef8b29df2a7e3fc7" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.707366 4747 scope.go:117] "RemoveContainer" containerID="a6a489d0c6079a089f9a8b993a725ff77209dec2ae595b1a89e3f413fb931dab" Dec 05 22:37:27 crc kubenswrapper[4747]: E1205 22:37:27.707922 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6a489d0c6079a089f9a8b993a725ff77209dec2ae595b1a89e3f413fb931dab\": container with ID starting with a6a489d0c6079a089f9a8b993a725ff77209dec2ae595b1a89e3f413fb931dab not found: ID does not exist" containerID="a6a489d0c6079a089f9a8b993a725ff77209dec2ae595b1a89e3f413fb931dab" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.707975 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6a489d0c6079a089f9a8b993a725ff77209dec2ae595b1a89e3f413fb931dab"} err="failed to get container status 
\"a6a489d0c6079a089f9a8b993a725ff77209dec2ae595b1a89e3f413fb931dab\": rpc error: code = NotFound desc = could not find container \"a6a489d0c6079a089f9a8b993a725ff77209dec2ae595b1a89e3f413fb931dab\": container with ID starting with a6a489d0c6079a089f9a8b993a725ff77209dec2ae595b1a89e3f413fb931dab not found: ID does not exist" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.708009 4747 scope.go:117] "RemoveContainer" containerID="06b361e1250fc60ade73242acafa460baff6281fd0e1a1dfe9300dfef517363e" Dec 05 22:37:27 crc kubenswrapper[4747]: E1205 22:37:27.708369 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06b361e1250fc60ade73242acafa460baff6281fd0e1a1dfe9300dfef517363e\": container with ID starting with 06b361e1250fc60ade73242acafa460baff6281fd0e1a1dfe9300dfef517363e not found: ID does not exist" containerID="06b361e1250fc60ade73242acafa460baff6281fd0e1a1dfe9300dfef517363e" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.708409 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06b361e1250fc60ade73242acafa460baff6281fd0e1a1dfe9300dfef517363e"} err="failed to get container status \"06b361e1250fc60ade73242acafa460baff6281fd0e1a1dfe9300dfef517363e\": rpc error: code = NotFound desc = could not find container \"06b361e1250fc60ade73242acafa460baff6281fd0e1a1dfe9300dfef517363e\": container with ID starting with 06b361e1250fc60ade73242acafa460baff6281fd0e1a1dfe9300dfef517363e not found: ID does not exist" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.708431 4747 scope.go:117] "RemoveContainer" containerID="b6ff309534e2cdd7371e672f54c010f2c3da301785943a78ef8b29df2a7e3fc7" Dec 05 22:37:27 crc kubenswrapper[4747]: E1205 22:37:27.708809 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6ff309534e2cdd7371e672f54c010f2c3da301785943a78ef8b29df2a7e3fc7\": container with ID starting with b6ff309534e2cdd7371e672f54c010f2c3da301785943a78ef8b29df2a7e3fc7 not found: ID does not exist" containerID="b6ff309534e2cdd7371e672f54c010f2c3da301785943a78ef8b29df2a7e3fc7" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.708834 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ff309534e2cdd7371e672f54c010f2c3da301785943a78ef8b29df2a7e3fc7"} err="failed to get container status \"b6ff309534e2cdd7371e672f54c010f2c3da301785943a78ef8b29df2a7e3fc7\": rpc error: code = NotFound desc = could not find container \"b6ff309534e2cdd7371e672f54c010f2c3da301785943a78ef8b29df2a7e3fc7\": container with ID starting with b6ff309534e2cdd7371e672f54c010f2c3da301785943a78ef8b29df2a7e3fc7 not found: ID does not exist" Dec 05 22:37:27 crc kubenswrapper[4747]: E1205 22:37:27.781525 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bcb76b0_9c9c_47ae_8d40_682339fcae4b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bcb76b0_9c9c_47ae_8d40_682339fcae4b.slice/crio-8ea5dec124817fd2a6a90abb6cdf1b8625bc11867d9cb9b0a9480d04db472861\": RecentStats: unable to find data in memory cache]" Dec 05 22:37:27 crc kubenswrapper[4747]: I1205 22:37:27.851237 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bcb76b0-9c9c-47ae-8d40-682339fcae4b" 
path="/var/lib/kubelet/pods/9bcb76b0-9c9c-47ae-8d40-682339fcae4b/volumes" Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.056882 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z4q8v"] Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.058273 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z4q8v" podUID="a68c520b-ad13-4a92-bfe9-e55bd58ae22d" containerName="registry-server" containerID="cri-o://04bb2c898b9393aae45e06ba727747edb8dddcec4ba030e3dda04b184f7418b4" gracePeriod=2 Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.577233 4747 generic.go:334] "Generic (PLEG): container finished" podID="a68c520b-ad13-4a92-bfe9-e55bd58ae22d" containerID="04bb2c898b9393aae45e06ba727747edb8dddcec4ba030e3dda04b184f7418b4" exitCode=0 Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.577283 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4q8v" event={"ID":"a68c520b-ad13-4a92-bfe9-e55bd58ae22d","Type":"ContainerDied","Data":"04bb2c898b9393aae45e06ba727747edb8dddcec4ba030e3dda04b184f7418b4"} Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.577610 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4q8v" event={"ID":"a68c520b-ad13-4a92-bfe9-e55bd58ae22d","Type":"ContainerDied","Data":"a30f84b379b5cd94a9ca3c27ae371f4120bba8be298ecf28e7a1e6c22c8c7e67"} Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.577632 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a30f84b379b5cd94a9ca3c27ae371f4120bba8be298ecf28e7a1e6c22c8c7e67" Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.613989 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.643362 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nndp\" (UniqueName: \"kubernetes.io/projected/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-kube-api-access-9nndp\") pod \"a68c520b-ad13-4a92-bfe9-e55bd58ae22d\" (UID: \"a68c520b-ad13-4a92-bfe9-e55bd58ae22d\") " Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.643571 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-catalog-content\") pod \"a68c520b-ad13-4a92-bfe9-e55bd58ae22d\" (UID: \"a68c520b-ad13-4a92-bfe9-e55bd58ae22d\") " Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.643646 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-utilities\") pod \"a68c520b-ad13-4a92-bfe9-e55bd58ae22d\" (UID: \"a68c520b-ad13-4a92-bfe9-e55bd58ae22d\") " Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.645364 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-utilities" (OuterVolumeSpecName: "utilities") pod "a68c520b-ad13-4a92-bfe9-e55bd58ae22d" (UID: "a68c520b-ad13-4a92-bfe9-e55bd58ae22d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.650839 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-kube-api-access-9nndp" (OuterVolumeSpecName: "kube-api-access-9nndp") pod "a68c520b-ad13-4a92-bfe9-e55bd58ae22d" (UID: "a68c520b-ad13-4a92-bfe9-e55bd58ae22d"). InnerVolumeSpecName "kube-api-access-9nndp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.746606 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nndp\" (UniqueName: \"kubernetes.io/projected/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-kube-api-access-9nndp\") on node \"crc\" DevicePath \"\"" Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.746887 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.762451 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a68c520b-ad13-4a92-bfe9-e55bd58ae22d" (UID: "a68c520b-ad13-4a92-bfe9-e55bd58ae22d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:37:30 crc kubenswrapper[4747]: I1205 22:37:30.849148 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a68c520b-ad13-4a92-bfe9-e55bd58ae22d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:37:31 crc kubenswrapper[4747]: I1205 22:37:31.585169 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z4q8v" Dec 05 22:37:31 crc kubenswrapper[4747]: I1205 22:37:31.622933 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z4q8v"] Dec 05 22:37:31 crc kubenswrapper[4747]: I1205 22:37:31.635230 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z4q8v"] Dec 05 22:37:31 crc kubenswrapper[4747]: I1205 22:37:31.887339 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a68c520b-ad13-4a92-bfe9-e55bd58ae22d" path="/var/lib/kubelet/pods/a68c520b-ad13-4a92-bfe9-e55bd58ae22d/volumes" Dec 05 22:38:28 crc kubenswrapper[4747]: I1205 22:38:28.060274 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-p2l24"] Dec 05 22:38:28 crc kubenswrapper[4747]: I1205 22:38:28.072485 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-d7a4-account-create-update-khnhr"] Dec 05 22:38:28 crc kubenswrapper[4747]: I1205 22:38:28.083535 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-p2l24"] Dec 05 22:38:28 crc kubenswrapper[4747]: I1205 22:38:28.093237 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-d7a4-account-create-update-khnhr"] Dec 05 22:38:29 crc kubenswrapper[4747]: I1205 22:38:29.858558 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06bcd857-d3bb-4c46-a694-512effef5dd4" path="/var/lib/kubelet/pods/06bcd857-d3bb-4c46-a694-512effef5dd4/volumes" Dec 05 22:38:29 crc kubenswrapper[4747]: I1205 22:38:29.859755 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66cf0233-b5cf-4f1f-abdc-a852279804f4" path="/var/lib/kubelet/pods/66cf0233-b5cf-4f1f-abdc-a852279804f4/volumes" Dec 05 22:38:37 crc kubenswrapper[4747]: I1205 22:38:37.534178 4747 scope.go:117] "RemoveContainer" containerID="317e3c661021bd1e2e29166b6603e09ead137e6dfe57d94ebb3575e3a8b28b48" Dec 05 22:38:37 crc kubenswrapper[4747]: I1205 22:38:37.579673 4747 scope.go:117] "RemoveContainer" containerID="fa5c51acd66ecd9ff96c2e4e735a5c82813ce878804f8de7afc3dca52df932dc" Dec 05 22:38:39 crc kubenswrapper[4747]: I1205 22:38:39.029083 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-qgtp5"] Dec 05 22:38:39 crc kubenswrapper[4747]: I1205 22:38:39.039997 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-qgtp5"] Dec 05 22:38:39 crc kubenswrapper[4747]: I1205 22:38:39.853643 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f270eed-adab-4d18-86c6-6321ad34b6a8" path="/var/lib/kubelet/pods/8f270eed-adab-4d18-86c6-6321ad34b6a8/volumes" Dec 05 22:39:06 crc kubenswrapper[4747]: I1205 22:39:06.222393 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:39:06 crc kubenswrapper[4747]: I1205 22:39:06.222996 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:39:36 crc kubenswrapper[4747]: I1205 22:39:36.222448 4747 
patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:39:36 crc kubenswrapper[4747]: I1205 22:39:36.223219 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:39:37 crc kubenswrapper[4747]: I1205 22:39:37.719487 4747 scope.go:117] "RemoveContainer" containerID="17eb2e573028a36c93811fdaa10534b20ccdea666e6cb9a3e938b8d4a895f58f" Dec 05 22:40:06 crc kubenswrapper[4747]: I1205 22:40:06.222460 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:40:06 crc kubenswrapper[4747]: I1205 22:40:06.223117 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:40:06 crc kubenswrapper[4747]: I1205 22:40:06.223172 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 22:40:06 crc kubenswrapper[4747]: I1205 22:40:06.224184 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 22:40:06 crc kubenswrapper[4747]: I1205 22:40:06.224253 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" gracePeriod=600 Dec 05 22:40:06 crc kubenswrapper[4747]: E1205 22:40:06.352179 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:40:06 crc kubenswrapper[4747]: I1205 22:40:06.360355 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" exitCode=0 Dec 05 22:40:06 crc kubenswrapper[4747]: I1205 22:40:06.360392 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57"} Dec 05 22:40:06 crc kubenswrapper[4747]: I1205 22:40:06.360422 4747 scope.go:117] "RemoveContainer" containerID="85ea30172b876f873a38bd02c8558a9198fdf1587fb33568420bb1faa5553f9a" Dec 05 22:40:06 crc kubenswrapper[4747]: I1205 22:40:06.361233 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:40:06 crc kubenswrapper[4747]: E1205 22:40:06.361763 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:40:20 crc kubenswrapper[4747]: I1205 22:40:20.840064 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:40:20 crc kubenswrapper[4747]: E1205 22:40:20.840873 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:40:34 crc kubenswrapper[4747]: I1205 22:40:34.840438 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:40:34 crc kubenswrapper[4747]: E1205 22:40:34.841870 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:40:48 crc kubenswrapper[4747]: I1205 22:40:48.840154 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:40:48 crc kubenswrapper[4747]: E1205 22:40:48.841018 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:41:02 crc kubenswrapper[4747]: I1205 22:41:02.840559 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:41:02 crc kubenswrapper[4747]: E1205 22:41:02.841513 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:41:13 crc kubenswrapper[4747]: I1205 22:41:13.839826 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:41:13 crc kubenswrapper[4747]: E1205 22:41:13.840660 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:41:27 crc kubenswrapper[4747]: I1205 22:41:27.840720 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:41:27 crc kubenswrapper[4747]: E1205 22:41:27.841527 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:41:42 crc kubenswrapper[4747]: I1205 22:41:42.840206 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:41:42 crc kubenswrapper[4747]: E1205 22:41:42.841219 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:41:54 crc kubenswrapper[4747]: I1205 22:41:54.839296 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:41:54 crc kubenswrapper[4747]: E1205 22:41:54.840122 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:42:08 crc kubenswrapper[4747]: I1205 22:42:08.840422 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:42:08 crc kubenswrapper[4747]: E1205 22:42:08.841632 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" 
podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:42:20 crc kubenswrapper[4747]: I1205 22:42:20.840360 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:42:20 crc kubenswrapper[4747]: E1205 22:42:20.842638 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:42:33 crc kubenswrapper[4747]: I1205 22:42:33.841466 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:42:33 crc kubenswrapper[4747]: E1205 22:42:33.842220 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:42:45 crc kubenswrapper[4747]: I1205 22:42:45.842560 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:42:45 crc kubenswrapper[4747]: E1205 22:42:45.843352 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:42:59 crc kubenswrapper[4747]: I1205 22:42:59.846299 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:42:59 crc kubenswrapper[4747]: E1205 22:42:59.847068 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:43:04 crc kubenswrapper[4747]: I1205 22:43:04.392992 4747 generic.go:334] "Generic (PLEG): container finished" podID="a4a0930b-16ca-4e49-a32c-d28b4a9fac9f" containerID="cb8eaaedf103d8dc7a4edf5a9944cb1d49007b9b12f17cbe307f8de43a9bff66" exitCode=0 Dec 05 22:43:04 crc kubenswrapper[4747]: I1205 22:43:04.393064 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" event={"ID":"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f","Type":"ContainerDied","Data":"cb8eaaedf103d8dc7a4edf5a9944cb1d49007b9b12f17cbe307f8de43a9bff66"} Dec 05 22:43:05 crc kubenswrapper[4747]: I1205 22:43:05.897830 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:43:05 crc kubenswrapper[4747]: I1205 22:43:05.979867 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-inventory\") pod \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " Dec 05 22:43:05 crc kubenswrapper[4747]: I1205 22:43:05.980037 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-tripleo-cleanup-combined-ca-bundle\") pod \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " Dec 05 22:43:05 crc kubenswrapper[4747]: I1205 22:43:05.980064 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-ssh-key\") pod \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " Dec 05 22:43:05 crc kubenswrapper[4747]: I1205 22:43:05.980233 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb8sn\" (UniqueName: \"kubernetes.io/projected/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-kube-api-access-sb8sn\") pod \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\" (UID: \"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f\") " Dec 05 22:43:05 crc kubenswrapper[4747]: I1205 22:43:05.986745 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "a4a0930b-16ca-4e49-a32c-d28b4a9fac9f" (UID: "a4a0930b-16ca-4e49-a32c-d28b4a9fac9f"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:43:05 crc kubenswrapper[4747]: I1205 22:43:05.986779 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-kube-api-access-sb8sn" (OuterVolumeSpecName: "kube-api-access-sb8sn") pod "a4a0930b-16ca-4e49-a32c-d28b4a9fac9f" (UID: "a4a0930b-16ca-4e49-a32c-d28b4a9fac9f"). InnerVolumeSpecName "kube-api-access-sb8sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:43:06 crc kubenswrapper[4747]: I1205 22:43:06.023081 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-inventory" (OuterVolumeSpecName: "inventory") pod "a4a0930b-16ca-4e49-a32c-d28b4a9fac9f" (UID: "a4a0930b-16ca-4e49-a32c-d28b4a9fac9f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:43:06 crc kubenswrapper[4747]: I1205 22:43:06.027121 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a4a0930b-16ca-4e49-a32c-d28b4a9fac9f" (UID: "a4a0930b-16ca-4e49-a32c-d28b4a9fac9f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:43:06 crc kubenswrapper[4747]: I1205 22:43:06.085547 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 22:43:06 crc kubenswrapper[4747]: I1205 22:43:06.085630 4747 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:43:06 crc kubenswrapper[4747]: I1205 22:43:06.085653 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:43:06 crc kubenswrapper[4747]: I1205 22:43:06.085675 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb8sn\" (UniqueName: \"kubernetes.io/projected/a4a0930b-16ca-4e49-a32c-d28b4a9fac9f-kube-api-access-sb8sn\") on node \"crc\" DevicePath \"\"" Dec 05 22:43:06 crc kubenswrapper[4747]: I1205 22:43:06.424187 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" event={"ID":"a4a0930b-16ca-4e49-a32c-d28b4a9fac9f","Type":"ContainerDied","Data":"8aadb4fb51161b18ef990fd7eb281f815a263cb250beaf42bf37497fec04782d"} Dec 05 22:43:06 crc kubenswrapper[4747]: I1205 22:43:06.424240 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aadb4fb51161b18ef990fd7eb281f815a263cb250beaf42bf37497fec04782d" Dec 05 22:43:06 crc kubenswrapper[4747]: I1205 22:43:06.424309 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77" Dec 05 22:43:12 crc kubenswrapper[4747]: I1205 22:43:12.840283 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:43:12 crc kubenswrapper[4747]: E1205 22:43:12.841329 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:43:16 crc kubenswrapper[4747]: I1205 22:43:16.994480 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-6mdgd"] Dec 05 22:43:16 crc kubenswrapper[4747]: E1205 22:43:16.995530 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bcb76b0-9c9c-47ae-8d40-682339fcae4b" containerName="extract-content" Dec 05 22:43:16 crc kubenswrapper[4747]: I1205 22:43:16.995545 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bcb76b0-9c9c-47ae-8d40-682339fcae4b" containerName="extract-content" Dec 05 22:43:16 crc kubenswrapper[4747]: E1205 22:43:16.995572 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c190ea-a37d-4185-909e-0f75ac8773d7" containerName="extract-content" Dec 05 22:43:16 crc kubenswrapper[4747]: I1205 22:43:16.995579 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c190ea-a37d-4185-909e-0f75ac8773d7" containerName="extract-content" Dec 05 22:43:16 crc kubenswrapper[4747]: E1205 22:43:16.995601 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68c520b-ad13-4a92-bfe9-e55bd58ae22d" containerName="registry-server" Dec 05 22:43:16 crc kubenswrapper[4747]: I1205 22:43:16.995607 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68c520b-ad13-4a92-bfe9-e55bd58ae22d" containerName="registry-server" Dec 05 22:43:16 crc kubenswrapper[4747]: E1205 22:43:16.995619 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c190ea-a37d-4185-909e-0f75ac8773d7" containerName="extract-utilities" Dec 05 22:43:16 crc kubenswrapper[4747]: I1205 22:43:16.995627 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c190ea-a37d-4185-909e-0f75ac8773d7" containerName="extract-utilities" Dec 05 22:43:16 crc kubenswrapper[4747]: E1205 22:43:16.995639 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68c520b-ad13-4a92-bfe9-e55bd58ae22d" containerName="extract-utilities" Dec 05 22:43:16 crc kubenswrapper[4747]: I1205 22:43:16.995645 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68c520b-ad13-4a92-bfe9-e55bd58ae22d" containerName="extract-utilities" Dec 05 22:43:16 crc kubenswrapper[4747]: E1205 22:43:16.995664 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c190ea-a37d-4185-909e-0f75ac8773d7" containerName="registry-server" Dec 05 22:43:16 crc kubenswrapper[4747]: I1205 22:43:16.995670 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c190ea-a37d-4185-909e-0f75ac8773d7" containerName="registry-server" Dec 05 22:43:16 crc kubenswrapper[4747]: E1205 22:43:16.995679 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bcb76b0-9c9c-47ae-8d40-682339fcae4b" containerName="extract-utilities" Dec 05 22:43:16 
crc kubenswrapper[4747]: I1205 22:43:16.995687 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bcb76b0-9c9c-47ae-8d40-682339fcae4b" containerName="extract-utilities" Dec 05 22:43:16 crc kubenswrapper[4747]: E1205 22:43:16.995700 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68c520b-ad13-4a92-bfe9-e55bd58ae22d" containerName="extract-content" Dec 05 22:43:16 crc kubenswrapper[4747]: I1205 22:43:16.995705 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68c520b-ad13-4a92-bfe9-e55bd58ae22d" containerName="extract-content" Dec 05 22:43:16 crc kubenswrapper[4747]: E1205 22:43:16.995718 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a0930b-16ca-4e49-a32c-d28b4a9fac9f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 05 22:43:16 crc kubenswrapper[4747]: I1205 22:43:16.995725 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a0930b-16ca-4e49-a32c-d28b4a9fac9f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 05 22:43:16 crc kubenswrapper[4747]: E1205 22:43:16.995734 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bcb76b0-9c9c-47ae-8d40-682339fcae4b" containerName="registry-server" Dec 05 22:43:16 crc kubenswrapper[4747]: I1205 22:43:16.995739 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bcb76b0-9c9c-47ae-8d40-682339fcae4b" containerName="registry-server" Dec 05 22:43:16 crc kubenswrapper[4747]: I1205 22:43:16.995910 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bcb76b0-9c9c-47ae-8d40-682339fcae4b" containerName="registry-server" Dec 05 22:43:16 crc kubenswrapper[4747]: I1205 22:43:16.995922 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a68c520b-ad13-4a92-bfe9-e55bd58ae22d" containerName="registry-server" Dec 05 22:43:16 crc kubenswrapper[4747]: I1205 22:43:16.995929 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a0930b-16ca-4e49-a32c-d28b4a9fac9f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 05 22:43:16 crc kubenswrapper[4747]: I1205 22:43:16.995953 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c190ea-a37d-4185-909e-0f75ac8773d7" containerName="registry-server" Dec 05 22:43:16 crc kubenswrapper[4747]: I1205 22:43:16.996691 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.000078 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.000262 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.000384 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.000735 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.005436 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-6mdgd"] Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.042273 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-inventory\") pod \"bootstrap-openstack-openstack-cell1-6mdgd\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.042323 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-6mdgd\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.042647 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-6mdgd\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.042684 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4lhs\" (UniqueName: \"kubernetes.io/projected/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-kube-api-access-z4lhs\") pod \"bootstrap-openstack-openstack-cell1-6mdgd\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.145189 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-6mdgd\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.145475 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4lhs\" (UniqueName: \"kubernetes.io/projected/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-kube-api-access-z4lhs\") pod \"bootstrap-openstack-openstack-cell1-6mdgd\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:43:17 crc 
kubenswrapper[4747]: I1205 22:43:17.145940 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-inventory\") pod \"bootstrap-openstack-openstack-cell1-6mdgd\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.146124 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-6mdgd\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.151041 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-inventory\") pod \"bootstrap-openstack-openstack-cell1-6mdgd\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.151246 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-6mdgd\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.155668 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-6mdgd\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.165351 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4lhs\" (UniqueName: \"kubernetes.io/projected/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-kube-api-access-z4lhs\") pod \"bootstrap-openstack-openstack-cell1-6mdgd\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.326107 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.934834 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-6mdgd"] Dec 05 22:43:17 crc kubenswrapper[4747]: I1205 22:43:17.947450 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 22:43:18 crc kubenswrapper[4747]: I1205 22:43:18.543384 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" event={"ID":"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995","Type":"ContainerStarted","Data":"e372f40026774a5b4ba90cad5410e0732019c75acb519729558c21dfef43f059"} Dec 05 22:43:19 crc kubenswrapper[4747]: I1205 22:43:19.553654 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" event={"ID":"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995","Type":"ContainerStarted","Data":"9bea3361ddb8ffc65ea32b326bc3b3b9a94e6b1764f22100b94762ad223a6eac"} Dec 05 22:43:19 crc kubenswrapper[4747]: I1205 22:43:19.578054 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" podStartSLOduration=3.150604291 podStartE2EDuration="3.578033344s" podCreationTimestamp="2025-12-05 22:43:16 +0000 UTC" firstStartedPulling="2025-12-05 22:43:17.94722734 +0000 UTC m=+7268.414534818" lastFinishedPulling="2025-12-05 22:43:18.374656383 +0000 UTC m=+7268.841963871" observedRunningTime="2025-12-05 22:43:19.568053149 +0000 UTC m=+7270.035360677" watchObservedRunningTime="2025-12-05 22:43:19.578033344 +0000 UTC m=+7270.045340852" Dec 05 22:43:24 crc kubenswrapper[4747]: I1205 22:43:24.840823 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:43:24 crc kubenswrapper[4747]: E1205 22:43:24.841730 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:43:37 crc kubenswrapper[4747]: I1205 22:43:37.840484 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:43:37 crc kubenswrapper[4747]: E1205 22:43:37.841458 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:43:37 crc kubenswrapper[4747]: I1205 22:43:37.843457 4747 scope.go:117] "RemoveContainer" containerID="9c1b41a9dcfb2e00bd76b30158397195e6c960417c6bedd76c6405db2160c5d3" Dec 05 22:43:37 crc kubenswrapper[4747]: I1205 22:43:37.865968 4747 scope.go:117] "RemoveContainer" containerID="04bb2c898b9393aae45e06ba727747edb8dddcec4ba030e3dda04b184f7418b4" Dec 05 22:43:37 crc kubenswrapper[4747]: I1205 22:43:37.929450 4747 scope.go:117] "RemoveContainer" 
containerID="70d2d8d27ba26314dbd9b0cb57c3dafb1957eb44c7b1960ede40c8013418bb4a" Dec 05 22:43:48 crc kubenswrapper[4747]: I1205 22:43:48.840484 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:43:48 crc kubenswrapper[4747]: E1205 22:43:48.841658 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:44:01 crc kubenswrapper[4747]: I1205 22:44:01.843978 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:44:01 crc kubenswrapper[4747]: E1205 22:44:01.860219 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:44:03 crc kubenswrapper[4747]: I1205 22:44:03.457849 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zwssf"] Dec 05 22:44:03 crc kubenswrapper[4747]: I1205 22:44:03.461050 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:03 crc kubenswrapper[4747]: I1205 22:44:03.476696 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwssf"] Dec 05 22:44:03 crc kubenswrapper[4747]: I1205 22:44:03.556135 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45pfr\" (UniqueName: \"kubernetes.io/projected/0e47bb03-d805-40ad-885a-15b888c9373d-kube-api-access-45pfr\") pod \"community-operators-zwssf\" (UID: \"0e47bb03-d805-40ad-885a-15b888c9373d\") " pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:03 crc kubenswrapper[4747]: I1205 22:44:03.556263 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e47bb03-d805-40ad-885a-15b888c9373d-utilities\") pod \"community-operators-zwssf\" (UID: \"0e47bb03-d805-40ad-885a-15b888c9373d\") " pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:03 crc kubenswrapper[4747]: I1205 22:44:03.556391 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e47bb03-d805-40ad-885a-15b888c9373d-catalog-content\") pod \"community-operators-zwssf\" (UID: \"0e47bb03-d805-40ad-885a-15b888c9373d\") " pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:03 crc kubenswrapper[4747]: I1205 22:44:03.658531 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45pfr\" (UniqueName: \"kubernetes.io/projected/0e47bb03-d805-40ad-885a-15b888c9373d-kube-api-access-45pfr\") pod \"community-operators-zwssf\" (UID: 
\"0e47bb03-d805-40ad-885a-15b888c9373d\") " pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:03 crc kubenswrapper[4747]: I1205 22:44:03.658605 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e47bb03-d805-40ad-885a-15b888c9373d-utilities\") pod \"community-operators-zwssf\" (UID: \"0e47bb03-d805-40ad-885a-15b888c9373d\") " pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:03 crc kubenswrapper[4747]: I1205 22:44:03.658641 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e47bb03-d805-40ad-885a-15b888c9373d-catalog-content\") pod \"community-operators-zwssf\" (UID: \"0e47bb03-d805-40ad-885a-15b888c9373d\") " pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:03 crc kubenswrapper[4747]: I1205 22:44:03.659018 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e47bb03-d805-40ad-885a-15b888c9373d-utilities\") pod \"community-operators-zwssf\" (UID: \"0e47bb03-d805-40ad-885a-15b888c9373d\") " pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:03 crc kubenswrapper[4747]: I1205 22:44:03.659039 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e47bb03-d805-40ad-885a-15b888c9373d-catalog-content\") pod \"community-operators-zwssf\" (UID: \"0e47bb03-d805-40ad-885a-15b888c9373d\") " pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:03 crc kubenswrapper[4747]: I1205 22:44:03.678366 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45pfr\" (UniqueName: \"kubernetes.io/projected/0e47bb03-d805-40ad-885a-15b888c9373d-kube-api-access-45pfr\") pod \"community-operators-zwssf\" (UID: \"0e47bb03-d805-40ad-885a-15b888c9373d\") " pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:03 crc kubenswrapper[4747]: I1205 22:44:03.790054 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:04 crc kubenswrapper[4747]: I1205 22:44:04.450747 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwssf"] Dec 05 22:44:05 crc kubenswrapper[4747]: I1205 22:44:05.043791 4747 generic.go:334] "Generic (PLEG): container finished" podID="0e47bb03-d805-40ad-885a-15b888c9373d" containerID="0eecd76b76bc07b859caa33c9ddbd22cee922a810983f3288e6aaa73cfab565e" exitCode=0 Dec 05 22:44:05 crc kubenswrapper[4747]: I1205 22:44:05.043851 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwssf" event={"ID":"0e47bb03-d805-40ad-885a-15b888c9373d","Type":"ContainerDied","Data":"0eecd76b76bc07b859caa33c9ddbd22cee922a810983f3288e6aaa73cfab565e"} Dec 05 22:44:05 crc kubenswrapper[4747]: I1205 22:44:05.044130 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwssf" event={"ID":"0e47bb03-d805-40ad-885a-15b888c9373d","Type":"ContainerStarted","Data":"c6b4441f2bdc67383733cb00c2a853c3a37b15c7f7160f75770b36b004848af3"} Dec 05 22:44:06 crc kubenswrapper[4747]: I1205 22:44:06.053868 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwssf" event={"ID":"0e47bb03-d805-40ad-885a-15b888c9373d","Type":"ContainerStarted","Data":"d14b3f04fc4a66940f7daff8ae95be8563cd4098562b990d09f66a8008c1620f"} Dec 05 22:44:07 crc kubenswrapper[4747]: I1205 22:44:07.066197 4747 generic.go:334] "Generic (PLEG): container finished" podID="0e47bb03-d805-40ad-885a-15b888c9373d" containerID="d14b3f04fc4a66940f7daff8ae95be8563cd4098562b990d09f66a8008c1620f" exitCode=0 Dec 05 22:44:07 crc kubenswrapper[4747]: I1205 22:44:07.066244 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwssf" event={"ID":"0e47bb03-d805-40ad-885a-15b888c9373d","Type":"ContainerDied","Data":"d14b3f04fc4a66940f7daff8ae95be8563cd4098562b990d09f66a8008c1620f"} Dec 05 22:44:08 crc kubenswrapper[4747]: I1205 22:44:08.082059 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwssf" event={"ID":"0e47bb03-d805-40ad-885a-15b888c9373d","Type":"ContainerStarted","Data":"e323b73b2fee95299716a0e71a2c9ef5aaf84d3b45d1665453f5b77e821746d3"} Dec 05 22:44:08 crc kubenswrapper[4747]: I1205 22:44:08.115989 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zwssf" podStartSLOduration=2.666481797 podStartE2EDuration="5.115962318s" podCreationTimestamp="2025-12-05 22:44:03 +0000 UTC" firstStartedPulling="2025-12-05 22:44:05.045634746 +0000 UTC m=+7315.512942244" lastFinishedPulling="2025-12-05 22:44:07.495115267 +0000 UTC m=+7317.962422765" observedRunningTime="2025-12-05 22:44:08.106520076 +0000 UTC m=+7318.573827574" watchObservedRunningTime="2025-12-05 22:44:08.115962318 +0000 UTC m=+7318.583269816" Dec 05 22:44:12 crc kubenswrapper[4747]: I1205 22:44:12.840060 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:44:12 crc kubenswrapper[4747]: E1205 22:44:12.840854 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:44:13 crc kubenswrapper[4747]: I1205 22:44:13.790501 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:13 crc kubenswrapper[4747]: I1205 22:44:13.790823 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:13 crc kubenswrapper[4747]: I1205 22:44:13.853194 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:14 crc kubenswrapper[4747]: I1205 22:44:14.202245 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:14 crc kubenswrapper[4747]: I1205 22:44:14.268023 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwssf"] Dec 05 22:44:16 crc kubenswrapper[4747]: I1205 22:44:16.157459 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zwssf" podUID="0e47bb03-d805-40ad-885a-15b888c9373d" containerName="registry-server" containerID="cri-o://e323b73b2fee95299716a0e71a2c9ef5aaf84d3b45d1665453f5b77e821746d3" gracePeriod=2 Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.163122 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.177144 4747 generic.go:334] "Generic (PLEG): container finished" podID="0e47bb03-d805-40ad-885a-15b888c9373d" containerID="e323b73b2fee95299716a0e71a2c9ef5aaf84d3b45d1665453f5b77e821746d3" exitCode=0 Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.177191 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwssf" event={"ID":"0e47bb03-d805-40ad-885a-15b888c9373d","Type":"ContainerDied","Data":"e323b73b2fee95299716a0e71a2c9ef5aaf84d3b45d1665453f5b77e821746d3"} Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.177234 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zwssf" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.177251 4747 scope.go:117] "RemoveContainer" containerID="e323b73b2fee95299716a0e71a2c9ef5aaf84d3b45d1665453f5b77e821746d3" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.177238 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwssf" event={"ID":"0e47bb03-d805-40ad-885a-15b888c9373d","Type":"ContainerDied","Data":"c6b4441f2bdc67383733cb00c2a853c3a37b15c7f7160f75770b36b004848af3"} Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.212631 4747 scope.go:117] "RemoveContainer" containerID="d14b3f04fc4a66940f7daff8ae95be8563cd4098562b990d09f66a8008c1620f" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.246991 4747 scope.go:117] "RemoveContainer" containerID="0eecd76b76bc07b859caa33c9ddbd22cee922a810983f3288e6aaa73cfab565e" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.276323 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e47bb03-d805-40ad-885a-15b888c9373d-catalog-content\") pod \"0e47bb03-d805-40ad-885a-15b888c9373d\" (UID: \"0e47bb03-d805-40ad-885a-15b888c9373d\") " Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.276423 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45pfr\" (UniqueName: \"kubernetes.io/projected/0e47bb03-d805-40ad-885a-15b888c9373d-kube-api-access-45pfr\") pod \"0e47bb03-d805-40ad-885a-15b888c9373d\" (UID: \"0e47bb03-d805-40ad-885a-15b888c9373d\") " Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.276488 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e47bb03-d805-40ad-885a-15b888c9373d-utilities\") pod \"0e47bb03-d805-40ad-885a-15b888c9373d\" (UID: \"0e47bb03-d805-40ad-885a-15b888c9373d\") " Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.277492 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e47bb03-d805-40ad-885a-15b888c9373d-utilities" (OuterVolumeSpecName: "utilities") pod "0e47bb03-d805-40ad-885a-15b888c9373d" (UID: "0e47bb03-d805-40ad-885a-15b888c9373d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.278108 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e47bb03-d805-40ad-885a-15b888c9373d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.282577 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e47bb03-d805-40ad-885a-15b888c9373d-kube-api-access-45pfr" (OuterVolumeSpecName: "kube-api-access-45pfr") pod "0e47bb03-d805-40ad-885a-15b888c9373d" (UID: "0e47bb03-d805-40ad-885a-15b888c9373d"). InnerVolumeSpecName "kube-api-access-45pfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.291495 4747 scope.go:117] "RemoveContainer" containerID="e323b73b2fee95299716a0e71a2c9ef5aaf84d3b45d1665453f5b77e821746d3" Dec 05 22:44:17 crc kubenswrapper[4747]: E1205 22:44:17.296427 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e323b73b2fee95299716a0e71a2c9ef5aaf84d3b45d1665453f5b77e821746d3\": container with ID starting with e323b73b2fee95299716a0e71a2c9ef5aaf84d3b45d1665453f5b77e821746d3 not found: ID does not exist" containerID="e323b73b2fee95299716a0e71a2c9ef5aaf84d3b45d1665453f5b77e821746d3" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.296474 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e323b73b2fee95299716a0e71a2c9ef5aaf84d3b45d1665453f5b77e821746d3"} err="failed to get container status \"e323b73b2fee95299716a0e71a2c9ef5aaf84d3b45d1665453f5b77e821746d3\": rpc error: code = NotFound desc = could not find container \"e323b73b2fee95299716a0e71a2c9ef5aaf84d3b45d1665453f5b77e821746d3\": container with ID starting with e323b73b2fee95299716a0e71a2c9ef5aaf84d3b45d1665453f5b77e821746d3 not found: ID does not exist" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.296499 4747 scope.go:117] "RemoveContainer" containerID="d14b3f04fc4a66940f7daff8ae95be8563cd4098562b990d09f66a8008c1620f" Dec 05 22:44:17 crc kubenswrapper[4747]: E1205 22:44:17.296878 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d14b3f04fc4a66940f7daff8ae95be8563cd4098562b990d09f66a8008c1620f\": container with ID starting with d14b3f04fc4a66940f7daff8ae95be8563cd4098562b990d09f66a8008c1620f not found: ID does not exist" containerID="d14b3f04fc4a66940f7daff8ae95be8563cd4098562b990d09f66a8008c1620f" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.296912 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d14b3f04fc4a66940f7daff8ae95be8563cd4098562b990d09f66a8008c1620f"} err="failed to get container status \"d14b3f04fc4a66940f7daff8ae95be8563cd4098562b990d09f66a8008c1620f\": rpc error: code = NotFound desc = could not find container \"d14b3f04fc4a66940f7daff8ae95be8563cd4098562b990d09f66a8008c1620f\": container with ID starting with d14b3f04fc4a66940f7daff8ae95be8563cd4098562b990d09f66a8008c1620f not found: ID does not exist" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.296930 4747 scope.go:117] "RemoveContainer" containerID="0eecd76b76bc07b859caa33c9ddbd22cee922a810983f3288e6aaa73cfab565e" Dec 05 22:44:17 crc kubenswrapper[4747]: E1205 22:44:17.297292 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eecd76b76bc07b859caa33c9ddbd22cee922a810983f3288e6aaa73cfab565e\": container with ID starting with 0eecd76b76bc07b859caa33c9ddbd22cee922a810983f3288e6aaa73cfab565e not found: ID does not exist" containerID="0eecd76b76bc07b859caa33c9ddbd22cee922a810983f3288e6aaa73cfab565e" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.297352 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eecd76b76bc07b859caa33c9ddbd22cee922a810983f3288e6aaa73cfab565e"} err="failed to get container status \"0eecd76b76bc07b859caa33c9ddbd22cee922a810983f3288e6aaa73cfab565e\": rpc error: code = NotFound desc = could not 
find container \"0eecd76b76bc07b859caa33c9ddbd22cee922a810983f3288e6aaa73cfab565e\": container with ID starting with 0eecd76b76bc07b859caa33c9ddbd22cee922a810983f3288e6aaa73cfab565e not found: ID does not exist" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.342531 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e47bb03-d805-40ad-885a-15b888c9373d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e47bb03-d805-40ad-885a-15b888c9373d" (UID: "0e47bb03-d805-40ad-885a-15b888c9373d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.380482 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e47bb03-d805-40ad-885a-15b888c9373d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.380515 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45pfr\" (UniqueName: \"kubernetes.io/projected/0e47bb03-d805-40ad-885a-15b888c9373d-kube-api-access-45pfr\") on node \"crc\" DevicePath \"\"" Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.512735 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwssf"] Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.523010 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zwssf"] Dec 05 22:44:17 crc kubenswrapper[4747]: I1205 22:44:17.856804 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e47bb03-d805-40ad-885a-15b888c9373d" path="/var/lib/kubelet/pods/0e47bb03-d805-40ad-885a-15b888c9373d/volumes" Dec 05 22:44:23 crc kubenswrapper[4747]: I1205 22:44:23.840556 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:44:23 crc kubenswrapper[4747]: E1205 22:44:23.841219 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:44:37 crc kubenswrapper[4747]: I1205 22:44:37.839961 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:44:37 crc kubenswrapper[4747]: E1205 22:44:37.840724 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:44:49 crc kubenswrapper[4747]: I1205 22:44:49.847772 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:44:49 crc kubenswrapper[4747]: E1205 22:44:49.850984 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.165684 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd"] Dec 05 22:45:00 crc kubenswrapper[4747]: E1205 22:45:00.168403 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e47bb03-d805-40ad-885a-15b888c9373d" containerName="extract-utilities" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.168527 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e47bb03-d805-40ad-885a-15b888c9373d" containerName="extract-utilities" Dec 05 22:45:00 crc kubenswrapper[4747]: E1205 22:45:00.168664 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e47bb03-d805-40ad-885a-15b888c9373d" containerName="extract-content" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.168770 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e47bb03-d805-40ad-885a-15b888c9373d" containerName="extract-content" Dec 05 22:45:00 crc kubenswrapper[4747]: E1205 22:45:00.168882 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e47bb03-d805-40ad-885a-15b888c9373d" containerName="registry-server" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.168967 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e47bb03-d805-40ad-885a-15b888c9373d" containerName="registry-server" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.169353 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e47bb03-d805-40ad-885a-15b888c9373d" containerName="registry-server" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.170508 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.173720 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.173986 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.178644 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd"] Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.273114 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87085a8d-5db8-44df-9b7f-8dfa467fca76-secret-volume\") pod \"collect-profiles-29416245-c92qd\" (UID: \"87085a8d-5db8-44df-9b7f-8dfa467fca76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.273213 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95698\" (UniqueName: \"kubernetes.io/projected/87085a8d-5db8-44df-9b7f-8dfa467fca76-kube-api-access-95698\") pod \"collect-profiles-29416245-c92qd\" (UID: \"87085a8d-5db8-44df-9b7f-8dfa467fca76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.273372 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87085a8d-5db8-44df-9b7f-8dfa467fca76-config-volume\") pod \"collect-profiles-29416245-c92qd\" (UID: \"87085a8d-5db8-44df-9b7f-8dfa467fca76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.375220 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87085a8d-5db8-44df-9b7f-8dfa467fca76-secret-volume\") pod \"collect-profiles-29416245-c92qd\" (UID: \"87085a8d-5db8-44df-9b7f-8dfa467fca76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.375286 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95698\" (UniqueName: \"kubernetes.io/projected/87085a8d-5db8-44df-9b7f-8dfa467fca76-kube-api-access-95698\") pod \"collect-profiles-29416245-c92qd\" (UID: \"87085a8d-5db8-44df-9b7f-8dfa467fca76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.375366 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87085a8d-5db8-44df-9b7f-8dfa467fca76-config-volume\") pod \"collect-profiles-29416245-c92qd\" (UID: \"87085a8d-5db8-44df-9b7f-8dfa467fca76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.376338 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87085a8d-5db8-44df-9b7f-8dfa467fca76-config-volume\") pod 
\"collect-profiles-29416245-c92qd\" (UID: \"87085a8d-5db8-44df-9b7f-8dfa467fca76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.386685 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87085a8d-5db8-44df-9b7f-8dfa467fca76-secret-volume\") pod \"collect-profiles-29416245-c92qd\" (UID: \"87085a8d-5db8-44df-9b7f-8dfa467fca76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.394230 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95698\" (UniqueName: \"kubernetes.io/projected/87085a8d-5db8-44df-9b7f-8dfa467fca76-kube-api-access-95698\") pod \"collect-profiles-29416245-c92qd\" (UID: \"87085a8d-5db8-44df-9b7f-8dfa467fca76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" Dec 05 22:45:00 crc kubenswrapper[4747]: I1205 22:45:00.510014 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" Dec 05 22:45:01 crc kubenswrapper[4747]: I1205 22:45:01.008779 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd"] Dec 05 22:45:01 crc kubenswrapper[4747]: I1205 22:45:01.278691 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" event={"ID":"87085a8d-5db8-44df-9b7f-8dfa467fca76","Type":"ContainerStarted","Data":"cfdda23a8c197df926091239b32938e66582a660d02fcc6d06f5723ff024d112"} Dec 05 22:45:01 crc kubenswrapper[4747]: I1205 22:45:01.279079 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" event={"ID":"87085a8d-5db8-44df-9b7f-8dfa467fca76","Type":"ContainerStarted","Data":"6d07922a31dca3be522ef86a55f8e2c041dc72ee1befdb26e4335ca794dcf6f5"} Dec 05 22:45:01 crc kubenswrapper[4747]: I1205 22:45:01.315428 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" podStartSLOduration=1.315408086 podStartE2EDuration="1.315408086s" podCreationTimestamp="2025-12-05 22:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 22:45:01.294270177 +0000 UTC m=+7371.761577685" watchObservedRunningTime="2025-12-05 22:45:01.315408086 +0000 UTC m=+7371.782715574" Dec 05 22:45:01 crc kubenswrapper[4747]: I1205 22:45:01.839771 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:45:01 crc kubenswrapper[4747]: E1205 22:45:01.840133 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:45:02 crc kubenswrapper[4747]: I1205 22:45:02.292815 4747 generic.go:334] "Generic (PLEG): container finished" podID="87085a8d-5db8-44df-9b7f-8dfa467fca76" 
containerID="cfdda23a8c197df926091239b32938e66582a660d02fcc6d06f5723ff024d112" exitCode=0 Dec 05 22:45:02 crc kubenswrapper[4747]: I1205 22:45:02.292936 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" event={"ID":"87085a8d-5db8-44df-9b7f-8dfa467fca76","Type":"ContainerDied","Data":"cfdda23a8c197df926091239b32938e66582a660d02fcc6d06f5723ff024d112"} Dec 05 22:45:03 crc kubenswrapper[4747]: I1205 22:45:03.679145 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" Dec 05 22:45:03 crc kubenswrapper[4747]: I1205 22:45:03.776355 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95698\" (UniqueName: \"kubernetes.io/projected/87085a8d-5db8-44df-9b7f-8dfa467fca76-kube-api-access-95698\") pod \"87085a8d-5db8-44df-9b7f-8dfa467fca76\" (UID: \"87085a8d-5db8-44df-9b7f-8dfa467fca76\") " Dec 05 22:45:03 crc kubenswrapper[4747]: I1205 22:45:03.776680 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87085a8d-5db8-44df-9b7f-8dfa467fca76-config-volume\") pod \"87085a8d-5db8-44df-9b7f-8dfa467fca76\" (UID: \"87085a8d-5db8-44df-9b7f-8dfa467fca76\") " Dec 05 22:45:03 crc kubenswrapper[4747]: I1205 22:45:03.776932 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87085a8d-5db8-44df-9b7f-8dfa467fca76-secret-volume\") pod \"87085a8d-5db8-44df-9b7f-8dfa467fca76\" (UID: \"87085a8d-5db8-44df-9b7f-8dfa467fca76\") " Dec 05 22:45:03 crc kubenswrapper[4747]: I1205 22:45:03.777417 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87085a8d-5db8-44df-9b7f-8dfa467fca76-config-volume" (OuterVolumeSpecName: "config-volume") pod "87085a8d-5db8-44df-9b7f-8dfa467fca76" (UID: "87085a8d-5db8-44df-9b7f-8dfa467fca76"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:45:03 crc kubenswrapper[4747]: I1205 22:45:03.778035 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87085a8d-5db8-44df-9b7f-8dfa467fca76-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 22:45:03 crc kubenswrapper[4747]: I1205 22:45:03.782267 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87085a8d-5db8-44df-9b7f-8dfa467fca76-kube-api-access-95698" (OuterVolumeSpecName: "kube-api-access-95698") pod "87085a8d-5db8-44df-9b7f-8dfa467fca76" (UID: "87085a8d-5db8-44df-9b7f-8dfa467fca76"). InnerVolumeSpecName "kube-api-access-95698". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:45:03 crc kubenswrapper[4747]: I1205 22:45:03.794768 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87085a8d-5db8-44df-9b7f-8dfa467fca76-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "87085a8d-5db8-44df-9b7f-8dfa467fca76" (UID: "87085a8d-5db8-44df-9b7f-8dfa467fca76"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:45:03 crc kubenswrapper[4747]: I1205 22:45:03.880705 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95698\" (UniqueName: \"kubernetes.io/projected/87085a8d-5db8-44df-9b7f-8dfa467fca76-kube-api-access-95698\") on node \"crc\" DevicePath \"\"" Dec 05 22:45:03 crc kubenswrapper[4747]: I1205 22:45:03.880792 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87085a8d-5db8-44df-9b7f-8dfa467fca76-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 22:45:04 crc kubenswrapper[4747]: I1205 22:45:04.322683 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" event={"ID":"87085a8d-5db8-44df-9b7f-8dfa467fca76","Type":"ContainerDied","Data":"6d07922a31dca3be522ef86a55f8e2c041dc72ee1befdb26e4335ca794dcf6f5"} Dec 05 22:45:04 crc kubenswrapper[4747]: I1205 22:45:04.323075 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d07922a31dca3be522ef86a55f8e2c041dc72ee1befdb26e4335ca794dcf6f5" Dec 05 22:45:04 crc kubenswrapper[4747]: I1205 22:45:04.322971 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd" Dec 05 22:45:04 crc kubenswrapper[4747]: I1205 22:45:04.410364 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp"] Dec 05 22:45:04 crc kubenswrapper[4747]: I1205 22:45:04.430232 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416200-55mpp"] Dec 05 22:45:05 crc kubenswrapper[4747]: I1205 22:45:05.856913 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3754eb2-0273-45eb-8121-cf06e629ea9b" path="/var/lib/kubelet/pods/d3754eb2-0273-45eb-8121-cf06e629ea9b/volumes" Dec 05 22:45:16 crc kubenswrapper[4747]: I1205 22:45:16.840775 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:45:17 crc kubenswrapper[4747]: I1205 22:45:17.466023 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"336bd8beb0853acbee4eef9ccb58a53280fc0729cd7691408511293f6724672a"} Dec 05 22:45:38 crc kubenswrapper[4747]: I1205 22:45:38.016133 4747 scope.go:117] "RemoveContainer" containerID="0cd9464809c8414dbd6295011a20d4651ba0666075fe742473aef8fc9962dc38" Dec 05 22:46:23 crc kubenswrapper[4747]: I1205 22:46:23.382896 4747 generic.go:334] "Generic (PLEG): container finished" podID="b9a85f4f-ccf9-488d-bcfc-0adda4ca2995" containerID="9bea3361ddb8ffc65ea32b326bc3b3b9a94e6b1764f22100b94762ad223a6eac" exitCode=0 Dec 05 22:46:23 crc kubenswrapper[4747]: I1205 22:46:23.382990 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" event={"ID":"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995","Type":"ContainerDied","Data":"9bea3361ddb8ffc65ea32b326bc3b3b9a94e6b1764f22100b94762ad223a6eac"} Dec 05 22:46:24 crc kubenswrapper[4747]: I1205 22:46:24.842832 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.002262 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-inventory\") pod \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.002348 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4lhs\" (UniqueName: \"kubernetes.io/projected/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-kube-api-access-z4lhs\") pod \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.002423 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-ssh-key\") pod \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.002694 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-bootstrap-combined-ca-bundle\") pod \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\" (UID: \"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995\") " Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.010987 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-kube-api-access-z4lhs" (OuterVolumeSpecName: "kube-api-access-z4lhs") pod "b9a85f4f-ccf9-488d-bcfc-0adda4ca2995" (UID: "b9a85f4f-ccf9-488d-bcfc-0adda4ca2995"). InnerVolumeSpecName "kube-api-access-z4lhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.012837 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b9a85f4f-ccf9-488d-bcfc-0adda4ca2995" (UID: "b9a85f4f-ccf9-488d-bcfc-0adda4ca2995"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.036811 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b9a85f4f-ccf9-488d-bcfc-0adda4ca2995" (UID: "b9a85f4f-ccf9-488d-bcfc-0adda4ca2995"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.063245 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-inventory" (OuterVolumeSpecName: "inventory") pod "b9a85f4f-ccf9-488d-bcfc-0adda4ca2995" (UID: "b9a85f4f-ccf9-488d-bcfc-0adda4ca2995"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.106435 4747 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.106485 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.106499 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4lhs\" (UniqueName: \"kubernetes.io/projected/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-kube-api-access-z4lhs\") on node \"crc\" DevicePath \"\"" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.106512 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9a85f4f-ccf9-488d-bcfc-0adda4ca2995-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.423709 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" event={"ID":"b9a85f4f-ccf9-488d-bcfc-0adda4ca2995","Type":"ContainerDied","Data":"e372f40026774a5b4ba90cad5410e0732019c75acb519729558c21dfef43f059"} Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.423781 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e372f40026774a5b4ba90cad5410e0732019c75acb519729558c21dfef43f059" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.423779 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-6mdgd" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.528412 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-8jkzz"] Dec 05 22:46:25 crc kubenswrapper[4747]: E1205 22:46:25.529629 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a85f4f-ccf9-488d-bcfc-0adda4ca2995" containerName="bootstrap-openstack-openstack-cell1" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.529666 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a85f4f-ccf9-488d-bcfc-0adda4ca2995" containerName="bootstrap-openstack-openstack-cell1" Dec 05 22:46:25 crc kubenswrapper[4747]: E1205 22:46:25.529703 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87085a8d-5db8-44df-9b7f-8dfa467fca76" containerName="collect-profiles" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.529717 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="87085a8d-5db8-44df-9b7f-8dfa467fca76" containerName="collect-profiles" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.530137 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9a85f4f-ccf9-488d-bcfc-0adda4ca2995" containerName="bootstrap-openstack-openstack-cell1" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.530185 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="87085a8d-5db8-44df-9b7f-8dfa467fca76" containerName="collect-profiles" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.531518 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.537966 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.538017 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.538223 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.538866 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.550532 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-8jkzz"] Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.720769 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-ssh-key\") pod \"download-cache-openstack-openstack-cell1-8jkzz\" (UID: \"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b\") " pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.721552 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-inventory\") pod \"download-cache-openstack-openstack-cell1-8jkzz\" (UID: \"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b\") " pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.721746 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqbdg\" (UniqueName: \"kubernetes.io/projected/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-kube-api-access-wqbdg\") pod \"download-cache-openstack-openstack-cell1-8jkzz\" (UID: \"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b\") " pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.823828 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-ssh-key\") pod \"download-cache-openstack-openstack-cell1-8jkzz\" (UID: \"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b\") " pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.830896 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-inventory\") pod \"download-cache-openstack-openstack-cell1-8jkzz\" (UID: \"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b\") " pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.831097 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqbdg\" (UniqueName: \"kubernetes.io/projected/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-kube-api-access-wqbdg\") pod \"download-cache-openstack-openstack-cell1-8jkzz\" (UID: \"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b\") " pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" Dec 05 
22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.853990 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-inventory\") pod \"download-cache-openstack-openstack-cell1-8jkzz\" (UID: \"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b\") " pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.857542 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqbdg\" (UniqueName: \"kubernetes.io/projected/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-kube-api-access-wqbdg\") pod \"download-cache-openstack-openstack-cell1-8jkzz\" (UID: \"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b\") " pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" Dec 05 22:46:25 crc kubenswrapper[4747]: I1205 22:46:25.866243 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-ssh-key\") pod \"download-cache-openstack-openstack-cell1-8jkzz\" (UID: \"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b\") " pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" Dec 05 22:46:26 crc kubenswrapper[4747]: I1205 22:46:26.154460 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" Dec 05 22:46:26 crc kubenswrapper[4747]: I1205 22:46:26.706464 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-8jkzz"] Dec 05 22:46:26 crc kubenswrapper[4747]: W1205 22:46:26.709874 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f5aeb3e_a494_400c_b82b_6ff8c1c4112b.slice/crio-4a6edf9a74abe5a5cc1625d0c497fe0054c53b05c08eabd6713178d677b08cd0 WatchSource:0}: Error finding container 4a6edf9a74abe5a5cc1625d0c497fe0054c53b05c08eabd6713178d677b08cd0: Status 404 returned error can't find the container with id 4a6edf9a74abe5a5cc1625d0c497fe0054c53b05c08eabd6713178d677b08cd0 Dec 05 22:46:27 crc kubenswrapper[4747]: I1205 22:46:27.450428 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" event={"ID":"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b","Type":"ContainerStarted","Data":"4a6edf9a74abe5a5cc1625d0c497fe0054c53b05c08eabd6713178d677b08cd0"} Dec 05 22:46:28 crc kubenswrapper[4747]: I1205 22:46:28.466989 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" event={"ID":"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b","Type":"ContainerStarted","Data":"90b3e27f5048d8389b1c006e41fb0740aa5503ab4969b1ab17930792a03438b9"} Dec 05 22:46:28 crc kubenswrapper[4747]: I1205 22:46:28.489469 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" podStartSLOduration=2.917988432 podStartE2EDuration="3.4894323s" podCreationTimestamp="2025-12-05 22:46:25 +0000 UTC" firstStartedPulling="2025-12-05 22:46:26.712238173 +0000 UTC m=+7457.179545671" lastFinishedPulling="2025-12-05 22:46:27.283682051 +0000 UTC m=+7457.750989539" observedRunningTime="2025-12-05 22:46:28.487776209 +0000 UTC m=+7458.955083727" watchObservedRunningTime="2025-12-05 22:46:28.4894323 +0000 UTC m=+7458.956739868" Dec 05 22:47:36 crc kubenswrapper[4747]: I1205 22:47:36.222404 4747 patch_prober.go:28] 
interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:47:36 crc kubenswrapper[4747]: I1205 22:47:36.223030 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:47:38 crc kubenswrapper[4747]: I1205 22:47:38.853980 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sctps"] Dec 05 22:47:38 crc kubenswrapper[4747]: I1205 22:47:38.857265 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:38 crc kubenswrapper[4747]: I1205 22:47:38.869625 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sctps"] Dec 05 22:47:38 crc kubenswrapper[4747]: I1205 22:47:38.939763 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zflnc\" (UniqueName: \"kubernetes.io/projected/239457f7-44db-4b24-a29a-310ddfae68a5-kube-api-access-zflnc\") pod \"redhat-marketplace-sctps\" (UID: \"239457f7-44db-4b24-a29a-310ddfae68a5\") " pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:38 crc kubenswrapper[4747]: I1205 22:47:38.940132 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239457f7-44db-4b24-a29a-310ddfae68a5-catalog-content\") pod \"redhat-marketplace-sctps\" (UID: \"239457f7-44db-4b24-a29a-310ddfae68a5\") " pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:38 crc kubenswrapper[4747]: I1205 22:47:38.940283 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239457f7-44db-4b24-a29a-310ddfae68a5-utilities\") pod \"redhat-marketplace-sctps\" (UID: \"239457f7-44db-4b24-a29a-310ddfae68a5\") " pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:39 crc kubenswrapper[4747]: I1205 22:47:39.042536 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239457f7-44db-4b24-a29a-310ddfae68a5-utilities\") pod \"redhat-marketplace-sctps\" (UID: \"239457f7-44db-4b24-a29a-310ddfae68a5\") " pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:39 crc kubenswrapper[4747]: I1205 22:47:39.042685 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zflnc\" (UniqueName: \"kubernetes.io/projected/239457f7-44db-4b24-a29a-310ddfae68a5-kube-api-access-zflnc\") pod \"redhat-marketplace-sctps\" (UID: \"239457f7-44db-4b24-a29a-310ddfae68a5\") " pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:39 crc kubenswrapper[4747]: I1205 22:47:39.042746 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239457f7-44db-4b24-a29a-310ddfae68a5-catalog-content\") pod \"redhat-marketplace-sctps\" (UID: 
\"239457f7-44db-4b24-a29a-310ddfae68a5\") " pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:39 crc kubenswrapper[4747]: I1205 22:47:39.043156 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239457f7-44db-4b24-a29a-310ddfae68a5-catalog-content\") pod \"redhat-marketplace-sctps\" (UID: \"239457f7-44db-4b24-a29a-310ddfae68a5\") " pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:39 crc kubenswrapper[4747]: I1205 22:47:39.043169 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239457f7-44db-4b24-a29a-310ddfae68a5-utilities\") pod \"redhat-marketplace-sctps\" (UID: \"239457f7-44db-4b24-a29a-310ddfae68a5\") " pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:39 crc kubenswrapper[4747]: I1205 22:47:39.062507 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zflnc\" (UniqueName: \"kubernetes.io/projected/239457f7-44db-4b24-a29a-310ddfae68a5-kube-api-access-zflnc\") pod \"redhat-marketplace-sctps\" (UID: \"239457f7-44db-4b24-a29a-310ddfae68a5\") " pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:39 crc kubenswrapper[4747]: I1205 22:47:39.205445 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:39 crc kubenswrapper[4747]: I1205 22:47:39.695912 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sctps"] Dec 05 22:47:40 crc kubenswrapper[4747]: I1205 22:47:40.231032 4747 generic.go:334] "Generic (PLEG): container finished" podID="239457f7-44db-4b24-a29a-310ddfae68a5" containerID="db6dcca0f654af1df2383853870902cb34532d4f63e14f4f1d3ef9bebeee8158" exitCode=0 Dec 05 22:47:40 crc kubenswrapper[4747]: I1205 22:47:40.231138 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sctps" event={"ID":"239457f7-44db-4b24-a29a-310ddfae68a5","Type":"ContainerDied","Data":"db6dcca0f654af1df2383853870902cb34532d4f63e14f4f1d3ef9bebeee8158"} Dec 05 22:47:40 crc kubenswrapper[4747]: I1205 22:47:40.231376 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sctps" event={"ID":"239457f7-44db-4b24-a29a-310ddfae68a5","Type":"ContainerStarted","Data":"367a7630512796bd78d0fbb94add1c923c7d799deecb17f7272ac2d50badc74a"} Dec 05 22:47:41 crc kubenswrapper[4747]: I1205 22:47:41.242542 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sctps" event={"ID":"239457f7-44db-4b24-a29a-310ddfae68a5","Type":"ContainerStarted","Data":"90f696363ea87b86b6edadb1d5906bec7b7ca45f0f645792bd815b20143af7df"} Dec 05 22:47:42 crc kubenswrapper[4747]: I1205 22:47:42.253175 4747 generic.go:334] "Generic (PLEG): container finished" podID="239457f7-44db-4b24-a29a-310ddfae68a5" containerID="90f696363ea87b86b6edadb1d5906bec7b7ca45f0f645792bd815b20143af7df" exitCode=0 Dec 05 22:47:42 crc kubenswrapper[4747]: I1205 22:47:42.253987 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sctps" event={"ID":"239457f7-44db-4b24-a29a-310ddfae68a5","Type":"ContainerDied","Data":"90f696363ea87b86b6edadb1d5906bec7b7ca45f0f645792bd815b20143af7df"} Dec 05 22:47:43 crc kubenswrapper[4747]: I1205 22:47:43.264690 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-sctps" event={"ID":"239457f7-44db-4b24-a29a-310ddfae68a5","Type":"ContainerStarted","Data":"361666aa3758dc6915105db3a2e88e1c05fa901a1c8fd0b0ce8af2114b2a14d6"} Dec 05 22:47:43 crc kubenswrapper[4747]: I1205 22:47:43.293300 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sctps" podStartSLOduration=2.875947798 podStartE2EDuration="5.293279599s" podCreationTimestamp="2025-12-05 22:47:38 +0000 UTC" firstStartedPulling="2025-12-05 22:47:40.233507347 +0000 UTC m=+7530.700814855" lastFinishedPulling="2025-12-05 22:47:42.650839168 +0000 UTC m=+7533.118146656" observedRunningTime="2025-12-05 22:47:43.285113058 +0000 UTC m=+7533.752420546" watchObservedRunningTime="2025-12-05 22:47:43.293279599 +0000 UTC m=+7533.760587087" Dec 05 22:47:44 crc kubenswrapper[4747]: I1205 22:47:44.444882 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hqqx4"] Dec 05 22:47:44 crc kubenswrapper[4747]: I1205 22:47:44.447735 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:47:44 crc kubenswrapper[4747]: I1205 22:47:44.468136 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvcgg\" (UniqueName: \"kubernetes.io/projected/ea88d466-e95c-46ee-8ecd-d144d1513a2f-kube-api-access-pvcgg\") pod \"redhat-operators-hqqx4\" (UID: \"ea88d466-e95c-46ee-8ecd-d144d1513a2f\") " pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:47:44 crc kubenswrapper[4747]: I1205 22:47:44.468229 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea88d466-e95c-46ee-8ecd-d144d1513a2f-catalog-content\") pod \"redhat-operators-hqqx4\" (UID: \"ea88d466-e95c-46ee-8ecd-d144d1513a2f\") " pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:47:44 crc kubenswrapper[4747]: I1205 22:47:44.468280 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea88d466-e95c-46ee-8ecd-d144d1513a2f-utilities\") pod \"redhat-operators-hqqx4\" (UID: \"ea88d466-e95c-46ee-8ecd-d144d1513a2f\") " pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:47:44 crc kubenswrapper[4747]: I1205 22:47:44.471932 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hqqx4"] Dec 05 22:47:44 crc kubenswrapper[4747]: I1205 22:47:44.569939 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvcgg\" (UniqueName: \"kubernetes.io/projected/ea88d466-e95c-46ee-8ecd-d144d1513a2f-kube-api-access-pvcgg\") pod \"redhat-operators-hqqx4\" (UID: \"ea88d466-e95c-46ee-8ecd-d144d1513a2f\") " pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:47:44 crc kubenswrapper[4747]: I1205 22:47:44.570008 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea88d466-e95c-46ee-8ecd-d144d1513a2f-catalog-content\") pod \"redhat-operators-hqqx4\" (UID: \"ea88d466-e95c-46ee-8ecd-d144d1513a2f\") " pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:47:44 crc kubenswrapper[4747]: I1205 22:47:44.570045 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/ea88d466-e95c-46ee-8ecd-d144d1513a2f-utilities\") pod \"redhat-operators-hqqx4\" (UID: \"ea88d466-e95c-46ee-8ecd-d144d1513a2f\") " pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:47:44 crc kubenswrapper[4747]: I1205 22:47:44.570622 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea88d466-e95c-46ee-8ecd-d144d1513a2f-catalog-content\") pod \"redhat-operators-hqqx4\" (UID: \"ea88d466-e95c-46ee-8ecd-d144d1513a2f\") " pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:47:44 crc kubenswrapper[4747]: I1205 22:47:44.570757 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea88d466-e95c-46ee-8ecd-d144d1513a2f-utilities\") pod \"redhat-operators-hqqx4\" (UID: \"ea88d466-e95c-46ee-8ecd-d144d1513a2f\") " pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:47:44 crc kubenswrapper[4747]: I1205 22:47:44.590594 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvcgg\" (UniqueName: \"kubernetes.io/projected/ea88d466-e95c-46ee-8ecd-d144d1513a2f-kube-api-access-pvcgg\") pod \"redhat-operators-hqqx4\" (UID: \"ea88d466-e95c-46ee-8ecd-d144d1513a2f\") " pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:47:44 crc kubenswrapper[4747]: I1205 22:47:44.766783 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:47:45 crc kubenswrapper[4747]: I1205 22:47:45.345450 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hqqx4"] Dec 05 22:47:46 crc kubenswrapper[4747]: I1205 22:47:46.293812 4747 generic.go:334] "Generic (PLEG): container finished" podID="ea88d466-e95c-46ee-8ecd-d144d1513a2f" containerID="9ed47735d07c300d937d4703580010c5932ab5b50a75bc437bb620facd8ba676" exitCode=0 Dec 05 22:47:46 crc kubenswrapper[4747]: I1205 22:47:46.293915 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqqx4" event={"ID":"ea88d466-e95c-46ee-8ecd-d144d1513a2f","Type":"ContainerDied","Data":"9ed47735d07c300d937d4703580010c5932ab5b50a75bc437bb620facd8ba676"} Dec 05 22:47:46 crc kubenswrapper[4747]: I1205 22:47:46.294122 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqqx4" event={"ID":"ea88d466-e95c-46ee-8ecd-d144d1513a2f","Type":"ContainerStarted","Data":"a9774cdc5452bfcb761b9c480dfef982d5f4e8478921bd5bc9a1fac13ee7d42e"} Dec 05 22:47:47 crc kubenswrapper[4747]: I1205 22:47:47.304952 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqqx4" event={"ID":"ea88d466-e95c-46ee-8ecd-d144d1513a2f","Type":"ContainerStarted","Data":"24464eed683d86fb524e0d748e4caaa53aceaa2dc4f6017ea16e34327e7da333"} Dec 05 22:47:49 crc kubenswrapper[4747]: I1205 22:47:49.206086 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:49 crc kubenswrapper[4747]: I1205 22:47:49.206524 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:49 crc kubenswrapper[4747]: I1205 22:47:49.254241 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:49 crc kubenswrapper[4747]: 
I1205 22:47:49.329175 4747 generic.go:334] "Generic (PLEG): container finished" podID="ea88d466-e95c-46ee-8ecd-d144d1513a2f" containerID="24464eed683d86fb524e0d748e4caaa53aceaa2dc4f6017ea16e34327e7da333" exitCode=0 Dec 05 22:47:49 crc kubenswrapper[4747]: I1205 22:47:49.329226 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqqx4" event={"ID":"ea88d466-e95c-46ee-8ecd-d144d1513a2f","Type":"ContainerDied","Data":"24464eed683d86fb524e0d748e4caaa53aceaa2dc4f6017ea16e34327e7da333"} Dec 05 22:47:49 crc kubenswrapper[4747]: I1205 22:47:49.388024 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:51 crc kubenswrapper[4747]: I1205 22:47:51.353910 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqqx4" event={"ID":"ea88d466-e95c-46ee-8ecd-d144d1513a2f","Type":"ContainerStarted","Data":"0f93ae351008e1beb0a93c46220ddd7a9a1ad9bffd194c7c3a79b23b999bb9d5"} Dec 05 22:47:51 crc kubenswrapper[4747]: I1205 22:47:51.381924 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hqqx4" podStartSLOduration=3.59543896 podStartE2EDuration="7.381903121s" podCreationTimestamp="2025-12-05 22:47:44 +0000 UTC" firstStartedPulling="2025-12-05 22:47:46.29760562 +0000 UTC m=+7536.764913108" lastFinishedPulling="2025-12-05 22:47:50.084069771 +0000 UTC m=+7540.551377269" observedRunningTime="2025-12-05 22:47:51.37166195 +0000 UTC m=+7541.838969448" watchObservedRunningTime="2025-12-05 22:47:51.381903121 +0000 UTC m=+7541.849210619" Dec 05 22:47:51 crc kubenswrapper[4747]: I1205 22:47:51.438641 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sctps"] Dec 05 22:47:51 crc kubenswrapper[4747]: I1205 22:47:51.438995 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sctps" podUID="239457f7-44db-4b24-a29a-310ddfae68a5" containerName="registry-server" containerID="cri-o://361666aa3758dc6915105db3a2e88e1c05fa901a1c8fd0b0ce8af2114b2a14d6" gracePeriod=2 Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.095340 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.252998 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239457f7-44db-4b24-a29a-310ddfae68a5-catalog-content\") pod \"239457f7-44db-4b24-a29a-310ddfae68a5\" (UID: \"239457f7-44db-4b24-a29a-310ddfae68a5\") " Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.253267 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239457f7-44db-4b24-a29a-310ddfae68a5-utilities\") pod \"239457f7-44db-4b24-a29a-310ddfae68a5\" (UID: \"239457f7-44db-4b24-a29a-310ddfae68a5\") " Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.253371 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zflnc\" (UniqueName: \"kubernetes.io/projected/239457f7-44db-4b24-a29a-310ddfae68a5-kube-api-access-zflnc\") pod \"239457f7-44db-4b24-a29a-310ddfae68a5\" (UID: \"239457f7-44db-4b24-a29a-310ddfae68a5\") " Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.254640 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/239457f7-44db-4b24-a29a-310ddfae68a5-utilities" (OuterVolumeSpecName: "utilities") pod "239457f7-44db-4b24-a29a-310ddfae68a5" (UID: "239457f7-44db-4b24-a29a-310ddfae68a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.259289 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/239457f7-44db-4b24-a29a-310ddfae68a5-kube-api-access-zflnc" (OuterVolumeSpecName: "kube-api-access-zflnc") pod "239457f7-44db-4b24-a29a-310ddfae68a5" (UID: "239457f7-44db-4b24-a29a-310ddfae68a5"). InnerVolumeSpecName "kube-api-access-zflnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.278495 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/239457f7-44db-4b24-a29a-310ddfae68a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "239457f7-44db-4b24-a29a-310ddfae68a5" (UID: "239457f7-44db-4b24-a29a-310ddfae68a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.355852 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239457f7-44db-4b24-a29a-310ddfae68a5-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.355890 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zflnc\" (UniqueName: \"kubernetes.io/projected/239457f7-44db-4b24-a29a-310ddfae68a5-kube-api-access-zflnc\") on node \"crc\" DevicePath \"\"" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.355906 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239457f7-44db-4b24-a29a-310ddfae68a5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.365886 4747 generic.go:334] "Generic (PLEG): container finished" podID="239457f7-44db-4b24-a29a-310ddfae68a5" containerID="361666aa3758dc6915105db3a2e88e1c05fa901a1c8fd0b0ce8af2114b2a14d6" exitCode=0 Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.365926 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sctps" event={"ID":"239457f7-44db-4b24-a29a-310ddfae68a5","Type":"ContainerDied","Data":"361666aa3758dc6915105db3a2e88e1c05fa901a1c8fd0b0ce8af2114b2a14d6"} Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.365958 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sctps" event={"ID":"239457f7-44db-4b24-a29a-310ddfae68a5","Type":"ContainerDied","Data":"367a7630512796bd78d0fbb94add1c923c7d799deecb17f7272ac2d50badc74a"} Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.365975 4747 scope.go:117] "RemoveContainer" containerID="361666aa3758dc6915105db3a2e88e1c05fa901a1c8fd0b0ce8af2114b2a14d6" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.365981 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sctps" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.390413 4747 scope.go:117] "RemoveContainer" containerID="90f696363ea87b86b6edadb1d5906bec7b7ca45f0f645792bd815b20143af7df" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.415388 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sctps"] Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.431572 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sctps"] Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.444007 4747 scope.go:117] "RemoveContainer" containerID="db6dcca0f654af1df2383853870902cb34532d4f63e14f4f1d3ef9bebeee8158" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.481081 4747 scope.go:117] "RemoveContainer" containerID="361666aa3758dc6915105db3a2e88e1c05fa901a1c8fd0b0ce8af2114b2a14d6" Dec 05 22:47:52 crc kubenswrapper[4747]: E1205 22:47:52.481638 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"361666aa3758dc6915105db3a2e88e1c05fa901a1c8fd0b0ce8af2114b2a14d6\": container with ID starting with 361666aa3758dc6915105db3a2e88e1c05fa901a1c8fd0b0ce8af2114b2a14d6 not found: ID does not exist" containerID="361666aa3758dc6915105db3a2e88e1c05fa901a1c8fd0b0ce8af2114b2a14d6" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.481681 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"361666aa3758dc6915105db3a2e88e1c05fa901a1c8fd0b0ce8af2114b2a14d6"} err="failed to get container status \"361666aa3758dc6915105db3a2e88e1c05fa901a1c8fd0b0ce8af2114b2a14d6\": rpc error: code = NotFound desc = could not find container \"361666aa3758dc6915105db3a2e88e1c05fa901a1c8fd0b0ce8af2114b2a14d6\": container with ID starting with 361666aa3758dc6915105db3a2e88e1c05fa901a1c8fd0b0ce8af2114b2a14d6 not found: ID does not exist" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.481713 4747 scope.go:117] "RemoveContainer" containerID="90f696363ea87b86b6edadb1d5906bec7b7ca45f0f645792bd815b20143af7df" Dec 05 22:47:52 crc kubenswrapper[4747]: E1205 22:47:52.482110 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90f696363ea87b86b6edadb1d5906bec7b7ca45f0f645792bd815b20143af7df\": container with ID starting with 90f696363ea87b86b6edadb1d5906bec7b7ca45f0f645792bd815b20143af7df not found: ID does not exist" containerID="90f696363ea87b86b6edadb1d5906bec7b7ca45f0f645792bd815b20143af7df" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.482131 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90f696363ea87b86b6edadb1d5906bec7b7ca45f0f645792bd815b20143af7df"} err="failed to get container status \"90f696363ea87b86b6edadb1d5906bec7b7ca45f0f645792bd815b20143af7df\": rpc error: code = NotFound desc = could not find container \"90f696363ea87b86b6edadb1d5906bec7b7ca45f0f645792bd815b20143af7df\": container with ID starting with 90f696363ea87b86b6edadb1d5906bec7b7ca45f0f645792bd815b20143af7df not found: ID does not exist" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.482144 4747 scope.go:117] "RemoveContainer" containerID="db6dcca0f654af1df2383853870902cb34532d4f63e14f4f1d3ef9bebeee8158" Dec 05 22:47:52 crc kubenswrapper[4747]: E1205 22:47:52.482576 4747 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"db6dcca0f654af1df2383853870902cb34532d4f63e14f4f1d3ef9bebeee8158\": container with ID starting with db6dcca0f654af1df2383853870902cb34532d4f63e14f4f1d3ef9bebeee8158 not found: ID does not exist" containerID="db6dcca0f654af1df2383853870902cb34532d4f63e14f4f1d3ef9bebeee8158" Dec 05 22:47:52 crc kubenswrapper[4747]: I1205 22:47:52.482619 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db6dcca0f654af1df2383853870902cb34532d4f63e14f4f1d3ef9bebeee8158"} err="failed to get container status \"db6dcca0f654af1df2383853870902cb34532d4f63e14f4f1d3ef9bebeee8158\": rpc error: code = NotFound desc = could not find container \"db6dcca0f654af1df2383853870902cb34532d4f63e14f4f1d3ef9bebeee8158\": container with ID starting with db6dcca0f654af1df2383853870902cb34532d4f63e14f4f1d3ef9bebeee8158 not found: ID does not exist" Dec 05 22:47:53 crc kubenswrapper[4747]: I1205 22:47:53.856286 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="239457f7-44db-4b24-a29a-310ddfae68a5" path="/var/lib/kubelet/pods/239457f7-44db-4b24-a29a-310ddfae68a5/volumes" Dec 05 22:47:54 crc kubenswrapper[4747]: I1205 22:47:54.767401 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:47:54 crc kubenswrapper[4747]: I1205 22:47:54.767779 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:47:55 crc kubenswrapper[4747]: I1205 22:47:55.814998 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hqqx4" podUID="ea88d466-e95c-46ee-8ecd-d144d1513a2f" containerName="registry-server" probeResult="failure" output=< Dec 05 22:47:55 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Dec 05 22:47:55 crc kubenswrapper[4747]: > Dec 05 22:47:59 crc kubenswrapper[4747]: I1205 22:47:59.454241 4747 generic.go:334] "Generic (PLEG): container finished" podID="9f5aeb3e-a494-400c-b82b-6ff8c1c4112b" containerID="90b3e27f5048d8389b1c006e41fb0740aa5503ab4969b1ab17930792a03438b9" exitCode=0 Dec 05 22:47:59 crc kubenswrapper[4747]: I1205 22:47:59.454915 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" event={"ID":"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b","Type":"ContainerDied","Data":"90b3e27f5048d8389b1c006e41fb0740aa5503ab4969b1ab17930792a03438b9"} Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.051846 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.096733 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-inventory\") pod \"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b\" (UID: \"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b\") " Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.096911 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-ssh-key\") pod \"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b\" (UID: \"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b\") " Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.097015 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqbdg\" (UniqueName: \"kubernetes.io/projected/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-kube-api-access-wqbdg\") pod \"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b\" (UID: \"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b\") " Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.108730 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-kube-api-access-wqbdg" (OuterVolumeSpecName: "kube-api-access-wqbdg") pod "9f5aeb3e-a494-400c-b82b-6ff8c1c4112b" (UID: "9f5aeb3e-a494-400c-b82b-6ff8c1c4112b"). InnerVolumeSpecName "kube-api-access-wqbdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.148118 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-inventory" (OuterVolumeSpecName: "inventory") pod "9f5aeb3e-a494-400c-b82b-6ff8c1c4112b" (UID: "9f5aeb3e-a494-400c-b82b-6ff8c1c4112b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.157563 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9f5aeb3e-a494-400c-b82b-6ff8c1c4112b" (UID: "9f5aeb3e-a494-400c-b82b-6ff8c1c4112b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.199969 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.200004 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.200017 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqbdg\" (UniqueName: \"kubernetes.io/projected/9f5aeb3e-a494-400c-b82b-6ff8c1c4112b-kube-api-access-wqbdg\") on node \"crc\" DevicePath \"\"" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.484671 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" event={"ID":"9f5aeb3e-a494-400c-b82b-6ff8c1c4112b","Type":"ContainerDied","Data":"4a6edf9a74abe5a5cc1625d0c497fe0054c53b05c08eabd6713178d677b08cd0"} Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.484725 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a6edf9a74abe5a5cc1625d0c497fe0054c53b05c08eabd6713178d677b08cd0" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.484796 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-8jkzz" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.600438 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-szjfh"] Dec 05 22:48:01 crc kubenswrapper[4747]: E1205 22:48:01.600956 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="239457f7-44db-4b24-a29a-310ddfae68a5" containerName="extract-utilities" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.600978 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="239457f7-44db-4b24-a29a-310ddfae68a5" containerName="extract-utilities" Dec 05 22:48:01 crc kubenswrapper[4747]: E1205 22:48:01.601001 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="239457f7-44db-4b24-a29a-310ddfae68a5" containerName="extract-content" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.601009 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="239457f7-44db-4b24-a29a-310ddfae68a5" containerName="extract-content" Dec 05 22:48:01 crc kubenswrapper[4747]: E1205 22:48:01.601029 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5aeb3e-a494-400c-b82b-6ff8c1c4112b" containerName="download-cache-openstack-openstack-cell1" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.601039 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5aeb3e-a494-400c-b82b-6ff8c1c4112b" containerName="download-cache-openstack-openstack-cell1" Dec 05 22:48:01 crc kubenswrapper[4747]: E1205 22:48:01.601069 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="239457f7-44db-4b24-a29a-310ddfae68a5" containerName="registry-server" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.601078 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="239457f7-44db-4b24-a29a-310ddfae68a5" containerName="registry-server" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.601352 4747 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9f5aeb3e-a494-400c-b82b-6ff8c1c4112b" containerName="download-cache-openstack-openstack-cell1" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.601401 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="239457f7-44db-4b24-a29a-310ddfae68a5" containerName="registry-server" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.602288 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-szjfh" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.605914 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.606115 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.606715 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.607120 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.609509 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-ssh-key\") pod \"configure-network-openstack-openstack-cell1-szjfh\" (UID: \"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda\") " pod="openstack/configure-network-openstack-openstack-cell1-szjfh" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.609708 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-inventory\") pod \"configure-network-openstack-openstack-cell1-szjfh\" (UID: \"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda\") " pod="openstack/configure-network-openstack-openstack-cell1-szjfh" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.610205 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4svlt\" (UniqueName: \"kubernetes.io/projected/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-kube-api-access-4svlt\") pod \"configure-network-openstack-openstack-cell1-szjfh\" (UID: \"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda\") " pod="openstack/configure-network-openstack-openstack-cell1-szjfh" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.627887 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-szjfh"] Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.711165 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4svlt\" (UniqueName: \"kubernetes.io/projected/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-kube-api-access-4svlt\") pod \"configure-network-openstack-openstack-cell1-szjfh\" (UID: \"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda\") " pod="openstack/configure-network-openstack-openstack-cell1-szjfh" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.711511 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-ssh-key\") pod \"configure-network-openstack-openstack-cell1-szjfh\" (UID: \"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda\") " 
pod="openstack/configure-network-openstack-openstack-cell1-szjfh" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.711557 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-inventory\") pod \"configure-network-openstack-openstack-cell1-szjfh\" (UID: \"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda\") " pod="openstack/configure-network-openstack-openstack-cell1-szjfh" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.716027 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-ssh-key\") pod \"configure-network-openstack-openstack-cell1-szjfh\" (UID: \"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda\") " pod="openstack/configure-network-openstack-openstack-cell1-szjfh" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.717682 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-inventory\") pod \"configure-network-openstack-openstack-cell1-szjfh\" (UID: \"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda\") " pod="openstack/configure-network-openstack-openstack-cell1-szjfh" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.731201 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4svlt\" (UniqueName: \"kubernetes.io/projected/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-kube-api-access-4svlt\") pod \"configure-network-openstack-openstack-cell1-szjfh\" (UID: \"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda\") " pod="openstack/configure-network-openstack-openstack-cell1-szjfh" Dec 05 22:48:01 crc kubenswrapper[4747]: I1205 22:48:01.929085 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-szjfh" Dec 05 22:48:02 crc kubenswrapper[4747]: W1205 22:48:02.558986 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2dbd3ad_9f97_4cc0_838f_a76f2a8f9dda.slice/crio-fe7835456ccf4f95edcc3083ea7af6f0be62d474c28b54cdeb84c0e11718d677 WatchSource:0}: Error finding container fe7835456ccf4f95edcc3083ea7af6f0be62d474c28b54cdeb84c0e11718d677: Status 404 returned error can't find the container with id fe7835456ccf4f95edcc3083ea7af6f0be62d474c28b54cdeb84c0e11718d677 Dec 05 22:48:02 crc kubenswrapper[4747]: I1205 22:48:02.563001 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-szjfh"] Dec 05 22:48:03 crc kubenswrapper[4747]: I1205 22:48:03.508894 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-szjfh" event={"ID":"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda","Type":"ContainerStarted","Data":"db9472e8a191f4f3826847a9e7ade7e60b3cd81a30b0b258ed9d97eb905e96ed"} Dec 05 22:48:03 crc kubenswrapper[4747]: I1205 22:48:03.509545 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-szjfh" event={"ID":"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda","Type":"ContainerStarted","Data":"fe7835456ccf4f95edcc3083ea7af6f0be62d474c28b54cdeb84c0e11718d677"} Dec 05 22:48:03 crc kubenswrapper[4747]: I1205 22:48:03.539372 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-szjfh" podStartSLOduration=2.125606089 podStartE2EDuration="2.539345585s" podCreationTimestamp="2025-12-05 22:48:01 +0000 UTC" firstStartedPulling="2025-12-05 22:48:02.564479115 +0000 UTC m=+7553.031786623" lastFinishedPulling="2025-12-05 22:48:02.978218631 +0000 UTC m=+7553.445526119" observedRunningTime="2025-12-05 22:48:03.535429039 +0000 UTC m=+7554.002736587" watchObservedRunningTime="2025-12-05 22:48:03.539345585 +0000 UTC m=+7554.006653103" Dec 05 22:48:04 crc kubenswrapper[4747]: I1205 22:48:04.819866 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:48:04 crc kubenswrapper[4747]: I1205 22:48:04.890807 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:48:05 crc kubenswrapper[4747]: I1205 22:48:05.058482 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hqqx4"] Dec 05 22:48:06 crc kubenswrapper[4747]: I1205 22:48:06.221577 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:48:06 crc kubenswrapper[4747]: I1205 22:48:06.221950 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:48:06 crc kubenswrapper[4747]: I1205 22:48:06.546537 4747 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-operators-hqqx4" podUID="ea88d466-e95c-46ee-8ecd-d144d1513a2f" containerName="registry-server" containerID="cri-o://0f93ae351008e1beb0a93c46220ddd7a9a1ad9bffd194c7c3a79b23b999bb9d5" gracePeriod=2 Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.194608 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.268660 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvcgg\" (UniqueName: \"kubernetes.io/projected/ea88d466-e95c-46ee-8ecd-d144d1513a2f-kube-api-access-pvcgg\") pod \"ea88d466-e95c-46ee-8ecd-d144d1513a2f\" (UID: \"ea88d466-e95c-46ee-8ecd-d144d1513a2f\") " Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.269238 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea88d466-e95c-46ee-8ecd-d144d1513a2f-catalog-content\") pod \"ea88d466-e95c-46ee-8ecd-d144d1513a2f\" (UID: \"ea88d466-e95c-46ee-8ecd-d144d1513a2f\") " Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.269475 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea88d466-e95c-46ee-8ecd-d144d1513a2f-utilities\") pod \"ea88d466-e95c-46ee-8ecd-d144d1513a2f\" (UID: \"ea88d466-e95c-46ee-8ecd-d144d1513a2f\") " Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.270338 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea88d466-e95c-46ee-8ecd-d144d1513a2f-utilities" (OuterVolumeSpecName: "utilities") pod "ea88d466-e95c-46ee-8ecd-d144d1513a2f" (UID: "ea88d466-e95c-46ee-8ecd-d144d1513a2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.271228 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea88d466-e95c-46ee-8ecd-d144d1513a2f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.281143 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea88d466-e95c-46ee-8ecd-d144d1513a2f-kube-api-access-pvcgg" (OuterVolumeSpecName: "kube-api-access-pvcgg") pod "ea88d466-e95c-46ee-8ecd-d144d1513a2f" (UID: "ea88d466-e95c-46ee-8ecd-d144d1513a2f"). InnerVolumeSpecName "kube-api-access-pvcgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.373490 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvcgg\" (UniqueName: \"kubernetes.io/projected/ea88d466-e95c-46ee-8ecd-d144d1513a2f-kube-api-access-pvcgg\") on node \"crc\" DevicePath \"\"" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.409018 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea88d466-e95c-46ee-8ecd-d144d1513a2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea88d466-e95c-46ee-8ecd-d144d1513a2f" (UID: "ea88d466-e95c-46ee-8ecd-d144d1513a2f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.475461 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea88d466-e95c-46ee-8ecd-d144d1513a2f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.559475 4747 generic.go:334] "Generic (PLEG): container finished" podID="ea88d466-e95c-46ee-8ecd-d144d1513a2f" containerID="0f93ae351008e1beb0a93c46220ddd7a9a1ad9bffd194c7c3a79b23b999bb9d5" exitCode=0 Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.559541 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqqx4" event={"ID":"ea88d466-e95c-46ee-8ecd-d144d1513a2f","Type":"ContainerDied","Data":"0f93ae351008e1beb0a93c46220ddd7a9a1ad9bffd194c7c3a79b23b999bb9d5"} Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.559632 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqqx4" event={"ID":"ea88d466-e95c-46ee-8ecd-d144d1513a2f","Type":"ContainerDied","Data":"a9774cdc5452bfcb761b9c480dfef982d5f4e8478921bd5bc9a1fac13ee7d42e"} Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.559656 4747 scope.go:117] "RemoveContainer" containerID="0f93ae351008e1beb0a93c46220ddd7a9a1ad9bffd194c7c3a79b23b999bb9d5" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.559559 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqqx4" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.604433 4747 scope.go:117] "RemoveContainer" containerID="24464eed683d86fb524e0d748e4caaa53aceaa2dc4f6017ea16e34327e7da333" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.606567 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hqqx4"] Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.611824 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hqqx4"] Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.650408 4747 scope.go:117] "RemoveContainer" containerID="9ed47735d07c300d937d4703580010c5932ab5b50a75bc437bb620facd8ba676" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.693175 4747 scope.go:117] "RemoveContainer" containerID="0f93ae351008e1beb0a93c46220ddd7a9a1ad9bffd194c7c3a79b23b999bb9d5" Dec 05 22:48:07 crc kubenswrapper[4747]: E1205 22:48:07.693695 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f93ae351008e1beb0a93c46220ddd7a9a1ad9bffd194c7c3a79b23b999bb9d5\": container with ID starting with 0f93ae351008e1beb0a93c46220ddd7a9a1ad9bffd194c7c3a79b23b999bb9d5 not found: ID does not exist" containerID="0f93ae351008e1beb0a93c46220ddd7a9a1ad9bffd194c7c3a79b23b999bb9d5" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.693743 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f93ae351008e1beb0a93c46220ddd7a9a1ad9bffd194c7c3a79b23b999bb9d5"} err="failed to get container status \"0f93ae351008e1beb0a93c46220ddd7a9a1ad9bffd194c7c3a79b23b999bb9d5\": rpc error: code = NotFound desc = could not find container \"0f93ae351008e1beb0a93c46220ddd7a9a1ad9bffd194c7c3a79b23b999bb9d5\": container with ID starting with 0f93ae351008e1beb0a93c46220ddd7a9a1ad9bffd194c7c3a79b23b999bb9d5 not found: ID does not exist" Dec 05 22:48:07 crc 
kubenswrapper[4747]: I1205 22:48:07.693776 4747 scope.go:117] "RemoveContainer" containerID="24464eed683d86fb524e0d748e4caaa53aceaa2dc4f6017ea16e34327e7da333" Dec 05 22:48:07 crc kubenswrapper[4747]: E1205 22:48:07.694075 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24464eed683d86fb524e0d748e4caaa53aceaa2dc4f6017ea16e34327e7da333\": container with ID starting with 24464eed683d86fb524e0d748e4caaa53aceaa2dc4f6017ea16e34327e7da333 not found: ID does not exist" containerID="24464eed683d86fb524e0d748e4caaa53aceaa2dc4f6017ea16e34327e7da333" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.694112 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24464eed683d86fb524e0d748e4caaa53aceaa2dc4f6017ea16e34327e7da333"} err="failed to get container status \"24464eed683d86fb524e0d748e4caaa53aceaa2dc4f6017ea16e34327e7da333\": rpc error: code = NotFound desc = could not find container \"24464eed683d86fb524e0d748e4caaa53aceaa2dc4f6017ea16e34327e7da333\": container with ID starting with 24464eed683d86fb524e0d748e4caaa53aceaa2dc4f6017ea16e34327e7da333 not found: ID does not exist" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.694136 4747 scope.go:117] "RemoveContainer" containerID="9ed47735d07c300d937d4703580010c5932ab5b50a75bc437bb620facd8ba676" Dec 05 22:48:07 crc kubenswrapper[4747]: E1205 22:48:07.694606 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ed47735d07c300d937d4703580010c5932ab5b50a75bc437bb620facd8ba676\": container with ID starting with 9ed47735d07c300d937d4703580010c5932ab5b50a75bc437bb620facd8ba676 not found: ID does not exist" containerID="9ed47735d07c300d937d4703580010c5932ab5b50a75bc437bb620facd8ba676" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.694642 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed47735d07c300d937d4703580010c5932ab5b50a75bc437bb620facd8ba676"} err="failed to get container status \"9ed47735d07c300d937d4703580010c5932ab5b50a75bc437bb620facd8ba676\": rpc error: code = NotFound desc = could not find container \"9ed47735d07c300d937d4703580010c5932ab5b50a75bc437bb620facd8ba676\": container with ID starting with 9ed47735d07c300d937d4703580010c5932ab5b50a75bc437bb620facd8ba676 not found: ID does not exist" Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.850553 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea88d466-e95c-46ee-8ecd-d144d1513a2f" path="/var/lib/kubelet/pods/ea88d466-e95c-46ee-8ecd-d144d1513a2f/volumes" Dec 05 22:48:36 crc kubenswrapper[4747]: I1205 22:48:36.222460 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:48:36 crc kubenswrapper[4747]: I1205 22:48:36.223350 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:48:36 crc kubenswrapper[4747]: I1205 22:48:36.223439 4747 kubelet.go:2542] "SyncLoop (probe)" 
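[Editor's note: the block above is the standard kubelet teardown path for a deleted pod: UnmountVolume -> TearDown -> Volume detached -> PLEG ContainerDied -> RemoveContainer (the NotFound errors are benign, the runtime already deleted the containers) -> orphaned volumes dir cleanup. A minimal sketch for tracing one pod through a dump like this; it assumes only the line shape visible above, and the regex and function names are illustrative, not kubelet source:]

    import re

    # Matches entries of the form seen above, e.g.
    # "Dec 05 22:48:07 crc kubenswrapper[4747]: I1205 22:48:07.268660 4747 reconciler_common.go:159] <message>"
    LINE = re.compile(
        r'^(?P<ts>\w{3} \d{2} \d{2}:\d{2}:\d{2}) crc kubenswrapper\[\d+\]: '
        r'(?P<level>[IWE])\d{4} (?P<wall>[\d:.]+)\s+\d+ '
        r'(?P<src>[\w.]+\.go:\d+)\] (?P<msg>.*)$'
    )

    def pod_events(lines, needle):
        """Yield (wall-clock time, source file, message) for entries mentioning
        needle (a pod UID or pod name)."""
        for line in lines:
            m = LINE.match(line)
            if m and needle in m.group("msg"):
                yield m.group("wall"), m.group("src"), m.group("msg")

    # e.g. trace the redhat-operators-hqqx4 teardown above:
    # with open("kubelet.log") as f:
    #     for wall, src, msg in pod_events(f, "ea88d466-e95c-46ee-8ecd-d144d1513a2f"):
    #         print(wall, src, msg[:120])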
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 22:48:36 crc kubenswrapper[4747]: I1205 22:48:36.224809 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"336bd8beb0853acbee4eef9ccb58a53280fc0729cd7691408511293f6724672a"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 22:48:36 crc kubenswrapper[4747]: I1205 22:48:36.224938 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://336bd8beb0853acbee4eef9ccb58a53280fc0729cd7691408511293f6724672a" gracePeriod=600 Dec 05 22:48:36 crc kubenswrapper[4747]: I1205 22:48:36.908058 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="336bd8beb0853acbee4eef9ccb58a53280fc0729cd7691408511293f6724672a" exitCode=0 Dec 05 22:48:36 crc kubenswrapper[4747]: I1205 22:48:36.908238 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"336bd8beb0853acbee4eef9ccb58a53280fc0729cd7691408511293f6724672a"} Dec 05 22:48:36 crc kubenswrapper[4747]: I1205 22:48:36.908445 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86"} Dec 05 22:48:36 crc kubenswrapper[4747]: I1205 22:48:36.908467 4747 scope.go:117] "RemoveContainer" containerID="336a759484f3a0932e6d435d8870a1e57e8611de36aed7559bcdd6bda95cab57" Dec 05 22:49:23 crc kubenswrapper[4747]: I1205 22:49:23.406209 4747 generic.go:334] "Generic (PLEG): container finished" podID="d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda" containerID="db9472e8a191f4f3826847a9e7ade7e60b3cd81a30b0b258ed9d97eb905e96ed" exitCode=0 Dec 05 22:49:23 crc kubenswrapper[4747]: I1205 22:49:23.406365 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-szjfh" event={"ID":"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda","Type":"ContainerDied","Data":"db9472e8a191f4f3826847a9e7ade7e60b3cd81a30b0b258ed9d97eb905e96ed"} Dec 05 22:49:24 crc kubenswrapper[4747]: I1205 22:49:24.926482 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-szjfh" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.090361 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4svlt\" (UniqueName: \"kubernetes.io/projected/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-kube-api-access-4svlt\") pod \"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda\" (UID: \"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda\") " Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.090488 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-inventory\") pod \"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda\" (UID: \"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda\") " Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.090957 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-ssh-key\") pod \"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda\" (UID: \"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda\") " Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.097353 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-kube-api-access-4svlt" (OuterVolumeSpecName: "kube-api-access-4svlt") pod "d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda" (UID: "d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda"). InnerVolumeSpecName "kube-api-access-4svlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.121969 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-inventory" (OuterVolumeSpecName: "inventory") pod "d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda" (UID: "d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.124158 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda" (UID: "d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.194729 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.194780 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4svlt\" (UniqueName: \"kubernetes.io/projected/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-kube-api-access-4svlt\") on node \"crc\" DevicePath \"\"" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.194802 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.434254 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-szjfh" event={"ID":"d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda","Type":"ContainerDied","Data":"fe7835456ccf4f95edcc3083ea7af6f0be62d474c28b54cdeb84c0e11718d677"} Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.434301 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe7835456ccf4f95edcc3083ea7af6f0be62d474c28b54cdeb84c0e11718d677" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.434338 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-szjfh" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.527820 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-7wxtb"] Dec 05 22:49:25 crc kubenswrapper[4747]: E1205 22:49:25.528229 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda" containerName="configure-network-openstack-openstack-cell1" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.528254 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda" containerName="configure-network-openstack-openstack-cell1" Dec 05 22:49:25 crc kubenswrapper[4747]: E1205 22:49:25.528276 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea88d466-e95c-46ee-8ecd-d144d1513a2f" containerName="extract-utilities" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.528283 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea88d466-e95c-46ee-8ecd-d144d1513a2f" containerName="extract-utilities" Dec 05 22:49:25 crc kubenswrapper[4747]: E1205 22:49:25.528296 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea88d466-e95c-46ee-8ecd-d144d1513a2f" containerName="extract-content" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.528302 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea88d466-e95c-46ee-8ecd-d144d1513a2f" containerName="extract-content" Dec 05 22:49:25 crc kubenswrapper[4747]: E1205 22:49:25.528319 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea88d466-e95c-46ee-8ecd-d144d1513a2f" containerName="registry-server" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.528325 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea88d466-e95c-46ee-8ecd-d144d1513a2f" containerName="registry-server" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.528541 4747 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="ea88d466-e95c-46ee-8ecd-d144d1513a2f" containerName="registry-server" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.528563 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda" containerName="configure-network-openstack-openstack-cell1" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.529342 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.532566 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.532823 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.533050 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.533248 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.541738 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-7wxtb"] Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.603263 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/060c3c6c-211e-4b44-9f26-1e1d24be0f83-ssh-key\") pod \"validate-network-openstack-openstack-cell1-7wxtb\" (UID: \"060c3c6c-211e-4b44-9f26-1e1d24be0f83\") " pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.603334 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj7j7\" (UniqueName: \"kubernetes.io/projected/060c3c6c-211e-4b44-9f26-1e1d24be0f83-kube-api-access-jj7j7\") pod \"validate-network-openstack-openstack-cell1-7wxtb\" (UID: \"060c3c6c-211e-4b44-9f26-1e1d24be0f83\") " pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.603718 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/060c3c6c-211e-4b44-9f26-1e1d24be0f83-inventory\") pod \"validate-network-openstack-openstack-cell1-7wxtb\" (UID: \"060c3c6c-211e-4b44-9f26-1e1d24be0f83\") " pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.705641 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj7j7\" (UniqueName: \"kubernetes.io/projected/060c3c6c-211e-4b44-9f26-1e1d24be0f83-kube-api-access-jj7j7\") pod \"validate-network-openstack-openstack-cell1-7wxtb\" (UID: \"060c3c6c-211e-4b44-9f26-1e1d24be0f83\") " pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.705891 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/060c3c6c-211e-4b44-9f26-1e1d24be0f83-inventory\") pod \"validate-network-openstack-openstack-cell1-7wxtb\" (UID: \"060c3c6c-211e-4b44-9f26-1e1d24be0f83\") " 
pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.706836 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/060c3c6c-211e-4b44-9f26-1e1d24be0f83-ssh-key\") pod \"validate-network-openstack-openstack-cell1-7wxtb\" (UID: \"060c3c6c-211e-4b44-9f26-1e1d24be0f83\") " pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.711562 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/060c3c6c-211e-4b44-9f26-1e1d24be0f83-ssh-key\") pod \"validate-network-openstack-openstack-cell1-7wxtb\" (UID: \"060c3c6c-211e-4b44-9f26-1e1d24be0f83\") " pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.711688 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/060c3c6c-211e-4b44-9f26-1e1d24be0f83-inventory\") pod \"validate-network-openstack-openstack-cell1-7wxtb\" (UID: \"060c3c6c-211e-4b44-9f26-1e1d24be0f83\") " pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.720896 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj7j7\" (UniqueName: \"kubernetes.io/projected/060c3c6c-211e-4b44-9f26-1e1d24be0f83-kube-api-access-jj7j7\") pod \"validate-network-openstack-openstack-cell1-7wxtb\" (UID: \"060c3c6c-211e-4b44-9f26-1e1d24be0f83\") " pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" Dec 05 22:49:25 crc kubenswrapper[4747]: I1205 22:49:25.848867 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" Dec 05 22:49:26 crc kubenswrapper[4747]: I1205 22:49:26.514015 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 22:49:26 crc kubenswrapper[4747]: I1205 22:49:26.514501 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-7wxtb"] Dec 05 22:49:27 crc kubenswrapper[4747]: I1205 22:49:27.458575 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" event={"ID":"060c3c6c-211e-4b44-9f26-1e1d24be0f83","Type":"ContainerStarted","Data":"8a51f9ac4971e7d004a57bb08e50bdf81e00aea770da8149e60decfcbe23d3d5"} Dec 05 22:49:27 crc kubenswrapper[4747]: I1205 22:49:27.458958 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" event={"ID":"060c3c6c-211e-4b44-9f26-1e1d24be0f83","Type":"ContainerStarted","Data":"45a190e195f7906c9fdb3a4b9d376a1bc5711b4dc77adbe55ab5bcdf5ef83679"} Dec 05 22:49:27 crc kubenswrapper[4747]: I1205 22:49:27.488099 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" podStartSLOduration=2.060432528 podStartE2EDuration="2.488069336s" podCreationTimestamp="2025-12-05 22:49:25 +0000 UTC" firstStartedPulling="2025-12-05 22:49:26.513454651 +0000 UTC m=+7636.980762179" lastFinishedPulling="2025-12-05 22:49:26.941091499 +0000 UTC m=+7637.408398987" observedRunningTime="2025-12-05 22:49:27.482558161 +0000 UTC m=+7637.949865649" watchObservedRunningTime="2025-12-05 22:49:27.488069336 +0000 UTC m=+7637.955376854" Dec 05 22:49:32 crc kubenswrapper[4747]: I1205 22:49:32.509383 4747 generic.go:334] "Generic (PLEG): container finished" podID="060c3c6c-211e-4b44-9f26-1e1d24be0f83" containerID="8a51f9ac4971e7d004a57bb08e50bdf81e00aea770da8149e60decfcbe23d3d5" exitCode=0 Dec 05 22:49:32 crc kubenswrapper[4747]: I1205 22:49:32.509465 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" event={"ID":"060c3c6c-211e-4b44-9f26-1e1d24be0f83","Type":"ContainerDied","Data":"8a51f9ac4971e7d004a57bb08e50bdf81e00aea770da8149e60decfcbe23d3d5"} Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.002061 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.127720 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj7j7\" (UniqueName: \"kubernetes.io/projected/060c3c6c-211e-4b44-9f26-1e1d24be0f83-kube-api-access-jj7j7\") pod \"060c3c6c-211e-4b44-9f26-1e1d24be0f83\" (UID: \"060c3c6c-211e-4b44-9f26-1e1d24be0f83\") " Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.127874 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/060c3c6c-211e-4b44-9f26-1e1d24be0f83-inventory\") pod \"060c3c6c-211e-4b44-9f26-1e1d24be0f83\" (UID: \"060c3c6c-211e-4b44-9f26-1e1d24be0f83\") " Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.127915 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/060c3c6c-211e-4b44-9f26-1e1d24be0f83-ssh-key\") pod \"060c3c6c-211e-4b44-9f26-1e1d24be0f83\" (UID: \"060c3c6c-211e-4b44-9f26-1e1d24be0f83\") " Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.138813 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/060c3c6c-211e-4b44-9f26-1e1d24be0f83-kube-api-access-jj7j7" (OuterVolumeSpecName: "kube-api-access-jj7j7") pod "060c3c6c-211e-4b44-9f26-1e1d24be0f83" (UID: "060c3c6c-211e-4b44-9f26-1e1d24be0f83"). InnerVolumeSpecName "kube-api-access-jj7j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.162436 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/060c3c6c-211e-4b44-9f26-1e1d24be0f83-inventory" (OuterVolumeSpecName: "inventory") pod "060c3c6c-211e-4b44-9f26-1e1d24be0f83" (UID: "060c3c6c-211e-4b44-9f26-1e1d24be0f83"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.165062 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/060c3c6c-211e-4b44-9f26-1e1d24be0f83-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "060c3c6c-211e-4b44-9f26-1e1d24be0f83" (UID: "060c3c6c-211e-4b44-9f26-1e1d24be0f83"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.230610 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj7j7\" (UniqueName: \"kubernetes.io/projected/060c3c6c-211e-4b44-9f26-1e1d24be0f83-kube-api-access-jj7j7\") on node \"crc\" DevicePath \"\"" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.230648 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/060c3c6c-211e-4b44-9f26-1e1d24be0f83-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.230660 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/060c3c6c-211e-4b44-9f26-1e1d24be0f83-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.536710 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" event={"ID":"060c3c6c-211e-4b44-9f26-1e1d24be0f83","Type":"ContainerDied","Data":"45a190e195f7906c9fdb3a4b9d376a1bc5711b4dc77adbe55ab5bcdf5ef83679"} Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.537096 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45a190e195f7906c9fdb3a4b9d376a1bc5711b4dc77adbe55ab5bcdf5ef83679" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.536741 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-7wxtb" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.626754 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-xxmz9"] Dec 05 22:49:34 crc kubenswrapper[4747]: E1205 22:49:34.627292 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="060c3c6c-211e-4b44-9f26-1e1d24be0f83" containerName="validate-network-openstack-openstack-cell1" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.627320 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="060c3c6c-211e-4b44-9f26-1e1d24be0f83" containerName="validate-network-openstack-openstack-cell1" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.627604 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="060c3c6c-211e-4b44-9f26-1e1d24be0f83" containerName="validate-network-openstack-openstack-cell1" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.628501 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-xxmz9" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.631457 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.631541 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.632952 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.633349 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.644000 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-xxmz9"] Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.746380 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-ssh-key\") pod \"install-os-openstack-openstack-cell1-xxmz9\" (UID: \"0a44c68b-4954-4588-82a6-0a4a6b2a01c4\") " pod="openstack/install-os-openstack-openstack-cell1-xxmz9" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.746550 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw5lz\" (UniqueName: \"kubernetes.io/projected/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-kube-api-access-nw5lz\") pod \"install-os-openstack-openstack-cell1-xxmz9\" (UID: \"0a44c68b-4954-4588-82a6-0a4a6b2a01c4\") " pod="openstack/install-os-openstack-openstack-cell1-xxmz9" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.746680 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-inventory\") pod \"install-os-openstack-openstack-cell1-xxmz9\" (UID: \"0a44c68b-4954-4588-82a6-0a4a6b2a01c4\") " pod="openstack/install-os-openstack-openstack-cell1-xxmz9" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.848909 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw5lz\" (UniqueName: \"kubernetes.io/projected/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-kube-api-access-nw5lz\") pod \"install-os-openstack-openstack-cell1-xxmz9\" (UID: \"0a44c68b-4954-4588-82a6-0a4a6b2a01c4\") " pod="openstack/install-os-openstack-openstack-cell1-xxmz9" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.849054 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-inventory\") pod \"install-os-openstack-openstack-cell1-xxmz9\" (UID: \"0a44c68b-4954-4588-82a6-0a4a6b2a01c4\") " pod="openstack/install-os-openstack-openstack-cell1-xxmz9" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.849137 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-ssh-key\") pod \"install-os-openstack-openstack-cell1-xxmz9\" (UID: \"0a44c68b-4954-4588-82a6-0a4a6b2a01c4\") " pod="openstack/install-os-openstack-openstack-cell1-xxmz9" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.854480 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-inventory\") pod \"install-os-openstack-openstack-cell1-xxmz9\" (UID: \"0a44c68b-4954-4588-82a6-0a4a6b2a01c4\") " pod="openstack/install-os-openstack-openstack-cell1-xxmz9" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.857029 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-ssh-key\") pod \"install-os-openstack-openstack-cell1-xxmz9\" (UID: \"0a44c68b-4954-4588-82a6-0a4a6b2a01c4\") " pod="openstack/install-os-openstack-openstack-cell1-xxmz9" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.877444 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw5lz\" (UniqueName: \"kubernetes.io/projected/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-kube-api-access-nw5lz\") pod \"install-os-openstack-openstack-cell1-xxmz9\" (UID: \"0a44c68b-4954-4588-82a6-0a4a6b2a01c4\") " pod="openstack/install-os-openstack-openstack-cell1-xxmz9" Dec 05 22:49:34 crc kubenswrapper[4747]: I1205 22:49:34.952143 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-xxmz9" Dec 05 22:49:35 crc kubenswrapper[4747]: I1205 22:49:35.504823 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-xxmz9"] Dec 05 22:49:35 crc kubenswrapper[4747]: I1205 22:49:35.550198 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-xxmz9" event={"ID":"0a44c68b-4954-4588-82a6-0a4a6b2a01c4","Type":"ContainerStarted","Data":"9a59a0ef60945a7c2086fe00852d2e65d19e9ef66994f09b2dd1d7b4715ea989"} Dec 05 22:49:37 crc kubenswrapper[4747]: I1205 22:49:37.577531 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-xxmz9" event={"ID":"0a44c68b-4954-4588-82a6-0a4a6b2a01c4","Type":"ContainerStarted","Data":"62866ed816b8f2c20e39e676de60fb529c92f89fe2a055d2e2dc4debe0675278"} Dec 05 22:49:37 crc kubenswrapper[4747]: I1205 22:49:37.602120 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-xxmz9" podStartSLOduration=3.060353851 podStartE2EDuration="3.602101949s" podCreationTimestamp="2025-12-05 22:49:34 +0000 UTC" firstStartedPulling="2025-12-05 22:49:35.511888248 +0000 UTC m=+7645.979195746" lastFinishedPulling="2025-12-05 22:49:36.053636366 +0000 UTC m=+7646.520943844" observedRunningTime="2025-12-05 22:49:37.598286077 +0000 UTC m=+7648.065593605" watchObservedRunningTime="2025-12-05 22:49:37.602101949 +0000 UTC m=+7648.069409457" Dec 05 22:50:21 crc kubenswrapper[4747]: I1205 22:50:21.055248 4747 generic.go:334] "Generic (PLEG): container finished" podID="0a44c68b-4954-4588-82a6-0a4a6b2a01c4" containerID="62866ed816b8f2c20e39e676de60fb529c92f89fe2a055d2e2dc4debe0675278" exitCode=0 Dec 05 22:50:21 crc kubenswrapper[4747]: I1205 22:50:21.055314 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-xxmz9" event={"ID":"0a44c68b-4954-4588-82a6-0a4a6b2a01c4","Type":"ContainerDied","Data":"62866ed816b8f2c20e39e676de60fb529c92f89fe2a055d2e2dc4debe0675278"} Dec 05 22:50:22 crc kubenswrapper[4747]: I1205 22:50:22.553725 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-xxmz9" Dec 05 22:50:22 crc kubenswrapper[4747]: I1205 22:50:22.687643 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw5lz\" (UniqueName: \"kubernetes.io/projected/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-kube-api-access-nw5lz\") pod \"0a44c68b-4954-4588-82a6-0a4a6b2a01c4\" (UID: \"0a44c68b-4954-4588-82a6-0a4a6b2a01c4\") " Dec 05 22:50:22 crc kubenswrapper[4747]: I1205 22:50:22.687894 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-inventory\") pod \"0a44c68b-4954-4588-82a6-0a4a6b2a01c4\" (UID: \"0a44c68b-4954-4588-82a6-0a4a6b2a01c4\") " Dec 05 22:50:22 crc kubenswrapper[4747]: I1205 22:50:22.687955 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-ssh-key\") pod \"0a44c68b-4954-4588-82a6-0a4a6b2a01c4\" (UID: \"0a44c68b-4954-4588-82a6-0a4a6b2a01c4\") " Dec 05 22:50:22 crc kubenswrapper[4747]: I1205 22:50:22.702065 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-kube-api-access-nw5lz" (OuterVolumeSpecName: "kube-api-access-nw5lz") pod "0a44c68b-4954-4588-82a6-0a4a6b2a01c4" (UID: "0a44c68b-4954-4588-82a6-0a4a6b2a01c4"). InnerVolumeSpecName "kube-api-access-nw5lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:50:22 crc kubenswrapper[4747]: I1205 22:50:22.734770 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0a44c68b-4954-4588-82a6-0a4a6b2a01c4" (UID: "0a44c68b-4954-4588-82a6-0a4a6b2a01c4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:50:22 crc kubenswrapper[4747]: I1205 22:50:22.741771 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-inventory" (OuterVolumeSpecName: "inventory") pod "0a44c68b-4954-4588-82a6-0a4a6b2a01c4" (UID: "0a44c68b-4954-4588-82a6-0a4a6b2a01c4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:50:22 crc kubenswrapper[4747]: I1205 22:50:22.790745 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw5lz\" (UniqueName: \"kubernetes.io/projected/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-kube-api-access-nw5lz\") on node \"crc\" DevicePath \"\"" Dec 05 22:50:22 crc kubenswrapper[4747]: I1205 22:50:22.790793 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 22:50:22 crc kubenswrapper[4747]: I1205 22:50:22.790806 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a44c68b-4954-4588-82a6-0a4a6b2a01c4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.081840 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-xxmz9" event={"ID":"0a44c68b-4954-4588-82a6-0a4a6b2a01c4","Type":"ContainerDied","Data":"9a59a0ef60945a7c2086fe00852d2e65d19e9ef66994f09b2dd1d7b4715ea989"} Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.081885 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a59a0ef60945a7c2086fe00852d2e65d19e9ef66994f09b2dd1d7b4715ea989" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.081939 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-xxmz9" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.192008 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-67hcg"] Dec 05 22:50:23 crc kubenswrapper[4747]: E1205 22:50:23.192413 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a44c68b-4954-4588-82a6-0a4a6b2a01c4" containerName="install-os-openstack-openstack-cell1" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.192431 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a44c68b-4954-4588-82a6-0a4a6b2a01c4" containerName="install-os-openstack-openstack-cell1" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.192660 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a44c68b-4954-4588-82a6-0a4a6b2a01c4" containerName="install-os-openstack-openstack-cell1" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.193357 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-67hcg" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.197609 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.198001 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.200070 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.202285 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.224638 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-67hcg"] Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.302819 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/deb44e95-83e2-4cf3-8157-97ad96b784d0-ssh-key\") pod \"configure-os-openstack-openstack-cell1-67hcg\" (UID: \"deb44e95-83e2-4cf3-8157-97ad96b784d0\") " pod="openstack/configure-os-openstack-openstack-cell1-67hcg" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.302891 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/deb44e95-83e2-4cf3-8157-97ad96b784d0-inventory\") pod \"configure-os-openstack-openstack-cell1-67hcg\" (UID: \"deb44e95-83e2-4cf3-8157-97ad96b784d0\") " pod="openstack/configure-os-openstack-openstack-cell1-67hcg" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.302964 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxh99\" (UniqueName: \"kubernetes.io/projected/deb44e95-83e2-4cf3-8157-97ad96b784d0-kube-api-access-jxh99\") pod \"configure-os-openstack-openstack-cell1-67hcg\" (UID: \"deb44e95-83e2-4cf3-8157-97ad96b784d0\") " pod="openstack/configure-os-openstack-openstack-cell1-67hcg" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.404885 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/deb44e95-83e2-4cf3-8157-97ad96b784d0-ssh-key\") pod \"configure-os-openstack-openstack-cell1-67hcg\" (UID: \"deb44e95-83e2-4cf3-8157-97ad96b784d0\") " pod="openstack/configure-os-openstack-openstack-cell1-67hcg" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.404963 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/deb44e95-83e2-4cf3-8157-97ad96b784d0-inventory\") pod \"configure-os-openstack-openstack-cell1-67hcg\" (UID: \"deb44e95-83e2-4cf3-8157-97ad96b784d0\") " pod="openstack/configure-os-openstack-openstack-cell1-67hcg" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.405029 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxh99\" (UniqueName: \"kubernetes.io/projected/deb44e95-83e2-4cf3-8157-97ad96b784d0-kube-api-access-jxh99\") pod \"configure-os-openstack-openstack-cell1-67hcg\" (UID: \"deb44e95-83e2-4cf3-8157-97ad96b784d0\") " pod="openstack/configure-os-openstack-openstack-cell1-67hcg" Dec 05 22:50:23 crc kubenswrapper[4747]: 
I1205 22:50:23.410912 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/deb44e95-83e2-4cf3-8157-97ad96b784d0-inventory\") pod \"configure-os-openstack-openstack-cell1-67hcg\" (UID: \"deb44e95-83e2-4cf3-8157-97ad96b784d0\") " pod="openstack/configure-os-openstack-openstack-cell1-67hcg" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.415317 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/deb44e95-83e2-4cf3-8157-97ad96b784d0-ssh-key\") pod \"configure-os-openstack-openstack-cell1-67hcg\" (UID: \"deb44e95-83e2-4cf3-8157-97ad96b784d0\") " pod="openstack/configure-os-openstack-openstack-cell1-67hcg" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.428068 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxh99\" (UniqueName: \"kubernetes.io/projected/deb44e95-83e2-4cf3-8157-97ad96b784d0-kube-api-access-jxh99\") pod \"configure-os-openstack-openstack-cell1-67hcg\" (UID: \"deb44e95-83e2-4cf3-8157-97ad96b784d0\") " pod="openstack/configure-os-openstack-openstack-cell1-67hcg" Dec 05 22:50:23 crc kubenswrapper[4747]: I1205 22:50:23.513932 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-67hcg" Dec 05 22:50:24 crc kubenswrapper[4747]: I1205 22:50:24.030110 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-67hcg"] Dec 05 22:50:24 crc kubenswrapper[4747]: W1205 22:50:24.032489 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeb44e95_83e2_4cf3_8157_97ad96b784d0.slice/crio-ef38aa5ce7c7ec1738c8e95b3e23e8070ec7787ce285ebef5338fee5d90b5da7 WatchSource:0}: Error finding container ef38aa5ce7c7ec1738c8e95b3e23e8070ec7787ce285ebef5338fee5d90b5da7: Status 404 returned error can't find the container with id ef38aa5ce7c7ec1738c8e95b3e23e8070ec7787ce285ebef5338fee5d90b5da7 Dec 05 22:50:24 crc kubenswrapper[4747]: I1205 22:50:24.091391 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-67hcg" event={"ID":"deb44e95-83e2-4cf3-8157-97ad96b784d0","Type":"ContainerStarted","Data":"ef38aa5ce7c7ec1738c8e95b3e23e8070ec7787ce285ebef5338fee5d90b5da7"} Dec 05 22:50:25 crc kubenswrapper[4747]: I1205 22:50:25.103971 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-67hcg" event={"ID":"deb44e95-83e2-4cf3-8157-97ad96b784d0","Type":"ContainerStarted","Data":"f957446bcc142de22632acb2228be1f5e1852674e85cdd83c2ace31b6bd20195"} Dec 05 22:50:25 crc kubenswrapper[4747]: I1205 22:50:25.129330 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-67hcg" podStartSLOduration=1.586231598 podStartE2EDuration="2.129308966s" podCreationTimestamp="2025-12-05 22:50:23 +0000 UTC" firstStartedPulling="2025-12-05 22:50:24.035352751 +0000 UTC m=+7694.502660259" lastFinishedPulling="2025-12-05 22:50:24.578430139 +0000 UTC m=+7695.045737627" observedRunningTime="2025-12-05 22:50:25.119221102 +0000 UTC m=+7695.586528650" watchObservedRunningTime="2025-12-05 22:50:25.129308966 +0000 UTC m=+7695.596616454" Dec 05 22:50:36 crc kubenswrapper[4747]: I1205 22:50:36.221992 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:50:36 crc kubenswrapper[4747]: I1205 22:50:36.222576 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:50:44 crc kubenswrapper[4747]: I1205 22:50:44.055257 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4hhlz"] Dec 05 22:50:44 crc kubenswrapper[4747]: I1205 22:50:44.058514 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:44 crc kubenswrapper[4747]: I1205 22:50:44.072353 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hhlz"] Dec 05 22:50:44 crc kubenswrapper[4747]: I1205 22:50:44.093937 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf3e964-783e-4918-8d50-54a896c1eb86-utilities\") pod \"certified-operators-4hhlz\" (UID: \"dcf3e964-783e-4918-8d50-54a896c1eb86\") " pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:44 crc kubenswrapper[4747]: I1205 22:50:44.094040 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p6kw\" (UniqueName: \"kubernetes.io/projected/dcf3e964-783e-4918-8d50-54a896c1eb86-kube-api-access-4p6kw\") pod \"certified-operators-4hhlz\" (UID: \"dcf3e964-783e-4918-8d50-54a896c1eb86\") " pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:44 crc kubenswrapper[4747]: I1205 22:50:44.094364 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf3e964-783e-4918-8d50-54a896c1eb86-catalog-content\") pod \"certified-operators-4hhlz\" (UID: \"dcf3e964-783e-4918-8d50-54a896c1eb86\") " pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:44 crc kubenswrapper[4747]: I1205 22:50:44.196506 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf3e964-783e-4918-8d50-54a896c1eb86-catalog-content\") pod \"certified-operators-4hhlz\" (UID: \"dcf3e964-783e-4918-8d50-54a896c1eb86\") " pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:44 crc kubenswrapper[4747]: I1205 22:50:44.197002 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf3e964-783e-4918-8d50-54a896c1eb86-utilities\") pod \"certified-operators-4hhlz\" (UID: \"dcf3e964-783e-4918-8d50-54a896c1eb86\") " pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:44 crc kubenswrapper[4747]: I1205 22:50:44.197057 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p6kw\" (UniqueName: \"kubernetes.io/projected/dcf3e964-783e-4918-8d50-54a896c1eb86-kube-api-access-4p6kw\") pod \"certified-operators-4hhlz\" (UID: \"dcf3e964-783e-4918-8d50-54a896c1eb86\") " 
pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:44 crc kubenswrapper[4747]: I1205 22:50:44.197249 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf3e964-783e-4918-8d50-54a896c1eb86-catalog-content\") pod \"certified-operators-4hhlz\" (UID: \"dcf3e964-783e-4918-8d50-54a896c1eb86\") " pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:44 crc kubenswrapper[4747]: I1205 22:50:44.197432 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf3e964-783e-4918-8d50-54a896c1eb86-utilities\") pod \"certified-operators-4hhlz\" (UID: \"dcf3e964-783e-4918-8d50-54a896c1eb86\") " pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:44 crc kubenswrapper[4747]: I1205 22:50:44.216738 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p6kw\" (UniqueName: \"kubernetes.io/projected/dcf3e964-783e-4918-8d50-54a896c1eb86-kube-api-access-4p6kw\") pod \"certified-operators-4hhlz\" (UID: \"dcf3e964-783e-4918-8d50-54a896c1eb86\") " pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:44 crc kubenswrapper[4747]: I1205 22:50:44.388747 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:44 crc kubenswrapper[4747]: I1205 22:50:44.971404 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hhlz"] Dec 05 22:50:44 crc kubenswrapper[4747]: W1205 22:50:44.976632 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcf3e964_783e_4918_8d50_54a896c1eb86.slice/crio-5fdcdd027805f23a4f323d7f62f4f13462edca9779967a827403b35247b76dad WatchSource:0}: Error finding container 5fdcdd027805f23a4f323d7f62f4f13462edca9779967a827403b35247b76dad: Status 404 returned error can't find the container with id 5fdcdd027805f23a4f323d7f62f4f13462edca9779967a827403b35247b76dad Dec 05 22:50:45 crc kubenswrapper[4747]: I1205 22:50:45.340896 4747 generic.go:334] "Generic (PLEG): container finished" podID="dcf3e964-783e-4918-8d50-54a896c1eb86" containerID="ad953ea048ec2b2126ea78cc592817b2e534e6d320274a022381b6930e0635c7" exitCode=0 Dec 05 22:50:45 crc kubenswrapper[4747]: I1205 22:50:45.340945 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hhlz" event={"ID":"dcf3e964-783e-4918-8d50-54a896c1eb86","Type":"ContainerDied","Data":"ad953ea048ec2b2126ea78cc592817b2e534e6d320274a022381b6930e0635c7"} Dec 05 22:50:45 crc kubenswrapper[4747]: I1205 22:50:45.341182 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hhlz" event={"ID":"dcf3e964-783e-4918-8d50-54a896c1eb86","Type":"ContainerStarted","Data":"5fdcdd027805f23a4f323d7f62f4f13462edca9779967a827403b35247b76dad"} Dec 05 22:50:46 crc kubenswrapper[4747]: I1205 22:50:46.360650 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hhlz" event={"ID":"dcf3e964-783e-4918-8d50-54a896c1eb86","Type":"ContainerStarted","Data":"d943d5570cb1f634c3514caf2ddeefcfa4f1dd9cfd0d8b577f2fc385a557b479"} Dec 05 22:50:47 crc kubenswrapper[4747]: I1205 22:50:47.380495 4747 generic.go:334] "Generic (PLEG): container finished" podID="dcf3e964-783e-4918-8d50-54a896c1eb86" 
containerID="d943d5570cb1f634c3514caf2ddeefcfa4f1dd9cfd0d8b577f2fc385a557b479" exitCode=0 Dec 05 22:50:47 crc kubenswrapper[4747]: I1205 22:50:47.380558 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hhlz" event={"ID":"dcf3e964-783e-4918-8d50-54a896c1eb86","Type":"ContainerDied","Data":"d943d5570cb1f634c3514caf2ddeefcfa4f1dd9cfd0d8b577f2fc385a557b479"} Dec 05 22:50:48 crc kubenswrapper[4747]: I1205 22:50:48.392867 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hhlz" event={"ID":"dcf3e964-783e-4918-8d50-54a896c1eb86","Type":"ContainerStarted","Data":"e89ce65890bef5b48387b8b029edf66c27acd1c8b466786a06c541a537dc66c0"} Dec 05 22:50:48 crc kubenswrapper[4747]: I1205 22:50:48.422402 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4hhlz" podStartSLOduration=1.983071469 podStartE2EDuration="4.422376158s" podCreationTimestamp="2025-12-05 22:50:44 +0000 UTC" firstStartedPulling="2025-12-05 22:50:45.342816434 +0000 UTC m=+7715.810123932" lastFinishedPulling="2025-12-05 22:50:47.782121143 +0000 UTC m=+7718.249428621" observedRunningTime="2025-12-05 22:50:48.412670234 +0000 UTC m=+7718.879977742" watchObservedRunningTime="2025-12-05 22:50:48.422376158 +0000 UTC m=+7718.889683676" Dec 05 22:50:54 crc kubenswrapper[4747]: I1205 22:50:54.389525 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:54 crc kubenswrapper[4747]: I1205 22:50:54.390241 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:54 crc kubenswrapper[4747]: I1205 22:50:54.442886 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:54 crc kubenswrapper[4747]: I1205 22:50:54.520646 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.045506 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hhlz"] Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.046463 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4hhlz" podUID="dcf3e964-783e-4918-8d50-54a896c1eb86" containerName="registry-server" containerID="cri-o://e89ce65890bef5b48387b8b029edf66c27acd1c8b466786a06c541a537dc66c0" gracePeriod=2 Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.511430 4747 generic.go:334] "Generic (PLEG): container finished" podID="dcf3e964-783e-4918-8d50-54a896c1eb86" containerID="e89ce65890bef5b48387b8b029edf66c27acd1c8b466786a06c541a537dc66c0" exitCode=0 Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.511630 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hhlz" event={"ID":"dcf3e964-783e-4918-8d50-54a896c1eb86","Type":"ContainerDied","Data":"e89ce65890bef5b48387b8b029edf66c27acd1c8b466786a06c541a537dc66c0"} Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.511990 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hhlz" 
event={"ID":"dcf3e964-783e-4918-8d50-54a896c1eb86","Type":"ContainerDied","Data":"5fdcdd027805f23a4f323d7f62f4f13462edca9779967a827403b35247b76dad"} Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.512016 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fdcdd027805f23a4f323d7f62f4f13462edca9779967a827403b35247b76dad" Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.568267 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.649249 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p6kw\" (UniqueName: \"kubernetes.io/projected/dcf3e964-783e-4918-8d50-54a896c1eb86-kube-api-access-4p6kw\") pod \"dcf3e964-783e-4918-8d50-54a896c1eb86\" (UID: \"dcf3e964-783e-4918-8d50-54a896c1eb86\") " Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.649513 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf3e964-783e-4918-8d50-54a896c1eb86-utilities\") pod \"dcf3e964-783e-4918-8d50-54a896c1eb86\" (UID: \"dcf3e964-783e-4918-8d50-54a896c1eb86\") " Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.651243 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcf3e964-783e-4918-8d50-54a896c1eb86-utilities" (OuterVolumeSpecName: "utilities") pod "dcf3e964-783e-4918-8d50-54a896c1eb86" (UID: "dcf3e964-783e-4918-8d50-54a896c1eb86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.655302 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf3e964-783e-4918-8d50-54a896c1eb86-kube-api-access-4p6kw" (OuterVolumeSpecName: "kube-api-access-4p6kw") pod "dcf3e964-783e-4918-8d50-54a896c1eb86" (UID: "dcf3e964-783e-4918-8d50-54a896c1eb86"). InnerVolumeSpecName "kube-api-access-4p6kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.751782 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf3e964-783e-4918-8d50-54a896c1eb86-catalog-content\") pod \"dcf3e964-783e-4918-8d50-54a896c1eb86\" (UID: \"dcf3e964-783e-4918-8d50-54a896c1eb86\") " Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.753163 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p6kw\" (UniqueName: \"kubernetes.io/projected/dcf3e964-783e-4918-8d50-54a896c1eb86-kube-api-access-4p6kw\") on node \"crc\" DevicePath \"\"" Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.753200 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf3e964-783e-4918-8d50-54a896c1eb86-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.813510 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcf3e964-783e-4918-8d50-54a896c1eb86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcf3e964-783e-4918-8d50-54a896c1eb86" (UID: "dcf3e964-783e-4918-8d50-54a896c1eb86"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:50:58 crc kubenswrapper[4747]: I1205 22:50:58.854307 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf3e964-783e-4918-8d50-54a896c1eb86-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:50:59 crc kubenswrapper[4747]: I1205 22:50:59.524138 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hhlz" Dec 05 22:50:59 crc kubenswrapper[4747]: I1205 22:50:59.568241 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hhlz"] Dec 05 22:50:59 crc kubenswrapper[4747]: I1205 22:50:59.580934 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4hhlz"] Dec 05 22:50:59 crc kubenswrapper[4747]: I1205 22:50:59.872842 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcf3e964-783e-4918-8d50-54a896c1eb86" path="/var/lib/kubelet/pods/dcf3e964-783e-4918-8d50-54a896c1eb86/volumes" Dec 05 22:51:06 crc kubenswrapper[4747]: I1205 22:51:06.221816 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:51:06 crc kubenswrapper[4747]: I1205 22:51:06.222542 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:51:09 crc kubenswrapper[4747]: I1205 22:51:09.635742 4747 generic.go:334] "Generic (PLEG): container finished" podID="deb44e95-83e2-4cf3-8157-97ad96b784d0" containerID="f957446bcc142de22632acb2228be1f5e1852674e85cdd83c2ace31b6bd20195" exitCode=0 Dec 05 22:51:09 crc kubenswrapper[4747]: I1205 22:51:09.636251 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-67hcg" event={"ID":"deb44e95-83e2-4cf3-8157-97ad96b784d0","Type":"ContainerDied","Data":"f957446bcc142de22632acb2228be1f5e1852674e85cdd83c2ace31b6bd20195"} Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.153561 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-67hcg" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.273883 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/deb44e95-83e2-4cf3-8157-97ad96b784d0-inventory\") pod \"deb44e95-83e2-4cf3-8157-97ad96b784d0\" (UID: \"deb44e95-83e2-4cf3-8157-97ad96b784d0\") " Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.274089 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/deb44e95-83e2-4cf3-8157-97ad96b784d0-ssh-key\") pod \"deb44e95-83e2-4cf3-8157-97ad96b784d0\" (UID: \"deb44e95-83e2-4cf3-8157-97ad96b784d0\") " Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.274254 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxh99\" (UniqueName: \"kubernetes.io/projected/deb44e95-83e2-4cf3-8157-97ad96b784d0-kube-api-access-jxh99\") pod \"deb44e95-83e2-4cf3-8157-97ad96b784d0\" (UID: \"deb44e95-83e2-4cf3-8157-97ad96b784d0\") " Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.279873 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb44e95-83e2-4cf3-8157-97ad96b784d0-kube-api-access-jxh99" (OuterVolumeSpecName: "kube-api-access-jxh99") pod "deb44e95-83e2-4cf3-8157-97ad96b784d0" (UID: "deb44e95-83e2-4cf3-8157-97ad96b784d0"). InnerVolumeSpecName "kube-api-access-jxh99". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.306719 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb44e95-83e2-4cf3-8157-97ad96b784d0-inventory" (OuterVolumeSpecName: "inventory") pod "deb44e95-83e2-4cf3-8157-97ad96b784d0" (UID: "deb44e95-83e2-4cf3-8157-97ad96b784d0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.333837 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb44e95-83e2-4cf3-8157-97ad96b784d0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "deb44e95-83e2-4cf3-8157-97ad96b784d0" (UID: "deb44e95-83e2-4cf3-8157-97ad96b784d0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.376684 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/deb44e95-83e2-4cf3-8157-97ad96b784d0-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.376716 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/deb44e95-83e2-4cf3-8157-97ad96b784d0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.376733 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxh99\" (UniqueName: \"kubernetes.io/projected/deb44e95-83e2-4cf3-8157-97ad96b784d0-kube-api-access-jxh99\") on node \"crc\" DevicePath \"\"" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.662524 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-67hcg" event={"ID":"deb44e95-83e2-4cf3-8157-97ad96b784d0","Type":"ContainerDied","Data":"ef38aa5ce7c7ec1738c8e95b3e23e8070ec7787ce285ebef5338fee5d90b5da7"} Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.662970 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef38aa5ce7c7ec1738c8e95b3e23e8070ec7787ce285ebef5338fee5d90b5da7" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.662652 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-67hcg" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.771431 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-m97hr"] Dec 05 22:51:11 crc kubenswrapper[4747]: E1205 22:51:11.772218 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf3e964-783e-4918-8d50-54a896c1eb86" containerName="extract-utilities" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.772323 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf3e964-783e-4918-8d50-54a896c1eb86" containerName="extract-utilities" Dec 05 22:51:11 crc kubenswrapper[4747]: E1205 22:51:11.772417 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf3e964-783e-4918-8d50-54a896c1eb86" containerName="extract-content" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.772498 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf3e964-783e-4918-8d50-54a896c1eb86" containerName="extract-content" Dec 05 22:51:11 crc kubenswrapper[4747]: E1205 22:51:11.772618 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf3e964-783e-4918-8d50-54a896c1eb86" containerName="registry-server" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.772692 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf3e964-783e-4918-8d50-54a896c1eb86" containerName="registry-server" Dec 05 22:51:11 crc kubenswrapper[4747]: E1205 22:51:11.772789 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb44e95-83e2-4cf3-8157-97ad96b784d0" containerName="configure-os-openstack-openstack-cell1" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.772868 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb44e95-83e2-4cf3-8157-97ad96b784d0" containerName="configure-os-openstack-openstack-cell1" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.773187 4747 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="deb44e95-83e2-4cf3-8157-97ad96b784d0" containerName="configure-os-openstack-openstack-cell1" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.773292 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf3e964-783e-4918-8d50-54a896c1eb86" containerName="registry-server" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.774215 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-m97hr" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.783461 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-m97hr"] Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.812250 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.812423 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.812544 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.812691 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.813983 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/fa8817f9-6605-4396-94a2-4a2106fa6724-inventory-0\") pod \"ssh-known-hosts-openstack-m97hr\" (UID: \"fa8817f9-6605-4396-94a2-4a2106fa6724\") " pod="openstack/ssh-known-hosts-openstack-m97hr" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.814049 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49296\" (UniqueName: \"kubernetes.io/projected/fa8817f9-6605-4396-94a2-4a2106fa6724-kube-api-access-49296\") pod \"ssh-known-hosts-openstack-m97hr\" (UID: \"fa8817f9-6605-4396-94a2-4a2106fa6724\") " pod="openstack/ssh-known-hosts-openstack-m97hr" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.814102 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fa8817f9-6605-4396-94a2-4a2106fa6724-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-m97hr\" (UID: \"fa8817f9-6605-4396-94a2-4a2106fa6724\") " pod="openstack/ssh-known-hosts-openstack-m97hr" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.915897 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49296\" (UniqueName: \"kubernetes.io/projected/fa8817f9-6605-4396-94a2-4a2106fa6724-kube-api-access-49296\") pod \"ssh-known-hosts-openstack-m97hr\" (UID: \"fa8817f9-6605-4396-94a2-4a2106fa6724\") " pod="openstack/ssh-known-hosts-openstack-m97hr" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.915999 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fa8817f9-6605-4396-94a2-4a2106fa6724-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-m97hr\" (UID: \"fa8817f9-6605-4396-94a2-4a2106fa6724\") " pod="openstack/ssh-known-hosts-openstack-m97hr" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.916213 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/fa8817f9-6605-4396-94a2-4a2106fa6724-inventory-0\") pod \"ssh-known-hosts-openstack-m97hr\" (UID: \"fa8817f9-6605-4396-94a2-4a2106fa6724\") " pod="openstack/ssh-known-hosts-openstack-m97hr" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.920685 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/fa8817f9-6605-4396-94a2-4a2106fa6724-inventory-0\") pod \"ssh-known-hosts-openstack-m97hr\" (UID: \"fa8817f9-6605-4396-94a2-4a2106fa6724\") " pod="openstack/ssh-known-hosts-openstack-m97hr" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.920686 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fa8817f9-6605-4396-94a2-4a2106fa6724-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-m97hr\" (UID: \"fa8817f9-6605-4396-94a2-4a2106fa6724\") " pod="openstack/ssh-known-hosts-openstack-m97hr" Dec 05 22:51:11 crc kubenswrapper[4747]: I1205 22:51:11.932551 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49296\" (UniqueName: \"kubernetes.io/projected/fa8817f9-6605-4396-94a2-4a2106fa6724-kube-api-access-49296\") pod \"ssh-known-hosts-openstack-m97hr\" (UID: \"fa8817f9-6605-4396-94a2-4a2106fa6724\") " pod="openstack/ssh-known-hosts-openstack-m97hr" Dec 05 22:51:12 crc kubenswrapper[4747]: I1205 22:51:12.134428 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-m97hr" Dec 05 22:51:12 crc kubenswrapper[4747]: I1205 22:51:12.769718 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-m97hr"] Dec 05 22:51:13 crc kubenswrapper[4747]: I1205 22:51:13.691431 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-m97hr" event={"ID":"fa8817f9-6605-4396-94a2-4a2106fa6724","Type":"ContainerStarted","Data":"451b844379a6d31dee1254887cc756a2ad9f9cfef172f5fbcb72d039efc956c0"} Dec 05 22:51:13 crc kubenswrapper[4747]: I1205 22:51:13.692015 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-m97hr" event={"ID":"fa8817f9-6605-4396-94a2-4a2106fa6724","Type":"ContainerStarted","Data":"d58d0a9e0ec089ddf825f441add59aa7c4fa2cf63ca080e172bb4ee6116f3b57"} Dec 05 22:51:13 crc kubenswrapper[4747]: I1205 22:51:13.717122 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-m97hr" podStartSLOduration=2.294952291 podStartE2EDuration="2.717102148s" podCreationTimestamp="2025-12-05 22:51:11 +0000 UTC" firstStartedPulling="2025-12-05 22:51:12.773183118 +0000 UTC m=+7743.240490606" lastFinishedPulling="2025-12-05 22:51:13.195332935 +0000 UTC m=+7743.662640463" observedRunningTime="2025-12-05 22:51:13.71554499 +0000 UTC m=+7744.182852498" watchObservedRunningTime="2025-12-05 22:51:13.717102148 +0000 UTC m=+7744.184409646" Dec 05 22:51:22 crc kubenswrapper[4747]: I1205 22:51:22.809629 4747 generic.go:334] "Generic (PLEG): container finished" podID="fa8817f9-6605-4396-94a2-4a2106fa6724" containerID="451b844379a6d31dee1254887cc756a2ad9f9cfef172f5fbcb72d039efc956c0" exitCode=0 Dec 05 22:51:22 crc kubenswrapper[4747]: I1205 22:51:22.809775 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-m97hr" 
event={"ID":"fa8817f9-6605-4396-94a2-4a2106fa6724","Type":"ContainerDied","Data":"451b844379a6d31dee1254887cc756a2ad9f9cfef172f5fbcb72d039efc956c0"} Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.306512 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-m97hr" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.425312 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fa8817f9-6605-4396-94a2-4a2106fa6724-ssh-key-openstack-cell1\") pod \"fa8817f9-6605-4396-94a2-4a2106fa6724\" (UID: \"fa8817f9-6605-4396-94a2-4a2106fa6724\") " Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.425437 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/fa8817f9-6605-4396-94a2-4a2106fa6724-inventory-0\") pod \"fa8817f9-6605-4396-94a2-4a2106fa6724\" (UID: \"fa8817f9-6605-4396-94a2-4a2106fa6724\") " Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.425517 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49296\" (UniqueName: \"kubernetes.io/projected/fa8817f9-6605-4396-94a2-4a2106fa6724-kube-api-access-49296\") pod \"fa8817f9-6605-4396-94a2-4a2106fa6724\" (UID: \"fa8817f9-6605-4396-94a2-4a2106fa6724\") " Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.431287 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa8817f9-6605-4396-94a2-4a2106fa6724-kube-api-access-49296" (OuterVolumeSpecName: "kube-api-access-49296") pod "fa8817f9-6605-4396-94a2-4a2106fa6724" (UID: "fa8817f9-6605-4396-94a2-4a2106fa6724"). InnerVolumeSpecName "kube-api-access-49296". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.454952 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8817f9-6605-4396-94a2-4a2106fa6724-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "fa8817f9-6605-4396-94a2-4a2106fa6724" (UID: "fa8817f9-6605-4396-94a2-4a2106fa6724"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.480138 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8817f9-6605-4396-94a2-4a2106fa6724-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "fa8817f9-6605-4396-94a2-4a2106fa6724" (UID: "fa8817f9-6605-4396-94a2-4a2106fa6724"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.527926 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fa8817f9-6605-4396-94a2-4a2106fa6724-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.527959 4747 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/fa8817f9-6605-4396-94a2-4a2106fa6724-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.527969 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49296\" (UniqueName: \"kubernetes.io/projected/fa8817f9-6605-4396-94a2-4a2106fa6724-kube-api-access-49296\") on node \"crc\" DevicePath \"\"" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.833277 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-m97hr" event={"ID":"fa8817f9-6605-4396-94a2-4a2106fa6724","Type":"ContainerDied","Data":"d58d0a9e0ec089ddf825f441add59aa7c4fa2cf63ca080e172bb4ee6116f3b57"} Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.833338 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d58d0a9e0ec089ddf825f441add59aa7c4fa2cf63ca080e172bb4ee6116f3b57" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.833356 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-m97hr" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.927647 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-bbmn4"] Dec 05 22:51:24 crc kubenswrapper[4747]: E1205 22:51:24.928269 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8817f9-6605-4396-94a2-4a2106fa6724" containerName="ssh-known-hosts-openstack" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.928296 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8817f9-6605-4396-94a2-4a2106fa6724" containerName="ssh-known-hosts-openstack" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.928570 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8817f9-6605-4396-94a2-4a2106fa6724" containerName="ssh-known-hosts-openstack" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.929522 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-bbmn4" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.933993 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.936420 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-bbmn4"] Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.944686 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.946048 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:51:24 crc kubenswrapper[4747]: I1205 22:51:24.946230 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:51:25 crc kubenswrapper[4747]: I1205 22:51:25.040513 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlhrm\" (UniqueName: \"kubernetes.io/projected/91fc9e2a-45ad-4216-a340-84c55cba490e-kube-api-access-xlhrm\") pod \"run-os-openstack-openstack-cell1-bbmn4\" (UID: \"91fc9e2a-45ad-4216-a340-84c55cba490e\") " pod="openstack/run-os-openstack-openstack-cell1-bbmn4" Dec 05 22:51:25 crc kubenswrapper[4747]: I1205 22:51:25.040826 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91fc9e2a-45ad-4216-a340-84c55cba490e-inventory\") pod \"run-os-openstack-openstack-cell1-bbmn4\" (UID: \"91fc9e2a-45ad-4216-a340-84c55cba490e\") " pod="openstack/run-os-openstack-openstack-cell1-bbmn4" Dec 05 22:51:25 crc kubenswrapper[4747]: I1205 22:51:25.040974 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91fc9e2a-45ad-4216-a340-84c55cba490e-ssh-key\") pod \"run-os-openstack-openstack-cell1-bbmn4\" (UID: \"91fc9e2a-45ad-4216-a340-84c55cba490e\") " pod="openstack/run-os-openstack-openstack-cell1-bbmn4" Dec 05 22:51:25 crc kubenswrapper[4747]: I1205 22:51:25.143907 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlhrm\" (UniqueName: \"kubernetes.io/projected/91fc9e2a-45ad-4216-a340-84c55cba490e-kube-api-access-xlhrm\") pod \"run-os-openstack-openstack-cell1-bbmn4\" (UID: \"91fc9e2a-45ad-4216-a340-84c55cba490e\") " pod="openstack/run-os-openstack-openstack-cell1-bbmn4" Dec 05 22:51:25 crc kubenswrapper[4747]: I1205 22:51:25.144246 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91fc9e2a-45ad-4216-a340-84c55cba490e-inventory\") pod \"run-os-openstack-openstack-cell1-bbmn4\" (UID: \"91fc9e2a-45ad-4216-a340-84c55cba490e\") " pod="openstack/run-os-openstack-openstack-cell1-bbmn4" Dec 05 22:51:25 crc kubenswrapper[4747]: I1205 22:51:25.144371 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91fc9e2a-45ad-4216-a340-84c55cba490e-ssh-key\") pod \"run-os-openstack-openstack-cell1-bbmn4\" (UID: \"91fc9e2a-45ad-4216-a340-84c55cba490e\") " pod="openstack/run-os-openstack-openstack-cell1-bbmn4" Dec 05 22:51:25 crc kubenswrapper[4747]: I1205 22:51:25.149435 4747 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91fc9e2a-45ad-4216-a340-84c55cba490e-inventory\") pod \"run-os-openstack-openstack-cell1-bbmn4\" (UID: \"91fc9e2a-45ad-4216-a340-84c55cba490e\") " pod="openstack/run-os-openstack-openstack-cell1-bbmn4" Dec 05 22:51:25 crc kubenswrapper[4747]: I1205 22:51:25.151431 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91fc9e2a-45ad-4216-a340-84c55cba490e-ssh-key\") pod \"run-os-openstack-openstack-cell1-bbmn4\" (UID: \"91fc9e2a-45ad-4216-a340-84c55cba490e\") " pod="openstack/run-os-openstack-openstack-cell1-bbmn4" Dec 05 22:51:25 crc kubenswrapper[4747]: I1205 22:51:25.177163 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlhrm\" (UniqueName: \"kubernetes.io/projected/91fc9e2a-45ad-4216-a340-84c55cba490e-kube-api-access-xlhrm\") pod \"run-os-openstack-openstack-cell1-bbmn4\" (UID: \"91fc9e2a-45ad-4216-a340-84c55cba490e\") " pod="openstack/run-os-openstack-openstack-cell1-bbmn4" Dec 05 22:51:25 crc kubenswrapper[4747]: I1205 22:51:25.248198 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-bbmn4" Dec 05 22:51:25 crc kubenswrapper[4747]: I1205 22:51:25.813336 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-bbmn4"] Dec 05 22:51:25 crc kubenswrapper[4747]: I1205 22:51:25.863927 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-bbmn4" event={"ID":"91fc9e2a-45ad-4216-a340-84c55cba490e","Type":"ContainerStarted","Data":"fe95283aaf6b640ae720419dc6a0916af0fd0e1b84e11077f9386992ce06a9f2"} Dec 05 22:51:26 crc kubenswrapper[4747]: I1205 22:51:26.863970 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-bbmn4" event={"ID":"91fc9e2a-45ad-4216-a340-84c55cba490e","Type":"ContainerStarted","Data":"868ae40505edbb56077e237ee5535aa3d021df80375b74c7d1ac737b3e4bdcba"} Dec 05 22:51:26 crc kubenswrapper[4747]: I1205 22:51:26.889639 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-bbmn4" podStartSLOduration=2.412428421 podStartE2EDuration="2.889620647s" podCreationTimestamp="2025-12-05 22:51:24 +0000 UTC" firstStartedPulling="2025-12-05 22:51:25.827020662 +0000 UTC m=+7756.294328150" lastFinishedPulling="2025-12-05 22:51:26.304212858 +0000 UTC m=+7756.771520376" observedRunningTime="2025-12-05 22:51:26.880890666 +0000 UTC m=+7757.348198164" watchObservedRunningTime="2025-12-05 22:51:26.889620647 +0000 UTC m=+7757.356928155" Dec 05 22:51:35 crc kubenswrapper[4747]: I1205 22:51:35.948468 4747 generic.go:334] "Generic (PLEG): container finished" podID="91fc9e2a-45ad-4216-a340-84c55cba490e" containerID="868ae40505edbb56077e237ee5535aa3d021df80375b74c7d1ac737b3e4bdcba" exitCode=0 Dec 05 22:51:35 crc kubenswrapper[4747]: I1205 22:51:35.948560 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-bbmn4" event={"ID":"91fc9e2a-45ad-4216-a340-84c55cba490e","Type":"ContainerDied","Data":"868ae40505edbb56077e237ee5535aa3d021df80375b74c7d1ac737b3e4bdcba"} Dec 05 22:51:36 crc kubenswrapper[4747]: I1205 22:51:36.222048 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:51:36 crc kubenswrapper[4747]: I1205 22:51:36.222352 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:51:36 crc kubenswrapper[4747]: I1205 22:51:36.222486 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 22:51:36 crc kubenswrapper[4747]: I1205 22:51:36.223256 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 22:51:36 crc kubenswrapper[4747]: I1205 22:51:36.223383 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" gracePeriod=600 Dec 05 22:51:36 crc kubenswrapper[4747]: E1205 22:51:36.414384 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:51:36 crc kubenswrapper[4747]: I1205 22:51:36.963663 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" exitCode=0 Dec 05 22:51:36 crc kubenswrapper[4747]: I1205 22:51:36.963721 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86"} Dec 05 22:51:36 crc kubenswrapper[4747]: I1205 22:51:36.963791 4747 scope.go:117] "RemoveContainer" containerID="336bd8beb0853acbee4eef9ccb58a53280fc0729cd7691408511293f6724672a" Dec 05 22:51:36 crc kubenswrapper[4747]: I1205 22:51:36.965174 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:51:36 crc kubenswrapper[4747]: E1205 22:51:36.965618 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:51:37 crc kubenswrapper[4747]: I1205 22:51:37.468255 4747 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-bbmn4" Dec 05 22:51:37 crc kubenswrapper[4747]: I1205 22:51:37.553485 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91fc9e2a-45ad-4216-a340-84c55cba490e-inventory\") pod \"91fc9e2a-45ad-4216-a340-84c55cba490e\" (UID: \"91fc9e2a-45ad-4216-a340-84c55cba490e\") " Dec 05 22:51:37 crc kubenswrapper[4747]: I1205 22:51:37.553549 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlhrm\" (UniqueName: \"kubernetes.io/projected/91fc9e2a-45ad-4216-a340-84c55cba490e-kube-api-access-xlhrm\") pod \"91fc9e2a-45ad-4216-a340-84c55cba490e\" (UID: \"91fc9e2a-45ad-4216-a340-84c55cba490e\") " Dec 05 22:51:37 crc kubenswrapper[4747]: I1205 22:51:37.553604 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91fc9e2a-45ad-4216-a340-84c55cba490e-ssh-key\") pod \"91fc9e2a-45ad-4216-a340-84c55cba490e\" (UID: \"91fc9e2a-45ad-4216-a340-84c55cba490e\") " Dec 05 22:51:37 crc kubenswrapper[4747]: I1205 22:51:37.561482 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91fc9e2a-45ad-4216-a340-84c55cba490e-kube-api-access-xlhrm" (OuterVolumeSpecName: "kube-api-access-xlhrm") pod "91fc9e2a-45ad-4216-a340-84c55cba490e" (UID: "91fc9e2a-45ad-4216-a340-84c55cba490e"). InnerVolumeSpecName "kube-api-access-xlhrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:51:37 crc kubenswrapper[4747]: I1205 22:51:37.588758 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91fc9e2a-45ad-4216-a340-84c55cba490e-inventory" (OuterVolumeSpecName: "inventory") pod "91fc9e2a-45ad-4216-a340-84c55cba490e" (UID: "91fc9e2a-45ad-4216-a340-84c55cba490e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:51:37 crc kubenswrapper[4747]: I1205 22:51:37.597658 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91fc9e2a-45ad-4216-a340-84c55cba490e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "91fc9e2a-45ad-4216-a340-84c55cba490e" (UID: "91fc9e2a-45ad-4216-a340-84c55cba490e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:51:37 crc kubenswrapper[4747]: I1205 22:51:37.655664 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91fc9e2a-45ad-4216-a340-84c55cba490e-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 22:51:37 crc kubenswrapper[4747]: I1205 22:51:37.655697 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlhrm\" (UniqueName: \"kubernetes.io/projected/91fc9e2a-45ad-4216-a340-84c55cba490e-kube-api-access-xlhrm\") on node \"crc\" DevicePath \"\"" Dec 05 22:51:37 crc kubenswrapper[4747]: I1205 22:51:37.659370 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91fc9e2a-45ad-4216-a340-84c55cba490e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:51:37 crc kubenswrapper[4747]: I1205 22:51:37.977434 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-bbmn4" event={"ID":"91fc9e2a-45ad-4216-a340-84c55cba490e","Type":"ContainerDied","Data":"fe95283aaf6b640ae720419dc6a0916af0fd0e1b84e11077f9386992ce06a9f2"} Dec 05 22:51:37 crc kubenswrapper[4747]: I1205 22:51:37.977473 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe95283aaf6b640ae720419dc6a0916af0fd0e1b84e11077f9386992ce06a9f2" Dec 05 22:51:37 crc kubenswrapper[4747]: I1205 22:51:37.977495 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-bbmn4" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.076106 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-8crf9"] Dec 05 22:51:38 crc kubenswrapper[4747]: E1205 22:51:38.076558 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91fc9e2a-45ad-4216-a340-84c55cba490e" containerName="run-os-openstack-openstack-cell1" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.076573 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fc9e2a-45ad-4216-a340-84c55cba490e" containerName="run-os-openstack-openstack-cell1" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.076823 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="91fc9e2a-45ad-4216-a340-84c55cba490e" containerName="run-os-openstack-openstack-cell1" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.077526 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.079736 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.080161 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.080833 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.082336 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.086239 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-8crf9"] Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.168158 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-inventory\") pod \"reboot-os-openstack-openstack-cell1-8crf9\" (UID: \"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb\") " pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.168447 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-8crf9\" (UID: \"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb\") " pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.168489 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgl68\" (UniqueName: \"kubernetes.io/projected/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-kube-api-access-hgl68\") pod \"reboot-os-openstack-openstack-cell1-8crf9\" (UID: \"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb\") " pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.270410 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-inventory\") pod \"reboot-os-openstack-openstack-cell1-8crf9\" (UID: \"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb\") " pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.270693 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-8crf9\" (UID: \"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb\") " pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.270733 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgl68\" (UniqueName: \"kubernetes.io/projected/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-kube-api-access-hgl68\") pod \"reboot-os-openstack-openstack-cell1-8crf9\" (UID: \"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb\") " pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.278939 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-inventory\") pod \"reboot-os-openstack-openstack-cell1-8crf9\" (UID: \"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb\") " pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.279503 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-8crf9\" (UID: \"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb\") " pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.293574 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgl68\" (UniqueName: \"kubernetes.io/projected/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-kube-api-access-hgl68\") pod \"reboot-os-openstack-openstack-cell1-8crf9\" (UID: \"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb\") " pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.415784 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" Dec 05 22:51:38 crc kubenswrapper[4747]: W1205 22:51:38.930315 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6a9b5e3_e724_4bf3_8d1d_f99a1b8fdedb.slice/crio-033910b999dbfc577efda320a80e83922aadd68cdedb70cbf9ee250a8ac98053 WatchSource:0}: Error finding container 033910b999dbfc577efda320a80e83922aadd68cdedb70cbf9ee250a8ac98053: Status 404 returned error can't find the container with id 033910b999dbfc577efda320a80e83922aadd68cdedb70cbf9ee250a8ac98053 Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.938771 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-8crf9"] Dec 05 22:51:38 crc kubenswrapper[4747]: I1205 22:51:38.989482 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" event={"ID":"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb","Type":"ContainerStarted","Data":"033910b999dbfc577efda320a80e83922aadd68cdedb70cbf9ee250a8ac98053"} Dec 05 22:51:40 crc kubenswrapper[4747]: I1205 22:51:40.004858 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" event={"ID":"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb","Type":"ContainerStarted","Data":"d44a3fc08709225fc53abec3ea35d73b6400c8a96e89068aa9201454143d9172"} Dec 05 22:51:40 crc kubenswrapper[4747]: I1205 22:51:40.028398 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" podStartSLOduration=1.625264416 podStartE2EDuration="2.028380783s" podCreationTimestamp="2025-12-05 22:51:38 +0000 UTC" firstStartedPulling="2025-12-05 22:51:38.934612593 +0000 UTC m=+7769.401920081" lastFinishedPulling="2025-12-05 22:51:39.33772896 +0000 UTC m=+7769.805036448" observedRunningTime="2025-12-05 22:51:40.027613544 +0000 UTC m=+7770.494921062" watchObservedRunningTime="2025-12-05 22:51:40.028380783 +0000 UTC m=+7770.495688271" Dec 05 22:51:52 crc kubenswrapper[4747]: I1205 22:51:52.840087 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:51:52 crc 
kubenswrapper[4747]: E1205 22:51:52.841073 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:51:55 crc kubenswrapper[4747]: I1205 22:51:55.153117 4747 generic.go:334] "Generic (PLEG): container finished" podID="f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb" containerID="d44a3fc08709225fc53abec3ea35d73b6400c8a96e89068aa9201454143d9172" exitCode=0 Dec 05 22:51:55 crc kubenswrapper[4747]: I1205 22:51:55.153222 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" event={"ID":"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb","Type":"ContainerDied","Data":"d44a3fc08709225fc53abec3ea35d73b6400c8a96e89068aa9201454143d9172"} Dec 05 22:51:56 crc kubenswrapper[4747]: I1205 22:51:56.690491 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" Dec 05 22:51:56 crc kubenswrapper[4747]: I1205 22:51:56.772898 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-ssh-key\") pod \"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb\" (UID: \"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb\") " Dec 05 22:51:56 crc kubenswrapper[4747]: I1205 22:51:56.773062 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-inventory\") pod \"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb\" (UID: \"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb\") " Dec 05 22:51:56 crc kubenswrapper[4747]: I1205 22:51:56.773233 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgl68\" (UniqueName: \"kubernetes.io/projected/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-kube-api-access-hgl68\") pod \"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb\" (UID: \"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb\") " Dec 05 22:51:56 crc kubenswrapper[4747]: I1205 22:51:56.781227 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-kube-api-access-hgl68" (OuterVolumeSpecName: "kube-api-access-hgl68") pod "f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb" (UID: "f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb"). InnerVolumeSpecName "kube-api-access-hgl68". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:51:56 crc kubenswrapper[4747]: I1205 22:51:56.807232 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb" (UID: "f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:51:56 crc kubenswrapper[4747]: I1205 22:51:56.820767 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-inventory" (OuterVolumeSpecName: "inventory") pod "f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb" (UID: "f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:51:56 crc kubenswrapper[4747]: I1205 22:51:56.879545 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 22:51:56 crc kubenswrapper[4747]: I1205 22:51:56.879575 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgl68\" (UniqueName: \"kubernetes.io/projected/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-kube-api-access-hgl68\") on node \"crc\" DevicePath \"\"" Dec 05 22:51:56 crc kubenswrapper[4747]: I1205 22:51:56.879606 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.180643 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" event={"ID":"f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb","Type":"ContainerDied","Data":"033910b999dbfc577efda320a80e83922aadd68cdedb70cbf9ee250a8ac98053"} Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.181260 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="033910b999dbfc577efda320a80e83922aadd68cdedb70cbf9ee250a8ac98053" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.180703 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-8crf9" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.267403 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-5psgx"] Dec 05 22:51:57 crc kubenswrapper[4747]: E1205 22:51:57.268176 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb" containerName="reboot-os-openstack-openstack-cell1" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.268293 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb" containerName="reboot-os-openstack-openstack-cell1" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.268665 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb" containerName="reboot-os-openstack-openstack-cell1" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.269756 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.273137 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-neutron-metadata-default-certs-0" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.273292 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.273662 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-telemetry-default-certs-0" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.273539 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.273618 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-libvirt-default-certs-0" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.274613 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.276765 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-ovn-default-certs-0" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.277165 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.278192 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-5psgx"] Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.392411 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.392870 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.392908 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.392998 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " 
pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.393031 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp4bv\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-kube-api-access-dp4bv\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.393050 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-ssh-key\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.393080 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.393105 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.393136 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.393160 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.393185 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.393217 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.393237 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-inventory\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.393284 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.393330 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.495398 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.495746 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.495852 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.495954 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" 
Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.496051 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.496662 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.496928 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-inventory\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.497532 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.497786 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.498027 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.498165 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.498334 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " 
pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.498611 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.498781 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-ssh-key\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.498957 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp4bv\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-kube-api-access-dp4bv\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.501923 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.502781 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.502975 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.503256 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.503560 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: 
\"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.504604 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.505264 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.505830 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.506492 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.507275 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.508768 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.510175 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.510665 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-ssh-key\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") 
" pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.516289 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-inventory\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.526343 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp4bv\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-kube-api-access-dp4bv\") pod \"install-certs-openstack-openstack-cell1-5psgx\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:57 crc kubenswrapper[4747]: I1205 22:51:57.605205 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:51:58 crc kubenswrapper[4747]: I1205 22:51:58.227217 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-5psgx"] Dec 05 22:51:59 crc kubenswrapper[4747]: I1205 22:51:59.216824 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-5psgx" event={"ID":"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3","Type":"ContainerStarted","Data":"c3e5c5dd1593a44edfa43021e5552187bb25258de79686621a811a453699c66d"} Dec 05 22:51:59 crc kubenswrapper[4747]: I1205 22:51:59.217182 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-5psgx" event={"ID":"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3","Type":"ContainerStarted","Data":"ab2d474caa3d18647ee78ea1233487d2e6eb28901c03a666b70125147449a01a"} Dec 05 22:51:59 crc kubenswrapper[4747]: I1205 22:51:59.242683 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-5psgx" podStartSLOduration=1.813640462 podStartE2EDuration="2.242664874s" podCreationTimestamp="2025-12-05 22:51:57 +0000 UTC" firstStartedPulling="2025-12-05 22:51:58.232354872 +0000 UTC m=+7788.699662360" lastFinishedPulling="2025-12-05 22:51:58.661379264 +0000 UTC m=+7789.128686772" observedRunningTime="2025-12-05 22:51:59.233864642 +0000 UTC m=+7789.701172140" watchObservedRunningTime="2025-12-05 22:51:59.242664874 +0000 UTC m=+7789.709972362" Dec 05 22:52:04 crc kubenswrapper[4747]: I1205 22:52:04.840116 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:52:04 crc kubenswrapper[4747]: E1205 22:52:04.840848 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:52:17 crc kubenswrapper[4747]: I1205 22:52:17.840199 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:52:17 crc kubenswrapper[4747]: E1205 22:52:17.841231 4747 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:52:32 crc kubenswrapper[4747]: I1205 22:52:32.840023 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:52:32 crc kubenswrapper[4747]: E1205 22:52:32.840952 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:52:37 crc kubenswrapper[4747]: I1205 22:52:37.668341 4747 generic.go:334] "Generic (PLEG): container finished" podID="312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" containerID="c3e5c5dd1593a44edfa43021e5552187bb25258de79686621a811a453699c66d" exitCode=0 Dec 05 22:52:37 crc kubenswrapper[4747]: I1205 22:52:37.668385 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-5psgx" event={"ID":"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3","Type":"ContainerDied","Data":"c3e5c5dd1593a44edfa43021e5552187bb25258de79686621a811a453699c66d"} Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.117496 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.210270 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-ovn-default-certs-0\") pod \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.210333 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-dhcp-combined-ca-bundle\") pod \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.210440 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-telemetry-default-certs-0\") pod \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.210500 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-nova-combined-ca-bundle\") pod \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.210530 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-neutron-metadata-default-certs-0\") pod \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.210546 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-inventory\") pod \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.211328 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-bootstrap-combined-ca-bundle\") pod \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.211375 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-ovn-combined-ca-bundle\") pod \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.211404 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-metadata-combined-ca-bundle\") pod \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.211425 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-libvirt-combined-ca-bundle\") pod \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.211500 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-sriov-combined-ca-bundle\") pod \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.211522 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-libvirt-default-certs-0\") pod \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.211569 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp4bv\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-kube-api-access-dp4bv\") pod \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.211616 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-ssh-key\") pod 
\"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.211650 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-telemetry-combined-ca-bundle\") pod \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\" (UID: \"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3\") " Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.217402 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" (UID: "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.217456 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-ovn-default-certs-0") pod "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" (UID: "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3"). InnerVolumeSpecName "openstack-cell1-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.217506 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" (UID: "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.217725 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-telemetry-default-certs-0") pod "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" (UID: "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3"). InnerVolumeSpecName "openstack-cell1-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.218122 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-neutron-metadata-default-certs-0") pod "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" (UID: "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3"). InnerVolumeSpecName "openstack-cell1-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.218479 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" (UID: "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.218565 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" (UID: "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.218614 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" (UID: "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.219679 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" (UID: "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.220603 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-kube-api-access-dp4bv" (OuterVolumeSpecName: "kube-api-access-dp4bv") pod "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" (UID: "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3"). InnerVolumeSpecName "kube-api-access-dp4bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.221311 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" (UID: "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.221405 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" (UID: "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.223366 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-libvirt-default-certs-0") pod "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" (UID: "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3"). InnerVolumeSpecName "openstack-cell1-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.244118 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" (UID: "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.244214 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-inventory" (OuterVolumeSpecName: "inventory") pod "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" (UID: "312cd54a-2efe-4a57-9e2d-5f295ebfbbb3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.313758 4747 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.313793 4747 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.313804 4747 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.313815 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.313824 4747 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.313849 4747 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.313859 4747 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.313869 4747 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.313877 4747 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.313887 4747 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.313897 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp4bv\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-kube-api-access-dp4bv\") on node \"crc\" DevicePath \"\"" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.313907 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.313916 4747 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.313923 4747 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-openstack-cell1-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.313933 4747 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312cd54a-2efe-4a57-9e2d-5f295ebfbbb3-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.694341 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-5psgx" event={"ID":"312cd54a-2efe-4a57-9e2d-5f295ebfbbb3","Type":"ContainerDied","Data":"ab2d474caa3d18647ee78ea1233487d2e6eb28901c03a666b70125147449a01a"} Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.694752 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab2d474caa3d18647ee78ea1233487d2e6eb28901c03a666b70125147449a01a" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.694414 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-5psgx" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.825440 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-n2p5x"] Dec 05 22:52:39 crc kubenswrapper[4747]: E1205 22:52:39.825926 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" containerName="install-certs-openstack-openstack-cell1" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.825945 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" containerName="install-certs-openstack-openstack-cell1" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.826157 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="312cd54a-2efe-4a57-9e2d-5f295ebfbbb3" containerName="install-certs-openstack-openstack-cell1" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.826867 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.830362 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.830534 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.830686 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.830886 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.831034 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.865054 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-n2p5x"] Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.949501 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0937b5af-022f-433a-8c04-71d2b729f11d-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-n2p5x\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.949542 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-ssh-key\") pod \"ovn-openstack-openstack-cell1-n2p5x\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.949611 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drcst\" (UniqueName: \"kubernetes.io/projected/0937b5af-022f-433a-8c04-71d2b729f11d-kube-api-access-drcst\") pod \"ovn-openstack-openstack-cell1-n2p5x\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.949643 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-n2p5x\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:39 crc kubenswrapper[4747]: I1205 22:52:39.949668 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-inventory\") pod \"ovn-openstack-openstack-cell1-n2p5x\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:40 crc kubenswrapper[4747]: I1205 22:52:40.050959 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0937b5af-022f-433a-8c04-71d2b729f11d-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-n2p5x\" 
(UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:40 crc kubenswrapper[4747]: I1205 22:52:40.051013 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-ssh-key\") pod \"ovn-openstack-openstack-cell1-n2p5x\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:40 crc kubenswrapper[4747]: I1205 22:52:40.051092 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drcst\" (UniqueName: \"kubernetes.io/projected/0937b5af-022f-433a-8c04-71d2b729f11d-kube-api-access-drcst\") pod \"ovn-openstack-openstack-cell1-n2p5x\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:40 crc kubenswrapper[4747]: I1205 22:52:40.051141 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-n2p5x\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:40 crc kubenswrapper[4747]: I1205 22:52:40.051172 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-inventory\") pod \"ovn-openstack-openstack-cell1-n2p5x\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:40 crc kubenswrapper[4747]: I1205 22:52:40.053201 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0937b5af-022f-433a-8c04-71d2b729f11d-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-n2p5x\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:40 crc kubenswrapper[4747]: I1205 22:52:40.056992 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-inventory\") pod \"ovn-openstack-openstack-cell1-n2p5x\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:40 crc kubenswrapper[4747]: I1205 22:52:40.057213 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-ssh-key\") pod \"ovn-openstack-openstack-cell1-n2p5x\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:40 crc kubenswrapper[4747]: I1205 22:52:40.058310 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-n2p5x\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:40 crc kubenswrapper[4747]: I1205 22:52:40.067219 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drcst\" (UniqueName: \"kubernetes.io/projected/0937b5af-022f-433a-8c04-71d2b729f11d-kube-api-access-drcst\") 
pod \"ovn-openstack-openstack-cell1-n2p5x\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:40 crc kubenswrapper[4747]: I1205 22:52:40.159740 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:52:40 crc kubenswrapper[4747]: I1205 22:52:40.731599 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-n2p5x"] Dec 05 22:52:41 crc kubenswrapper[4747]: I1205 22:52:41.716419 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-n2p5x" event={"ID":"0937b5af-022f-433a-8c04-71d2b729f11d","Type":"ContainerStarted","Data":"510057ead2c5418686b31373f313950be96bddafb4bcb5362260de093d405043"} Dec 05 22:52:41 crc kubenswrapper[4747]: I1205 22:52:41.717252 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-n2p5x" event={"ID":"0937b5af-022f-433a-8c04-71d2b729f11d","Type":"ContainerStarted","Data":"fd1694c2e600074037b2c01a7b9d78afed83224e8ce724d3f9cc38c256c3d680"} Dec 05 22:52:41 crc kubenswrapper[4747]: I1205 22:52:41.746194 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-n2p5x" podStartSLOduration=2.269779428 podStartE2EDuration="2.746165735s" podCreationTimestamp="2025-12-05 22:52:39 +0000 UTC" firstStartedPulling="2025-12-05 22:52:40.737330087 +0000 UTC m=+7831.204637575" lastFinishedPulling="2025-12-05 22:52:41.213716394 +0000 UTC m=+7831.681023882" observedRunningTime="2025-12-05 22:52:41.739833382 +0000 UTC m=+7832.207140890" watchObservedRunningTime="2025-12-05 22:52:41.746165735 +0000 UTC m=+7832.213473263" Dec 05 22:52:43 crc kubenswrapper[4747]: I1205 22:52:43.840353 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:52:43 crc kubenswrapper[4747]: E1205 22:52:43.841271 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:52:57 crc kubenswrapper[4747]: I1205 22:52:57.840341 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:52:57 crc kubenswrapper[4747]: E1205 22:52:57.841542 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:53:09 crc kubenswrapper[4747]: I1205 22:53:09.847287 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:53:09 crc kubenswrapper[4747]: E1205 22:53:09.848143 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:53:24 crc kubenswrapper[4747]: I1205 22:53:24.840519 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:53:24 crc kubenswrapper[4747]: E1205 22:53:24.841570 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:53:37 crc kubenswrapper[4747]: I1205 22:53:37.845636 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:53:37 crc kubenswrapper[4747]: E1205 22:53:37.846614 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:53:49 crc kubenswrapper[4747]: I1205 22:53:49.579650 4747 generic.go:334] "Generic (PLEG): container finished" podID="0937b5af-022f-433a-8c04-71d2b729f11d" containerID="510057ead2c5418686b31373f313950be96bddafb4bcb5362260de093d405043" exitCode=0 Dec 05 22:53:49 crc kubenswrapper[4747]: I1205 22:53:49.579717 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-n2p5x" event={"ID":"0937b5af-022f-433a-8c04-71d2b729f11d","Type":"ContainerDied","Data":"510057ead2c5418686b31373f313950be96bddafb4bcb5362260de093d405043"} Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.067940 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.236502 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-ssh-key\") pod \"0937b5af-022f-433a-8c04-71d2b729f11d\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.236568 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drcst\" (UniqueName: \"kubernetes.io/projected/0937b5af-022f-433a-8c04-71d2b729f11d-kube-api-access-drcst\") pod \"0937b5af-022f-433a-8c04-71d2b729f11d\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.236651 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0937b5af-022f-433a-8c04-71d2b729f11d-ovncontroller-config-0\") pod \"0937b5af-022f-433a-8c04-71d2b729f11d\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.236684 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-inventory\") pod \"0937b5af-022f-433a-8c04-71d2b729f11d\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.236933 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-ovn-combined-ca-bundle\") pod \"0937b5af-022f-433a-8c04-71d2b729f11d\" (UID: \"0937b5af-022f-433a-8c04-71d2b729f11d\") " Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.241994 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0937b5af-022f-433a-8c04-71d2b729f11d" (UID: "0937b5af-022f-433a-8c04-71d2b729f11d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.244696 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0937b5af-022f-433a-8c04-71d2b729f11d-kube-api-access-drcst" (OuterVolumeSpecName: "kube-api-access-drcst") pod "0937b5af-022f-433a-8c04-71d2b729f11d" (UID: "0937b5af-022f-433a-8c04-71d2b729f11d"). InnerVolumeSpecName "kube-api-access-drcst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.275006 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-inventory" (OuterVolumeSpecName: "inventory") pod "0937b5af-022f-433a-8c04-71d2b729f11d" (UID: "0937b5af-022f-433a-8c04-71d2b729f11d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.277687 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0937b5af-022f-433a-8c04-71d2b729f11d-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "0937b5af-022f-433a-8c04-71d2b729f11d" (UID: "0937b5af-022f-433a-8c04-71d2b729f11d"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.299776 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0937b5af-022f-433a-8c04-71d2b729f11d" (UID: "0937b5af-022f-433a-8c04-71d2b729f11d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.340422 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.340464 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drcst\" (UniqueName: \"kubernetes.io/projected/0937b5af-022f-433a-8c04-71d2b729f11d-kube-api-access-drcst\") on node \"crc\" DevicePath \"\"" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.340484 4747 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0937b5af-022f-433a-8c04-71d2b729f11d-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.340497 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.340508 4747 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0937b5af-022f-433a-8c04-71d2b729f11d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.609489 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-n2p5x" event={"ID":"0937b5af-022f-433a-8c04-71d2b729f11d","Type":"ContainerDied","Data":"fd1694c2e600074037b2c01a7b9d78afed83224e8ce724d3f9cc38c256c3d680"} Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.609826 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd1694c2e600074037b2c01a7b9d78afed83224e8ce724d3f9cc38c256c3d680" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.609615 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-n2p5x" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.712147 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-ks6lb"] Dec 05 22:53:51 crc kubenswrapper[4747]: E1205 22:53:51.712749 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0937b5af-022f-433a-8c04-71d2b729f11d" containerName="ovn-openstack-openstack-cell1" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.712776 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0937b5af-022f-433a-8c04-71d2b729f11d" containerName="ovn-openstack-openstack-cell1" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.713062 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0937b5af-022f-433a-8c04-71d2b729f11d" containerName="ovn-openstack-openstack-cell1" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.714094 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.717227 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.717379 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.717956 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.718445 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.724879 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.724925 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.732644 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-ks6lb"] Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.849333 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.849462 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.849537 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.849603 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.850386 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sng6\" (UniqueName: \"kubernetes.io/projected/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-kube-api-access-2sng6\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.850480 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.952966 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sng6\" (UniqueName: \"kubernetes.io/projected/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-kube-api-access-2sng6\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.953153 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.953324 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.953557 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.953629 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.953689 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.957045 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.957259 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.957351 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.960018 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.964899 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:51 crc kubenswrapper[4747]: I1205 22:53:51.978685 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sng6\" (UniqueName: \"kubernetes.io/projected/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-kube-api-access-2sng6\") pod \"neutron-metadata-openstack-openstack-cell1-ks6lb\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:52 crc kubenswrapper[4747]: I1205 22:53:52.056630 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:53:52 crc kubenswrapper[4747]: I1205 22:53:52.610885 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-ks6lb"] Dec 05 22:53:52 crc kubenswrapper[4747]: I1205 22:53:52.840518 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:53:52 crc kubenswrapper[4747]: E1205 22:53:52.841097 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:53:53 crc kubenswrapper[4747]: I1205 22:53:53.649625 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" event={"ID":"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70","Type":"ContainerStarted","Data":"c83ad66a7b467700b8746e49bf95812edbdc146527990c871ec0e09de2f0e4c6"} Dec 05 22:53:53 crc kubenswrapper[4747]: I1205 22:53:53.649923 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" event={"ID":"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70","Type":"ContainerStarted","Data":"925d17e40042f658c3fc8c15cce11365357483df961cd9248127e5dc3f6a19b2"} Dec 05 22:53:53 crc kubenswrapper[4747]: I1205 22:53:53.670971 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" podStartSLOduration=2.173888835 podStartE2EDuration="2.670953862s" podCreationTimestamp="2025-12-05 22:53:51 +0000 UTC" firstStartedPulling="2025-12-05 22:53:52.63294182 +0000 UTC m=+7903.100249308" lastFinishedPulling="2025-12-05 22:53:53.130006847 +0000 UTC m=+7903.597314335" observedRunningTime="2025-12-05 22:53:53.66588989 +0000 UTC m=+7904.133197388" watchObservedRunningTime="2025-12-05 22:53:53.670953862 +0000 UTC m=+7904.138261350" Dec 05 22:54:03 crc kubenswrapper[4747]: I1205 22:54:03.870243 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:54:03 crc kubenswrapper[4747]: E1205 22:54:03.871043 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:54:14 crc kubenswrapper[4747]: I1205 22:54:14.840687 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:54:14 crc kubenswrapper[4747]: E1205 22:54:14.842004 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:54:25 crc kubenswrapper[4747]: I1205 22:54:25.841897 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:54:25 crc kubenswrapper[4747]: E1205 22:54:25.842907 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:54:40 crc kubenswrapper[4747]: I1205 22:54:40.844670 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:54:40 crc kubenswrapper[4747]: E1205 22:54:40.845465 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:54:48 crc kubenswrapper[4747]: I1205 22:54:48.276747 4747 generic.go:334] "Generic (PLEG): container finished" podID="3b5bf5c5-a366-40d6-a7fc-1c4517b83f70" containerID="c83ad66a7b467700b8746e49bf95812edbdc146527990c871ec0e09de2f0e4c6" exitCode=0 Dec 05 22:54:48 crc kubenswrapper[4747]: I1205 22:54:48.276800 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" event={"ID":"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70","Type":"ContainerDied","Data":"c83ad66a7b467700b8746e49bf95812edbdc146527990c871ec0e09de2f0e4c6"} Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.719465 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.783166 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-neutron-ovn-metadata-agent-neutron-config-0\") pod \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.783521 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-neutron-metadata-combined-ca-bundle\") pod \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.784019 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-inventory\") pod \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.784272 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-ssh-key\") pod \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.784441 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sng6\" (UniqueName: \"kubernetes.io/projected/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-kube-api-access-2sng6\") pod \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.784722 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-nova-metadata-neutron-config-0\") pod \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\" (UID: \"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70\") " Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.795920 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-kube-api-access-2sng6" (OuterVolumeSpecName: "kube-api-access-2sng6") pod "3b5bf5c5-a366-40d6-a7fc-1c4517b83f70" (UID: "3b5bf5c5-a366-40d6-a7fc-1c4517b83f70"). InnerVolumeSpecName "kube-api-access-2sng6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.801393 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "3b5bf5c5-a366-40d6-a7fc-1c4517b83f70" (UID: "3b5bf5c5-a366-40d6-a7fc-1c4517b83f70"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.816404 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "3b5bf5c5-a366-40d6-a7fc-1c4517b83f70" (UID: "3b5bf5c5-a366-40d6-a7fc-1c4517b83f70"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.826481 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3b5bf5c5-a366-40d6-a7fc-1c4517b83f70" (UID: "3b5bf5c5-a366-40d6-a7fc-1c4517b83f70"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.831764 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "3b5bf5c5-a366-40d6-a7fc-1c4517b83f70" (UID: "3b5bf5c5-a366-40d6-a7fc-1c4517b83f70"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.842938 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-inventory" (OuterVolumeSpecName: "inventory") pod "3b5bf5c5-a366-40d6-a7fc-1c4517b83f70" (UID: "3b5bf5c5-a366-40d6-a7fc-1c4517b83f70"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.887691 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sng6\" (UniqueName: \"kubernetes.io/projected/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-kube-api-access-2sng6\") on node \"crc\" DevicePath \"\"" Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.887967 4747 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.887984 4747 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.887998 4747 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.888013 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 22:54:49 crc kubenswrapper[4747]: I1205 22:54:49.888026 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b5bf5c5-a366-40d6-a7fc-1c4517b83f70-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.303408 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" event={"ID":"3b5bf5c5-a366-40d6-a7fc-1c4517b83f70","Type":"ContainerDied","Data":"925d17e40042f658c3fc8c15cce11365357483df961cd9248127e5dc3f6a19b2"} Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.303457 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="925d17e40042f658c3fc8c15cce11365357483df961cd9248127e5dc3f6a19b2" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.303454 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-ks6lb" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.435855 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-5z757"] Dec 05 22:54:50 crc kubenswrapper[4747]: E1205 22:54:50.436374 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5bf5c5-a366-40d6-a7fc-1c4517b83f70" containerName="neutron-metadata-openstack-openstack-cell1" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.436404 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5bf5c5-a366-40d6-a7fc-1c4517b83f70" containerName="neutron-metadata-openstack-openstack-cell1" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.436694 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5bf5c5-a366-40d6-a7fc-1c4517b83f70" containerName="neutron-metadata-openstack-openstack-cell1" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.437631 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.439889 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.442271 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.442288 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.442536 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.444504 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.463120 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-5z757"] Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.501490 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-ssh-key\") pod \"libvirt-openstack-openstack-cell1-5z757\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.501647 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-inventory\") pod \"libvirt-openstack-openstack-cell1-5z757\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.501721 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df552\" (UniqueName: \"kubernetes.io/projected/1a42be6c-26a1-4081-8df5-9b5ee5d45262-kube-api-access-df552\") pod \"libvirt-openstack-openstack-cell1-5z757\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.501746 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-5z757\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.501774 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-5z757\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.603183 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-ssh-key\") pod \"libvirt-openstack-openstack-cell1-5z757\" 
(UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.603255 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-inventory\") pod \"libvirt-openstack-openstack-cell1-5z757\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.603320 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df552\" (UniqueName: \"kubernetes.io/projected/1a42be6c-26a1-4081-8df5-9b5ee5d45262-kube-api-access-df552\") pod \"libvirt-openstack-openstack-cell1-5z757\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.603349 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-5z757\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.603373 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-5z757\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.606901 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-inventory\") pod \"libvirt-openstack-openstack-cell1-5z757\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.607491 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-ssh-key\") pod \"libvirt-openstack-openstack-cell1-5z757\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.612887 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-5z757\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.612915 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-5z757\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.621296 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df552\" (UniqueName: 
\"kubernetes.io/projected/1a42be6c-26a1-4081-8df5-9b5ee5d45262-kube-api-access-df552\") pod \"libvirt-openstack-openstack-cell1-5z757\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:50 crc kubenswrapper[4747]: I1205 22:54:50.764864 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:54:51 crc kubenswrapper[4747]: W1205 22:54:51.325161 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a42be6c_26a1_4081_8df5_9b5ee5d45262.slice/crio-bcbd1891e7fb974c7bc8eb148eec0e1aa1168497ff1ab1d25dec2db39e1588db WatchSource:0}: Error finding container bcbd1891e7fb974c7bc8eb148eec0e1aa1168497ff1ab1d25dec2db39e1588db: Status 404 returned error can't find the container with id bcbd1891e7fb974c7bc8eb148eec0e1aa1168497ff1ab1d25dec2db39e1588db Dec 05 22:54:51 crc kubenswrapper[4747]: I1205 22:54:51.328129 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 22:54:51 crc kubenswrapper[4747]: I1205 22:54:51.329199 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-5z757"] Dec 05 22:54:51 crc kubenswrapper[4747]: I1205 22:54:51.840003 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:54:51 crc kubenswrapper[4747]: E1205 22:54:51.840413 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:54:52 crc kubenswrapper[4747]: I1205 22:54:52.323677 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-5z757" event={"ID":"1a42be6c-26a1-4081-8df5-9b5ee5d45262","Type":"ContainerStarted","Data":"0b904178130b4c9272a68bc3a218c840d9a829a86a774a7ba6d220165769828f"} Dec 05 22:54:52 crc kubenswrapper[4747]: I1205 22:54:52.324089 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-5z757" event={"ID":"1a42be6c-26a1-4081-8df5-9b5ee5d45262","Type":"ContainerStarted","Data":"bcbd1891e7fb974c7bc8eb148eec0e1aa1168497ff1ab1d25dec2db39e1588db"} Dec 05 22:54:52 crc kubenswrapper[4747]: I1205 22:54:52.350990 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-5z757" podStartSLOduration=1.882706934 podStartE2EDuration="2.350968325s" podCreationTimestamp="2025-12-05 22:54:50 +0000 UTC" firstStartedPulling="2025-12-05 22:54:51.327899883 +0000 UTC m=+7961.795207371" lastFinishedPulling="2025-12-05 22:54:51.796161264 +0000 UTC m=+7962.263468762" observedRunningTime="2025-12-05 22:54:52.344228772 +0000 UTC m=+7962.811536290" watchObservedRunningTime="2025-12-05 22:54:52.350968325 +0000 UTC m=+7962.818275823" Dec 05 22:55:06 crc kubenswrapper[4747]: I1205 22:55:06.840198 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:55:06 crc kubenswrapper[4747]: E1205 22:55:06.841119 4747 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:55:17 crc kubenswrapper[4747]: I1205 22:55:17.839782 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:55:17 crc kubenswrapper[4747]: E1205 22:55:17.840513 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:55:32 crc kubenswrapper[4747]: I1205 22:55:32.839940 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:55:32 crc kubenswrapper[4747]: E1205 22:55:32.841031 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:55:45 crc kubenswrapper[4747]: I1205 22:55:45.841727 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:55:45 crc kubenswrapper[4747]: E1205 22:55:45.845621 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:56:00 crc kubenswrapper[4747]: I1205 22:56:00.840840 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:56:00 crc kubenswrapper[4747]: E1205 22:56:00.841978 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:56:12 crc kubenswrapper[4747]: I1205 22:56:12.840105 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:56:12 crc kubenswrapper[4747]: E1205 22:56:12.840926 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:56:27 crc kubenswrapper[4747]: I1205 22:56:27.841372 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:56:27 crc kubenswrapper[4747]: E1205 22:56:27.842140 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 22:56:42 crc kubenswrapper[4747]: I1205 22:56:42.840013 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 22:56:43 crc kubenswrapper[4747]: I1205 22:56:43.680525 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"33d0eb4a0ade5e2fba6f646419e1773ce8bfc6d315803141dfef56a51338e0fc"} Dec 05 22:57:20 crc kubenswrapper[4747]: I1205 22:57:20.895346 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cnfnz"] Dec 05 22:57:20 crc kubenswrapper[4747]: I1205 22:57:20.898677 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:20 crc kubenswrapper[4747]: I1205 22:57:20.912632 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cnfnz"] Dec 05 22:57:20 crc kubenswrapper[4747]: I1205 22:57:20.965538 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7mml\" (UniqueName: \"kubernetes.io/projected/7cee91b2-5208-4643-bfff-fa66cbf8c113-kube-api-access-l7mml\") pod \"community-operators-cnfnz\" (UID: \"7cee91b2-5208-4643-bfff-fa66cbf8c113\") " pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:20 crc kubenswrapper[4747]: I1205 22:57:20.965708 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cee91b2-5208-4643-bfff-fa66cbf8c113-catalog-content\") pod \"community-operators-cnfnz\" (UID: \"7cee91b2-5208-4643-bfff-fa66cbf8c113\") " pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:20 crc kubenswrapper[4747]: I1205 22:57:20.965739 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cee91b2-5208-4643-bfff-fa66cbf8c113-utilities\") pod \"community-operators-cnfnz\" (UID: \"7cee91b2-5208-4643-bfff-fa66cbf8c113\") " pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:21 crc kubenswrapper[4747]: I1205 22:57:21.067830 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cee91b2-5208-4643-bfff-fa66cbf8c113-catalog-content\") pod \"community-operators-cnfnz\" (UID: \"7cee91b2-5208-4643-bfff-fa66cbf8c113\") " 
pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:21 crc kubenswrapper[4747]: I1205 22:57:21.067879 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cee91b2-5208-4643-bfff-fa66cbf8c113-utilities\") pod \"community-operators-cnfnz\" (UID: \"7cee91b2-5208-4643-bfff-fa66cbf8c113\") " pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:21 crc kubenswrapper[4747]: I1205 22:57:21.068066 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7mml\" (UniqueName: \"kubernetes.io/projected/7cee91b2-5208-4643-bfff-fa66cbf8c113-kube-api-access-l7mml\") pod \"community-operators-cnfnz\" (UID: \"7cee91b2-5208-4643-bfff-fa66cbf8c113\") " pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:21 crc kubenswrapper[4747]: I1205 22:57:21.068540 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cee91b2-5208-4643-bfff-fa66cbf8c113-catalog-content\") pod \"community-operators-cnfnz\" (UID: \"7cee91b2-5208-4643-bfff-fa66cbf8c113\") " pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:21 crc kubenswrapper[4747]: I1205 22:57:21.068696 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cee91b2-5208-4643-bfff-fa66cbf8c113-utilities\") pod \"community-operators-cnfnz\" (UID: \"7cee91b2-5208-4643-bfff-fa66cbf8c113\") " pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:21 crc kubenswrapper[4747]: I1205 22:57:21.093739 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7mml\" (UniqueName: \"kubernetes.io/projected/7cee91b2-5208-4643-bfff-fa66cbf8c113-kube-api-access-l7mml\") pod \"community-operators-cnfnz\" (UID: \"7cee91b2-5208-4643-bfff-fa66cbf8c113\") " pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:21 crc kubenswrapper[4747]: I1205 22:57:21.228714 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:21 crc kubenswrapper[4747]: I1205 22:57:21.628676 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cnfnz"] Dec 05 22:57:22 crc kubenswrapper[4747]: I1205 22:57:22.120424 4747 generic.go:334] "Generic (PLEG): container finished" podID="7cee91b2-5208-4643-bfff-fa66cbf8c113" containerID="b61aa529b66fac450d0058952cb7f914121de1e125ea2a76cfd2d3d97120a239" exitCode=0 Dec 05 22:57:22 crc kubenswrapper[4747]: I1205 22:57:22.120497 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnfnz" event={"ID":"7cee91b2-5208-4643-bfff-fa66cbf8c113","Type":"ContainerDied","Data":"b61aa529b66fac450d0058952cb7f914121de1e125ea2a76cfd2d3d97120a239"} Dec 05 22:57:22 crc kubenswrapper[4747]: I1205 22:57:22.120854 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnfnz" event={"ID":"7cee91b2-5208-4643-bfff-fa66cbf8c113","Type":"ContainerStarted","Data":"3637bebcb573322245374ef4e96195470e94ac8780fff3c6db70994239001ab7"} Dec 05 22:57:24 crc kubenswrapper[4747]: I1205 22:57:24.161382 4747 generic.go:334] "Generic (PLEG): container finished" podID="7cee91b2-5208-4643-bfff-fa66cbf8c113" containerID="f5e500af1b5ab4115b291c7e08d4ad115680ceb2d03503cd012c0694259eb3ae" exitCode=0 Dec 05 22:57:24 crc kubenswrapper[4747]: I1205 22:57:24.161470 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnfnz" event={"ID":"7cee91b2-5208-4643-bfff-fa66cbf8c113","Type":"ContainerDied","Data":"f5e500af1b5ab4115b291c7e08d4ad115680ceb2d03503cd012c0694259eb3ae"} Dec 05 22:57:25 crc kubenswrapper[4747]: I1205 22:57:25.180150 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnfnz" event={"ID":"7cee91b2-5208-4643-bfff-fa66cbf8c113","Type":"ContainerStarted","Data":"de09d80814658bd0857094975fe19cda6ff508ccd17b74c1fbca976b7b933f84"} Dec 05 22:57:25 crc kubenswrapper[4747]: I1205 22:57:25.214649 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cnfnz" podStartSLOduration=2.772744627 podStartE2EDuration="5.214626549s" podCreationTimestamp="2025-12-05 22:57:20 +0000 UTC" firstStartedPulling="2025-12-05 22:57:22.122019081 +0000 UTC m=+8112.589326569" lastFinishedPulling="2025-12-05 22:57:24.563900983 +0000 UTC m=+8115.031208491" observedRunningTime="2025-12-05 22:57:25.207108758 +0000 UTC m=+8115.674416256" watchObservedRunningTime="2025-12-05 22:57:25.214626549 +0000 UTC m=+8115.681934047" Dec 05 22:57:31 crc kubenswrapper[4747]: I1205 22:57:31.229061 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:31 crc kubenswrapper[4747]: I1205 22:57:31.229649 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:31 crc kubenswrapper[4747]: I1205 22:57:31.284184 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:31 crc kubenswrapper[4747]: I1205 22:57:31.334972 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:31 crc kubenswrapper[4747]: I1205 22:57:31.529799 4747 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-cnfnz"] Dec 05 22:57:33 crc kubenswrapper[4747]: I1205 22:57:33.262996 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cnfnz" podUID="7cee91b2-5208-4643-bfff-fa66cbf8c113" containerName="registry-server" containerID="cri-o://de09d80814658bd0857094975fe19cda6ff508ccd17b74c1fbca976b7b933f84" gracePeriod=2 Dec 05 22:57:33 crc kubenswrapper[4747]: I1205 22:57:33.868573 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.058776 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7mml\" (UniqueName: \"kubernetes.io/projected/7cee91b2-5208-4643-bfff-fa66cbf8c113-kube-api-access-l7mml\") pod \"7cee91b2-5208-4643-bfff-fa66cbf8c113\" (UID: \"7cee91b2-5208-4643-bfff-fa66cbf8c113\") " Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.059218 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cee91b2-5208-4643-bfff-fa66cbf8c113-utilities\") pod \"7cee91b2-5208-4643-bfff-fa66cbf8c113\" (UID: \"7cee91b2-5208-4643-bfff-fa66cbf8c113\") " Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.059461 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cee91b2-5208-4643-bfff-fa66cbf8c113-catalog-content\") pod \"7cee91b2-5208-4643-bfff-fa66cbf8c113\" (UID: \"7cee91b2-5208-4643-bfff-fa66cbf8c113\") " Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.060006 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cee91b2-5208-4643-bfff-fa66cbf8c113-utilities" (OuterVolumeSpecName: "utilities") pod "7cee91b2-5208-4643-bfff-fa66cbf8c113" (UID: "7cee91b2-5208-4643-bfff-fa66cbf8c113"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.066832 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cee91b2-5208-4643-bfff-fa66cbf8c113-kube-api-access-l7mml" (OuterVolumeSpecName: "kube-api-access-l7mml") pod "7cee91b2-5208-4643-bfff-fa66cbf8c113" (UID: "7cee91b2-5208-4643-bfff-fa66cbf8c113"). InnerVolumeSpecName "kube-api-access-l7mml". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.120488 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cee91b2-5208-4643-bfff-fa66cbf8c113-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7cee91b2-5208-4643-bfff-fa66cbf8c113" (UID: "7cee91b2-5208-4643-bfff-fa66cbf8c113"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.164253 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cee91b2-5208-4643-bfff-fa66cbf8c113-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.164290 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cee91b2-5208-4643-bfff-fa66cbf8c113-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.164303 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7mml\" (UniqueName: \"kubernetes.io/projected/7cee91b2-5208-4643-bfff-fa66cbf8c113-kube-api-access-l7mml\") on node \"crc\" DevicePath \"\"" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.273342 4747 generic.go:334] "Generic (PLEG): container finished" podID="7cee91b2-5208-4643-bfff-fa66cbf8c113" containerID="de09d80814658bd0857094975fe19cda6ff508ccd17b74c1fbca976b7b933f84" exitCode=0 Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.273397 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnfnz" event={"ID":"7cee91b2-5208-4643-bfff-fa66cbf8c113","Type":"ContainerDied","Data":"de09d80814658bd0857094975fe19cda6ff508ccd17b74c1fbca976b7b933f84"} Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.273427 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnfnz" event={"ID":"7cee91b2-5208-4643-bfff-fa66cbf8c113","Type":"ContainerDied","Data":"3637bebcb573322245374ef4e96195470e94ac8780fff3c6db70994239001ab7"} Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.273434 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cnfnz" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.273448 4747 scope.go:117] "RemoveContainer" containerID="de09d80814658bd0857094975fe19cda6ff508ccd17b74c1fbca976b7b933f84" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.303106 4747 scope.go:117] "RemoveContainer" containerID="f5e500af1b5ab4115b291c7e08d4ad115680ceb2d03503cd012c0694259eb3ae" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.311859 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cnfnz"] Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.322184 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cnfnz"] Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.328105 4747 scope.go:117] "RemoveContainer" containerID="b61aa529b66fac450d0058952cb7f914121de1e125ea2a76cfd2d3d97120a239" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.384301 4747 scope.go:117] "RemoveContainer" containerID="de09d80814658bd0857094975fe19cda6ff508ccd17b74c1fbca976b7b933f84" Dec 05 22:57:34 crc kubenswrapper[4747]: E1205 22:57:34.385532 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de09d80814658bd0857094975fe19cda6ff508ccd17b74c1fbca976b7b933f84\": container with ID starting with de09d80814658bd0857094975fe19cda6ff508ccd17b74c1fbca976b7b933f84 not found: ID does not exist" containerID="de09d80814658bd0857094975fe19cda6ff508ccd17b74c1fbca976b7b933f84" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.385569 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de09d80814658bd0857094975fe19cda6ff508ccd17b74c1fbca976b7b933f84"} err="failed to get container status \"de09d80814658bd0857094975fe19cda6ff508ccd17b74c1fbca976b7b933f84\": rpc error: code = NotFound desc = could not find container \"de09d80814658bd0857094975fe19cda6ff508ccd17b74c1fbca976b7b933f84\": container with ID starting with de09d80814658bd0857094975fe19cda6ff508ccd17b74c1fbca976b7b933f84 not found: ID does not exist" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.385610 4747 scope.go:117] "RemoveContainer" containerID="f5e500af1b5ab4115b291c7e08d4ad115680ceb2d03503cd012c0694259eb3ae" Dec 05 22:57:34 crc kubenswrapper[4747]: E1205 22:57:34.385865 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5e500af1b5ab4115b291c7e08d4ad115680ceb2d03503cd012c0694259eb3ae\": container with ID starting with f5e500af1b5ab4115b291c7e08d4ad115680ceb2d03503cd012c0694259eb3ae not found: ID does not exist" containerID="f5e500af1b5ab4115b291c7e08d4ad115680ceb2d03503cd012c0694259eb3ae" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.385894 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5e500af1b5ab4115b291c7e08d4ad115680ceb2d03503cd012c0694259eb3ae"} err="failed to get container status \"f5e500af1b5ab4115b291c7e08d4ad115680ceb2d03503cd012c0694259eb3ae\": rpc error: code = NotFound desc = could not find container \"f5e500af1b5ab4115b291c7e08d4ad115680ceb2d03503cd012c0694259eb3ae\": container with ID starting with f5e500af1b5ab4115b291c7e08d4ad115680ceb2d03503cd012c0694259eb3ae not found: ID does not exist" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.385911 4747 scope.go:117] "RemoveContainer" 
containerID="b61aa529b66fac450d0058952cb7f914121de1e125ea2a76cfd2d3d97120a239" Dec 05 22:57:34 crc kubenswrapper[4747]: E1205 22:57:34.386120 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b61aa529b66fac450d0058952cb7f914121de1e125ea2a76cfd2d3d97120a239\": container with ID starting with b61aa529b66fac450d0058952cb7f914121de1e125ea2a76cfd2d3d97120a239 not found: ID does not exist" containerID="b61aa529b66fac450d0058952cb7f914121de1e125ea2a76cfd2d3d97120a239" Dec 05 22:57:34 crc kubenswrapper[4747]: I1205 22:57:34.386148 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61aa529b66fac450d0058952cb7f914121de1e125ea2a76cfd2d3d97120a239"} err="failed to get container status \"b61aa529b66fac450d0058952cb7f914121de1e125ea2a76cfd2d3d97120a239\": rpc error: code = NotFound desc = could not find container \"b61aa529b66fac450d0058952cb7f914121de1e125ea2a76cfd2d3d97120a239\": container with ID starting with b61aa529b66fac450d0058952cb7f914121de1e125ea2a76cfd2d3d97120a239 not found: ID does not exist" Dec 05 22:57:35 crc kubenswrapper[4747]: I1205 22:57:35.868864 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cee91b2-5208-4643-bfff-fa66cbf8c113" path="/var/lib/kubelet/pods/7cee91b2-5208-4643-bfff-fa66cbf8c113/volumes" Dec 05 22:57:38 crc kubenswrapper[4747]: I1205 22:57:38.368978 4747 scope.go:117] "RemoveContainer" containerID="e89ce65890bef5b48387b8b029edf66c27acd1c8b466786a06c541a537dc66c0" Dec 05 22:57:38 crc kubenswrapper[4747]: I1205 22:57:38.396007 4747 scope.go:117] "RemoveContainer" containerID="ad953ea048ec2b2126ea78cc592817b2e534e6d320274a022381b6930e0635c7" Dec 05 22:57:38 crc kubenswrapper[4747]: I1205 22:57:38.431410 4747 scope.go:117] "RemoveContainer" containerID="d943d5570cb1f634c3514caf2ddeefcfa4f1dd9cfd0d8b577f2fc385a557b479" Dec 05 22:57:46 crc kubenswrapper[4747]: I1205 22:57:46.950177 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4qjtq"] Dec 05 22:57:46 crc kubenswrapper[4747]: E1205 22:57:46.951230 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cee91b2-5208-4643-bfff-fa66cbf8c113" containerName="extract-utilities" Dec 05 22:57:46 crc kubenswrapper[4747]: I1205 22:57:46.951246 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cee91b2-5208-4643-bfff-fa66cbf8c113" containerName="extract-utilities" Dec 05 22:57:46 crc kubenswrapper[4747]: E1205 22:57:46.951299 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cee91b2-5208-4643-bfff-fa66cbf8c113" containerName="extract-content" Dec 05 22:57:46 crc kubenswrapper[4747]: I1205 22:57:46.951309 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cee91b2-5208-4643-bfff-fa66cbf8c113" containerName="extract-content" Dec 05 22:57:46 crc kubenswrapper[4747]: E1205 22:57:46.951323 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cee91b2-5208-4643-bfff-fa66cbf8c113" containerName="registry-server" Dec 05 22:57:46 crc kubenswrapper[4747]: I1205 22:57:46.951332 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cee91b2-5208-4643-bfff-fa66cbf8c113" containerName="registry-server" Dec 05 22:57:46 crc kubenswrapper[4747]: I1205 22:57:46.951612 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cee91b2-5208-4643-bfff-fa66cbf8c113" containerName="registry-server" Dec 05 22:57:46 crc kubenswrapper[4747]: I1205 22:57:46.953468 
4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:57:46 crc kubenswrapper[4747]: I1205 22:57:46.962163 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qjtq"] Dec 05 22:57:47 crc kubenswrapper[4747]: I1205 22:57:47.065960 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7k8l\" (UniqueName: \"kubernetes.io/projected/89cf8109-40ad-4844-9540-b82db1b37c7a-kube-api-access-v7k8l\") pod \"redhat-marketplace-4qjtq\" (UID: \"89cf8109-40ad-4844-9540-b82db1b37c7a\") " pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:57:47 crc kubenswrapper[4747]: I1205 22:57:47.066144 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89cf8109-40ad-4844-9540-b82db1b37c7a-catalog-content\") pod \"redhat-marketplace-4qjtq\" (UID: \"89cf8109-40ad-4844-9540-b82db1b37c7a\") " pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:57:47 crc kubenswrapper[4747]: I1205 22:57:47.066179 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89cf8109-40ad-4844-9540-b82db1b37c7a-utilities\") pod \"redhat-marketplace-4qjtq\" (UID: \"89cf8109-40ad-4844-9540-b82db1b37c7a\") " pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:57:47 crc kubenswrapper[4747]: I1205 22:57:47.168119 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7k8l\" (UniqueName: \"kubernetes.io/projected/89cf8109-40ad-4844-9540-b82db1b37c7a-kube-api-access-v7k8l\") pod \"redhat-marketplace-4qjtq\" (UID: \"89cf8109-40ad-4844-9540-b82db1b37c7a\") " pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:57:47 crc kubenswrapper[4747]: I1205 22:57:47.168234 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89cf8109-40ad-4844-9540-b82db1b37c7a-catalog-content\") pod \"redhat-marketplace-4qjtq\" (UID: \"89cf8109-40ad-4844-9540-b82db1b37c7a\") " pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:57:47 crc kubenswrapper[4747]: I1205 22:57:47.168260 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89cf8109-40ad-4844-9540-b82db1b37c7a-utilities\") pod \"redhat-marketplace-4qjtq\" (UID: \"89cf8109-40ad-4844-9540-b82db1b37c7a\") " pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:57:47 crc kubenswrapper[4747]: I1205 22:57:47.168688 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89cf8109-40ad-4844-9540-b82db1b37c7a-utilities\") pod \"redhat-marketplace-4qjtq\" (UID: \"89cf8109-40ad-4844-9540-b82db1b37c7a\") " pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:57:47 crc kubenswrapper[4747]: I1205 22:57:47.168713 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89cf8109-40ad-4844-9540-b82db1b37c7a-catalog-content\") pod \"redhat-marketplace-4qjtq\" (UID: \"89cf8109-40ad-4844-9540-b82db1b37c7a\") " pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:57:47 crc kubenswrapper[4747]: I1205 22:57:47.187869 
4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7k8l\" (UniqueName: \"kubernetes.io/projected/89cf8109-40ad-4844-9540-b82db1b37c7a-kube-api-access-v7k8l\") pod \"redhat-marketplace-4qjtq\" (UID: \"89cf8109-40ad-4844-9540-b82db1b37c7a\") " pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:57:47 crc kubenswrapper[4747]: I1205 22:57:47.274317 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:57:47 crc kubenswrapper[4747]: I1205 22:57:47.764418 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qjtq"] Dec 05 22:57:48 crc kubenswrapper[4747]: I1205 22:57:48.463734 4747 generic.go:334] "Generic (PLEG): container finished" podID="89cf8109-40ad-4844-9540-b82db1b37c7a" containerID="e4ec7eb8771314af2828648fba39363d747f670140a3fc305c75395275ce4e0b" exitCode=0 Dec 05 22:57:48 crc kubenswrapper[4747]: I1205 22:57:48.463832 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qjtq" event={"ID":"89cf8109-40ad-4844-9540-b82db1b37c7a","Type":"ContainerDied","Data":"e4ec7eb8771314af2828648fba39363d747f670140a3fc305c75395275ce4e0b"} Dec 05 22:57:48 crc kubenswrapper[4747]: I1205 22:57:48.464262 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qjtq" event={"ID":"89cf8109-40ad-4844-9540-b82db1b37c7a","Type":"ContainerStarted","Data":"29c20071ac0654dd3038b01b95a623bf021323da3749e3edaeee2278db38d3f6"} Dec 05 22:57:49 crc kubenswrapper[4747]: I1205 22:57:49.473496 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qjtq" event={"ID":"89cf8109-40ad-4844-9540-b82db1b37c7a","Type":"ContainerStarted","Data":"469d2f6d4c661f2c88e28d9f186515e3b5879d9266c3ea877f0ad3fd7535a2fe"} Dec 05 22:57:50 crc kubenswrapper[4747]: I1205 22:57:50.497430 4747 generic.go:334] "Generic (PLEG): container finished" podID="89cf8109-40ad-4844-9540-b82db1b37c7a" containerID="469d2f6d4c661f2c88e28d9f186515e3b5879d9266c3ea877f0ad3fd7535a2fe" exitCode=0 Dec 05 22:57:50 crc kubenswrapper[4747]: I1205 22:57:50.497519 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qjtq" event={"ID":"89cf8109-40ad-4844-9540-b82db1b37c7a","Type":"ContainerDied","Data":"469d2f6d4c661f2c88e28d9f186515e3b5879d9266c3ea877f0ad3fd7535a2fe"} Dec 05 22:57:51 crc kubenswrapper[4747]: I1205 22:57:51.511326 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qjtq" event={"ID":"89cf8109-40ad-4844-9540-b82db1b37c7a","Type":"ContainerStarted","Data":"7208b58e555fe8828dff9c938096e9a97980d7a4283f17c33553946500e83a66"} Dec 05 22:57:51 crc kubenswrapper[4747]: I1205 22:57:51.545747 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4qjtq" podStartSLOduration=3.073785876 podStartE2EDuration="5.545726533s" podCreationTimestamp="2025-12-05 22:57:46 +0000 UTC" firstStartedPulling="2025-12-05 22:57:48.466787504 +0000 UTC m=+8138.934095002" lastFinishedPulling="2025-12-05 22:57:50.938728161 +0000 UTC m=+8141.406035659" observedRunningTime="2025-12-05 22:57:51.532798881 +0000 UTC m=+8142.000106419" watchObservedRunningTime="2025-12-05 22:57:51.545726533 +0000 UTC m=+8142.013034031" Dec 05 22:57:55 crc kubenswrapper[4747]: I1205 22:57:55.765670 4747 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-nzgwh"] Dec 05 22:57:55 crc kubenswrapper[4747]: I1205 22:57:55.770440 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:57:55 crc kubenswrapper[4747]: I1205 22:57:55.775328 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nzgwh"] Dec 05 22:57:55 crc kubenswrapper[4747]: I1205 22:57:55.871738 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5577229-374f-4928-ad1e-5758ee08d537-utilities\") pod \"redhat-operators-nzgwh\" (UID: \"e5577229-374f-4928-ad1e-5758ee08d537\") " pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:57:55 crc kubenswrapper[4747]: I1205 22:57:55.871812 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5577229-374f-4928-ad1e-5758ee08d537-catalog-content\") pod \"redhat-operators-nzgwh\" (UID: \"e5577229-374f-4928-ad1e-5758ee08d537\") " pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:57:55 crc kubenswrapper[4747]: I1205 22:57:55.871885 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndtrt\" (UniqueName: \"kubernetes.io/projected/e5577229-374f-4928-ad1e-5758ee08d537-kube-api-access-ndtrt\") pod \"redhat-operators-nzgwh\" (UID: \"e5577229-374f-4928-ad1e-5758ee08d537\") " pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:57:55 crc kubenswrapper[4747]: I1205 22:57:55.973525 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndtrt\" (UniqueName: \"kubernetes.io/projected/e5577229-374f-4928-ad1e-5758ee08d537-kube-api-access-ndtrt\") pod \"redhat-operators-nzgwh\" (UID: \"e5577229-374f-4928-ad1e-5758ee08d537\") " pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:57:55 crc kubenswrapper[4747]: I1205 22:57:55.973974 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5577229-374f-4928-ad1e-5758ee08d537-utilities\") pod \"redhat-operators-nzgwh\" (UID: \"e5577229-374f-4928-ad1e-5758ee08d537\") " pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:57:55 crc kubenswrapper[4747]: I1205 22:57:55.974123 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5577229-374f-4928-ad1e-5758ee08d537-catalog-content\") pod \"redhat-operators-nzgwh\" (UID: \"e5577229-374f-4928-ad1e-5758ee08d537\") " pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:57:55 crc kubenswrapper[4747]: I1205 22:57:55.974566 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5577229-374f-4928-ad1e-5758ee08d537-utilities\") pod \"redhat-operators-nzgwh\" (UID: \"e5577229-374f-4928-ad1e-5758ee08d537\") " pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:57:55 crc kubenswrapper[4747]: I1205 22:57:55.974689 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5577229-374f-4928-ad1e-5758ee08d537-catalog-content\") pod \"redhat-operators-nzgwh\" (UID: \"e5577229-374f-4928-ad1e-5758ee08d537\") " 
pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:57:55 crc kubenswrapper[4747]: I1205 22:57:55.997792 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndtrt\" (UniqueName: \"kubernetes.io/projected/e5577229-374f-4928-ad1e-5758ee08d537-kube-api-access-ndtrt\") pod \"redhat-operators-nzgwh\" (UID: \"e5577229-374f-4928-ad1e-5758ee08d537\") " pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:57:56 crc kubenswrapper[4747]: I1205 22:57:56.098226 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:57:56 crc kubenswrapper[4747]: I1205 22:57:56.602732 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nzgwh"] Dec 05 22:57:56 crc kubenswrapper[4747]: W1205 22:57:56.604913 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5577229_374f_4928_ad1e_5758ee08d537.slice/crio-af792bfa76bbb6d1e6a641fdcb3050c0954389fcc2e78e2fb76413509b98c91b WatchSource:0}: Error finding container af792bfa76bbb6d1e6a641fdcb3050c0954389fcc2e78e2fb76413509b98c91b: Status 404 returned error can't find the container with id af792bfa76bbb6d1e6a641fdcb3050c0954389fcc2e78e2fb76413509b98c91b Dec 05 22:57:57 crc kubenswrapper[4747]: I1205 22:57:57.274523 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:57:57 crc kubenswrapper[4747]: I1205 22:57:57.274888 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:57:57 crc kubenswrapper[4747]: I1205 22:57:57.329048 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:57:57 crc kubenswrapper[4747]: I1205 22:57:57.578746 4747 generic.go:334] "Generic (PLEG): container finished" podID="e5577229-374f-4928-ad1e-5758ee08d537" containerID="0e3ac364612ee2b60c3d5270db32d39a59c70076200363521021bab076f5a957" exitCode=0 Dec 05 22:57:57 crc kubenswrapper[4747]: I1205 22:57:57.578813 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzgwh" event={"ID":"e5577229-374f-4928-ad1e-5758ee08d537","Type":"ContainerDied","Data":"0e3ac364612ee2b60c3d5270db32d39a59c70076200363521021bab076f5a957"} Dec 05 22:57:57 crc kubenswrapper[4747]: I1205 22:57:57.579166 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzgwh" event={"ID":"e5577229-374f-4928-ad1e-5758ee08d537","Type":"ContainerStarted","Data":"af792bfa76bbb6d1e6a641fdcb3050c0954389fcc2e78e2fb76413509b98c91b"} Dec 05 22:57:57 crc kubenswrapper[4747]: I1205 22:57:57.673750 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:57:58 crc kubenswrapper[4747]: I1205 22:57:58.341909 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qjtq"] Dec 05 22:57:58 crc kubenswrapper[4747]: I1205 22:57:58.594478 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzgwh" event={"ID":"e5577229-374f-4928-ad1e-5758ee08d537","Type":"ContainerStarted","Data":"ef5426004702dda349258eee2ab9cdde53facf2136203a3b6a51668ef709d4bb"} Dec 05 22:57:59 crc kubenswrapper[4747]: I1205 22:57:59.602162 
4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4qjtq" podUID="89cf8109-40ad-4844-9540-b82db1b37c7a" containerName="registry-server" containerID="cri-o://7208b58e555fe8828dff9c938096e9a97980d7a4283f17c33553946500e83a66" gracePeriod=2 Dec 05 22:58:00 crc kubenswrapper[4747]: I1205 22:58:00.631152 4747 generic.go:334] "Generic (PLEG): container finished" podID="89cf8109-40ad-4844-9540-b82db1b37c7a" containerID="7208b58e555fe8828dff9c938096e9a97980d7a4283f17c33553946500e83a66" exitCode=0 Dec 05 22:58:00 crc kubenswrapper[4747]: I1205 22:58:00.631264 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qjtq" event={"ID":"89cf8109-40ad-4844-9540-b82db1b37c7a","Type":"ContainerDied","Data":"7208b58e555fe8828dff9c938096e9a97980d7a4283f17c33553946500e83a66"} Dec 05 22:58:00 crc kubenswrapper[4747]: I1205 22:58:00.767432 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:58:00 crc kubenswrapper[4747]: I1205 22:58:00.852018 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-6f55995b6-9n2p4" podUID="ca9730ab-ce9c-4f56-a81e-14c78ac858ab" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 05 22:58:00 crc kubenswrapper[4747]: I1205 22:58:00.892437 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89cf8109-40ad-4844-9540-b82db1b37c7a-utilities\") pod \"89cf8109-40ad-4844-9540-b82db1b37c7a\" (UID: \"89cf8109-40ad-4844-9540-b82db1b37c7a\") " Dec 05 22:58:00 crc kubenswrapper[4747]: I1205 22:58:00.892539 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89cf8109-40ad-4844-9540-b82db1b37c7a-catalog-content\") pod \"89cf8109-40ad-4844-9540-b82db1b37c7a\" (UID: \"89cf8109-40ad-4844-9540-b82db1b37c7a\") " Dec 05 22:58:00 crc kubenswrapper[4747]: I1205 22:58:00.892700 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7k8l\" (UniqueName: \"kubernetes.io/projected/89cf8109-40ad-4844-9540-b82db1b37c7a-kube-api-access-v7k8l\") pod \"89cf8109-40ad-4844-9540-b82db1b37c7a\" (UID: \"89cf8109-40ad-4844-9540-b82db1b37c7a\") " Dec 05 22:58:00 crc kubenswrapper[4747]: I1205 22:58:00.893837 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89cf8109-40ad-4844-9540-b82db1b37c7a-utilities" (OuterVolumeSpecName: "utilities") pod "89cf8109-40ad-4844-9540-b82db1b37c7a" (UID: "89cf8109-40ad-4844-9540-b82db1b37c7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:58:00 crc kubenswrapper[4747]: I1205 22:58:00.904781 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89cf8109-40ad-4844-9540-b82db1b37c7a-kube-api-access-v7k8l" (OuterVolumeSpecName: "kube-api-access-v7k8l") pod "89cf8109-40ad-4844-9540-b82db1b37c7a" (UID: "89cf8109-40ad-4844-9540-b82db1b37c7a"). InnerVolumeSpecName "kube-api-access-v7k8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:58:00 crc kubenswrapper[4747]: I1205 22:58:00.910838 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89cf8109-40ad-4844-9540-b82db1b37c7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89cf8109-40ad-4844-9540-b82db1b37c7a" (UID: "89cf8109-40ad-4844-9540-b82db1b37c7a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:58:00 crc kubenswrapper[4747]: I1205 22:58:00.996273 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7k8l\" (UniqueName: \"kubernetes.io/projected/89cf8109-40ad-4844-9540-b82db1b37c7a-kube-api-access-v7k8l\") on node \"crc\" DevicePath \"\"" Dec 05 22:58:00 crc kubenswrapper[4747]: I1205 22:58:00.996336 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89cf8109-40ad-4844-9540-b82db1b37c7a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:58:00 crc kubenswrapper[4747]: I1205 22:58:00.996357 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89cf8109-40ad-4844-9540-b82db1b37c7a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:58:01 crc kubenswrapper[4747]: I1205 22:58:01.647393 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qjtq" Dec 05 22:58:01 crc kubenswrapper[4747]: I1205 22:58:01.647412 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qjtq" event={"ID":"89cf8109-40ad-4844-9540-b82db1b37c7a","Type":"ContainerDied","Data":"29c20071ac0654dd3038b01b95a623bf021323da3749e3edaeee2278db38d3f6"} Dec 05 22:58:01 crc kubenswrapper[4747]: I1205 22:58:01.648031 4747 scope.go:117] "RemoveContainer" containerID="7208b58e555fe8828dff9c938096e9a97980d7a4283f17c33553946500e83a66" Dec 05 22:58:01 crc kubenswrapper[4747]: I1205 22:58:01.651309 4747 generic.go:334] "Generic (PLEG): container finished" podID="e5577229-374f-4928-ad1e-5758ee08d537" containerID="ef5426004702dda349258eee2ab9cdde53facf2136203a3b6a51668ef709d4bb" exitCode=0 Dec 05 22:58:01 crc kubenswrapper[4747]: I1205 22:58:01.651361 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzgwh" event={"ID":"e5577229-374f-4928-ad1e-5758ee08d537","Type":"ContainerDied","Data":"ef5426004702dda349258eee2ab9cdde53facf2136203a3b6a51668ef709d4bb"} Dec 05 22:58:01 crc kubenswrapper[4747]: I1205 22:58:01.678629 4747 scope.go:117] "RemoveContainer" containerID="469d2f6d4c661f2c88e28d9f186515e3b5879d9266c3ea877f0ad3fd7535a2fe" Dec 05 22:58:01 crc kubenswrapper[4747]: I1205 22:58:01.707685 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qjtq"] Dec 05 22:58:01 crc kubenswrapper[4747]: I1205 22:58:01.718114 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qjtq"] Dec 05 22:58:01 crc kubenswrapper[4747]: I1205 22:58:01.732597 4747 scope.go:117] "RemoveContainer" containerID="e4ec7eb8771314af2828648fba39363d747f670140a3fc305c75395275ce4e0b" Dec 05 22:58:01 crc kubenswrapper[4747]: I1205 22:58:01.855869 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89cf8109-40ad-4844-9540-b82db1b37c7a" path="/var/lib/kubelet/pods/89cf8109-40ad-4844-9540-b82db1b37c7a/volumes" Dec 05 22:58:02 crc 
kubenswrapper[4747]: I1205 22:58:02.671375 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzgwh" event={"ID":"e5577229-374f-4928-ad1e-5758ee08d537","Type":"ContainerStarted","Data":"098db4aca3ee404d8fe6f5e281660ac32fae2c27676599d4ee3e7cc5c30be7af"} Dec 05 22:58:02 crc kubenswrapper[4747]: I1205 22:58:02.704027 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nzgwh" podStartSLOduration=3.210807033 podStartE2EDuration="7.704008041s" podCreationTimestamp="2025-12-05 22:57:55 +0000 UTC" firstStartedPulling="2025-12-05 22:57:57.581886061 +0000 UTC m=+8148.049193589" lastFinishedPulling="2025-12-05 22:58:02.075087109 +0000 UTC m=+8152.542394597" observedRunningTime="2025-12-05 22:58:02.694603943 +0000 UTC m=+8153.161911441" watchObservedRunningTime="2025-12-05 22:58:02.704008041 +0000 UTC m=+8153.171315539" Dec 05 22:58:06 crc kubenswrapper[4747]: I1205 22:58:06.098763 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:58:06 crc kubenswrapper[4747]: I1205 22:58:06.099357 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:58:07 crc kubenswrapper[4747]: I1205 22:58:07.169102 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nzgwh" podUID="e5577229-374f-4928-ad1e-5758ee08d537" containerName="registry-server" probeResult="failure" output=< Dec 05 22:58:07 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Dec 05 22:58:07 crc kubenswrapper[4747]: > Dec 05 22:58:16 crc kubenswrapper[4747]: I1205 22:58:16.148198 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:58:16 crc kubenswrapper[4747]: I1205 22:58:16.204029 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:58:16 crc kubenswrapper[4747]: I1205 22:58:16.395374 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nzgwh"] Dec 05 22:58:17 crc kubenswrapper[4747]: I1205 22:58:17.838982 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nzgwh" podUID="e5577229-374f-4928-ad1e-5758ee08d537" containerName="registry-server" containerID="cri-o://098db4aca3ee404d8fe6f5e281660ac32fae2c27676599d4ee3e7cc5c30be7af" gracePeriod=2 Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.318549 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.394163 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndtrt\" (UniqueName: \"kubernetes.io/projected/e5577229-374f-4928-ad1e-5758ee08d537-kube-api-access-ndtrt\") pod \"e5577229-374f-4928-ad1e-5758ee08d537\" (UID: \"e5577229-374f-4928-ad1e-5758ee08d537\") " Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.394328 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5577229-374f-4928-ad1e-5758ee08d537-utilities\") pod \"e5577229-374f-4928-ad1e-5758ee08d537\" (UID: \"e5577229-374f-4928-ad1e-5758ee08d537\") " Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.394496 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5577229-374f-4928-ad1e-5758ee08d537-catalog-content\") pod \"e5577229-374f-4928-ad1e-5758ee08d537\" (UID: \"e5577229-374f-4928-ad1e-5758ee08d537\") " Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.395709 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5577229-374f-4928-ad1e-5758ee08d537-utilities" (OuterVolumeSpecName: "utilities") pod "e5577229-374f-4928-ad1e-5758ee08d537" (UID: "e5577229-374f-4928-ad1e-5758ee08d537"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.405293 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5577229-374f-4928-ad1e-5758ee08d537-kube-api-access-ndtrt" (OuterVolumeSpecName: "kube-api-access-ndtrt") pod "e5577229-374f-4928-ad1e-5758ee08d537" (UID: "e5577229-374f-4928-ad1e-5758ee08d537"). InnerVolumeSpecName "kube-api-access-ndtrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.497467 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndtrt\" (UniqueName: \"kubernetes.io/projected/e5577229-374f-4928-ad1e-5758ee08d537-kube-api-access-ndtrt\") on node \"crc\" DevicePath \"\"" Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.497775 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5577229-374f-4928-ad1e-5758ee08d537-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.539008 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5577229-374f-4928-ad1e-5758ee08d537-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5577229-374f-4928-ad1e-5758ee08d537" (UID: "e5577229-374f-4928-ad1e-5758ee08d537"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.600070 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5577229-374f-4928-ad1e-5758ee08d537-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.850705 4747 generic.go:334] "Generic (PLEG): container finished" podID="e5577229-374f-4928-ad1e-5758ee08d537" containerID="098db4aca3ee404d8fe6f5e281660ac32fae2c27676599d4ee3e7cc5c30be7af" exitCode=0 Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.850757 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzgwh" event={"ID":"e5577229-374f-4928-ad1e-5758ee08d537","Type":"ContainerDied","Data":"098db4aca3ee404d8fe6f5e281660ac32fae2c27676599d4ee3e7cc5c30be7af"} Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.850788 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzgwh" event={"ID":"e5577229-374f-4928-ad1e-5758ee08d537","Type":"ContainerDied","Data":"af792bfa76bbb6d1e6a641fdcb3050c0954389fcc2e78e2fb76413509b98c91b"} Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.850809 4747 scope.go:117] "RemoveContainer" containerID="098db4aca3ee404d8fe6f5e281660ac32fae2c27676599d4ee3e7cc5c30be7af" Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.850966 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzgwh" Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.882414 4747 scope.go:117] "RemoveContainer" containerID="ef5426004702dda349258eee2ab9cdde53facf2136203a3b6a51668ef709d4bb" Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.896727 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nzgwh"] Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.908294 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nzgwh"] Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.923695 4747 scope.go:117] "RemoveContainer" containerID="0e3ac364612ee2b60c3d5270db32d39a59c70076200363521021bab076f5a957" Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.983064 4747 scope.go:117] "RemoveContainer" containerID="098db4aca3ee404d8fe6f5e281660ac32fae2c27676599d4ee3e7cc5c30be7af" Dec 05 22:58:18 crc kubenswrapper[4747]: E1205 22:58:18.983778 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"098db4aca3ee404d8fe6f5e281660ac32fae2c27676599d4ee3e7cc5c30be7af\": container with ID starting with 098db4aca3ee404d8fe6f5e281660ac32fae2c27676599d4ee3e7cc5c30be7af not found: ID does not exist" containerID="098db4aca3ee404d8fe6f5e281660ac32fae2c27676599d4ee3e7cc5c30be7af" Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.983808 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"098db4aca3ee404d8fe6f5e281660ac32fae2c27676599d4ee3e7cc5c30be7af"} err="failed to get container status \"098db4aca3ee404d8fe6f5e281660ac32fae2c27676599d4ee3e7cc5c30be7af\": rpc error: code = NotFound desc = could not find container \"098db4aca3ee404d8fe6f5e281660ac32fae2c27676599d4ee3e7cc5c30be7af\": container with ID starting with 098db4aca3ee404d8fe6f5e281660ac32fae2c27676599d4ee3e7cc5c30be7af not found: ID does not exist" Dec 05 22:58:18 crc 
kubenswrapper[4747]: I1205 22:58:18.983828 4747 scope.go:117] "RemoveContainer" containerID="ef5426004702dda349258eee2ab9cdde53facf2136203a3b6a51668ef709d4bb" Dec 05 22:58:18 crc kubenswrapper[4747]: E1205 22:58:18.984050 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef5426004702dda349258eee2ab9cdde53facf2136203a3b6a51668ef709d4bb\": container with ID starting with ef5426004702dda349258eee2ab9cdde53facf2136203a3b6a51668ef709d4bb not found: ID does not exist" containerID="ef5426004702dda349258eee2ab9cdde53facf2136203a3b6a51668ef709d4bb" Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.984067 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef5426004702dda349258eee2ab9cdde53facf2136203a3b6a51668ef709d4bb"} err="failed to get container status \"ef5426004702dda349258eee2ab9cdde53facf2136203a3b6a51668ef709d4bb\": rpc error: code = NotFound desc = could not find container \"ef5426004702dda349258eee2ab9cdde53facf2136203a3b6a51668ef709d4bb\": container with ID starting with ef5426004702dda349258eee2ab9cdde53facf2136203a3b6a51668ef709d4bb not found: ID does not exist" Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.984084 4747 scope.go:117] "RemoveContainer" containerID="0e3ac364612ee2b60c3d5270db32d39a59c70076200363521021bab076f5a957" Dec 05 22:58:18 crc kubenswrapper[4747]: E1205 22:58:18.984549 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e3ac364612ee2b60c3d5270db32d39a59c70076200363521021bab076f5a957\": container with ID starting with 0e3ac364612ee2b60c3d5270db32d39a59c70076200363521021bab076f5a957 not found: ID does not exist" containerID="0e3ac364612ee2b60c3d5270db32d39a59c70076200363521021bab076f5a957" Dec 05 22:58:18 crc kubenswrapper[4747]: I1205 22:58:18.984575 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e3ac364612ee2b60c3d5270db32d39a59c70076200363521021bab076f5a957"} err="failed to get container status \"0e3ac364612ee2b60c3d5270db32d39a59c70076200363521021bab076f5a957\": rpc error: code = NotFound desc = could not find container \"0e3ac364612ee2b60c3d5270db32d39a59c70076200363521021bab076f5a957\": container with ID starting with 0e3ac364612ee2b60c3d5270db32d39a59c70076200363521021bab076f5a957 not found: ID does not exist" Dec 05 22:58:19 crc kubenswrapper[4747]: I1205 22:58:19.855265 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5577229-374f-4928-ad1e-5758ee08d537" path="/var/lib/kubelet/pods/e5577229-374f-4928-ad1e-5758ee08d537/volumes" Dec 05 22:59:06 crc kubenswrapper[4747]: I1205 22:59:06.222362 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:59:06 crc kubenswrapper[4747]: I1205 22:59:06.223076 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:59:36 crc kubenswrapper[4747]: I1205 22:59:36.221799 4747 patch_prober.go:28] interesting 
pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 22:59:36 crc kubenswrapper[4747]: I1205 22:59:36.222448 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 22:59:36 crc kubenswrapper[4747]: I1205 22:59:36.680781 4747 generic.go:334] "Generic (PLEG): container finished" podID="1a42be6c-26a1-4081-8df5-9b5ee5d45262" containerID="0b904178130b4c9272a68bc3a218c840d9a829a86a774a7ba6d220165769828f" exitCode=0 Dec 05 22:59:36 crc kubenswrapper[4747]: I1205 22:59:36.680943 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-5z757" event={"ID":"1a42be6c-26a1-4081-8df5-9b5ee5d45262","Type":"ContainerDied","Data":"0b904178130b4c9272a68bc3a218c840d9a829a86a774a7ba6d220165769828f"} Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.158333 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.210377 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-libvirt-secret-0\") pod \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.210531 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-inventory\") pod \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.210610 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-libvirt-combined-ca-bundle\") pod \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.210643 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-ssh-key\") pod \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.210681 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df552\" (UniqueName: \"kubernetes.io/projected/1a42be6c-26a1-4081-8df5-9b5ee5d45262-kube-api-access-df552\") pod \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\" (UID: \"1a42be6c-26a1-4081-8df5-9b5ee5d45262\") " Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.216980 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a42be6c-26a1-4081-8df5-9b5ee5d45262-kube-api-access-df552" (OuterVolumeSpecName: "kube-api-access-df552") pod "1a42be6c-26a1-4081-8df5-9b5ee5d45262" (UID: 
"1a42be6c-26a1-4081-8df5-9b5ee5d45262"). InnerVolumeSpecName "kube-api-access-df552". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.217783 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "1a42be6c-26a1-4081-8df5-9b5ee5d45262" (UID: "1a42be6c-26a1-4081-8df5-9b5ee5d45262"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.259950 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-inventory" (OuterVolumeSpecName: "inventory") pod "1a42be6c-26a1-4081-8df5-9b5ee5d45262" (UID: "1a42be6c-26a1-4081-8df5-9b5ee5d45262"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.260794 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1a42be6c-26a1-4081-8df5-9b5ee5d45262" (UID: "1a42be6c-26a1-4081-8df5-9b5ee5d45262"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.262740 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "1a42be6c-26a1-4081-8df5-9b5ee5d45262" (UID: "1a42be6c-26a1-4081-8df5-9b5ee5d45262"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.314098 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.314135 4747 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.314149 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.314185 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df552\" (UniqueName: \"kubernetes.io/projected/1a42be6c-26a1-4081-8df5-9b5ee5d45262-kube-api-access-df552\") on node \"crc\" DevicePath \"\"" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.314199 4747 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1a42be6c-26a1-4081-8df5-9b5ee5d45262-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.709649 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-5z757" event={"ID":"1a42be6c-26a1-4081-8df5-9b5ee5d45262","Type":"ContainerDied","Data":"bcbd1891e7fb974c7bc8eb148eec0e1aa1168497ff1ab1d25dec2db39e1588db"} Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.709728 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcbd1891e7fb974c7bc8eb148eec0e1aa1168497ff1ab1d25dec2db39e1588db" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.709806 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-5z757" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.851061 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-cx4rm"] Dec 05 22:59:38 crc kubenswrapper[4747]: E1205 22:59:38.851537 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89cf8109-40ad-4844-9540-b82db1b37c7a" containerName="registry-server" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.851555 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cf8109-40ad-4844-9540-b82db1b37c7a" containerName="registry-server" Dec 05 22:59:38 crc kubenswrapper[4747]: E1205 22:59:38.851572 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89cf8109-40ad-4844-9540-b82db1b37c7a" containerName="extract-utilities" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.851598 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cf8109-40ad-4844-9540-b82db1b37c7a" containerName="extract-utilities" Dec 05 22:59:38 crc kubenswrapper[4747]: E1205 22:59:38.851630 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5577229-374f-4928-ad1e-5758ee08d537" containerName="extract-utilities" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.851641 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5577229-374f-4928-ad1e-5758ee08d537" containerName="extract-utilities" Dec 05 22:59:38 crc kubenswrapper[4747]: E1205 22:59:38.851655 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89cf8109-40ad-4844-9540-b82db1b37c7a" containerName="extract-content" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.851661 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cf8109-40ad-4844-9540-b82db1b37c7a" containerName="extract-content" Dec 05 22:59:38 crc kubenswrapper[4747]: E1205 22:59:38.851673 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5577229-374f-4928-ad1e-5758ee08d537" containerName="registry-server" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.851678 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5577229-374f-4928-ad1e-5758ee08d537" containerName="registry-server" Dec 05 22:59:38 crc kubenswrapper[4747]: E1205 22:59:38.851702 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a42be6c-26a1-4081-8df5-9b5ee5d45262" containerName="libvirt-openstack-openstack-cell1" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.851709 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a42be6c-26a1-4081-8df5-9b5ee5d45262" containerName="libvirt-openstack-openstack-cell1" Dec 05 22:59:38 crc kubenswrapper[4747]: E1205 22:59:38.851721 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5577229-374f-4928-ad1e-5758ee08d537" containerName="extract-content" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.851727 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5577229-374f-4928-ad1e-5758ee08d537" containerName="extract-content" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.851946 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a42be6c-26a1-4081-8df5-9b5ee5d45262" containerName="libvirt-openstack-openstack-cell1" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.851964 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="89cf8109-40ad-4844-9540-b82db1b37c7a" containerName="registry-server" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 
22:59:38.851985 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5577229-374f-4928-ad1e-5758ee08d537" containerName="registry-server" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.852739 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.860223 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-cx4rm"] Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.908747 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.908987 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.909140 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.909235 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.909291 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.909410 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.910008 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.927776 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.927852 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnqlt\" (UniqueName: \"kubernetes.io/projected/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-kube-api-access-tnqlt\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.928003 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-inventory\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.928176 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:38 crc 
kubenswrapper[4747]: I1205 22:59:38.929323 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.930041 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.930099 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.930211 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:38 crc kubenswrapper[4747]: I1205 22:59:38.930297 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.031986 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.032064 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.032118 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 
05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.032148 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.032193 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.032219 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnqlt\" (UniqueName: \"kubernetes.io/projected/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-kube-api-access-tnqlt\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.032270 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-inventory\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.032339 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.032367 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.035049 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.037510 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.037830 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.037861 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.038073 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.038248 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.045204 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.045456 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-inventory\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.049446 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnqlt\" (UniqueName: \"kubernetes.io/projected/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-kube-api-access-tnqlt\") pod \"nova-cell1-openstack-openstack-cell1-cx4rm\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.224868 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 22:59:39 crc kubenswrapper[4747]: I1205 22:59:39.791074 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-cx4rm"] Dec 05 22:59:39 crc kubenswrapper[4747]: W1205 22:59:39.795221 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0805aa11_5ab8_4ab4_b0c2_0435e7cc68e5.slice/crio-54a15570ff7888bcb82254b14502f187c6de69e797902f942264ef1947167d41 WatchSource:0}: Error finding container 54a15570ff7888bcb82254b14502f187c6de69e797902f942264ef1947167d41: Status 404 returned error can't find the container with id 54a15570ff7888bcb82254b14502f187c6de69e797902f942264ef1947167d41 Dec 05 22:59:40 crc kubenswrapper[4747]: I1205 22:59:40.736578 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" event={"ID":"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5","Type":"ContainerStarted","Data":"7f11852b1f5056af3673af3dcc1cc36fce1516229d2f6ef67fd4b36dec2810fa"} Dec 05 22:59:40 crc kubenswrapper[4747]: I1205 22:59:40.737163 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" event={"ID":"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5","Type":"ContainerStarted","Data":"54a15570ff7888bcb82254b14502f187c6de69e797902f942264ef1947167d41"} Dec 05 22:59:40 crc kubenswrapper[4747]: I1205 22:59:40.764006 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" podStartSLOduration=2.303346939 podStartE2EDuration="2.763976035s" podCreationTimestamp="2025-12-05 22:59:38 +0000 UTC" firstStartedPulling="2025-12-05 22:59:39.797193065 +0000 UTC m=+8250.264500553" lastFinishedPulling="2025-12-05 22:59:40.257822121 +0000 UTC m=+8250.725129649" observedRunningTime="2025-12-05 22:59:40.758260687 +0000 UTC m=+8251.225568265" watchObservedRunningTime="2025-12-05 22:59:40.763976035 +0000 UTC m=+8251.231283573" Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.172304 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt"] Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.178175 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.181698 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.182699 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.191271 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt"] Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.290084 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-config-volume\") pod \"collect-profiles-29416260-ftnbt\" (UID: \"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.290151 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngttm\" (UniqueName: \"kubernetes.io/projected/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-kube-api-access-ngttm\") pod \"collect-profiles-29416260-ftnbt\" (UID: \"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.290172 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-secret-volume\") pod \"collect-profiles-29416260-ftnbt\" (UID: \"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.391870 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-config-volume\") pod \"collect-profiles-29416260-ftnbt\" (UID: \"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.392930 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-config-volume\") pod \"collect-profiles-29416260-ftnbt\" (UID: \"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.393968 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngttm\" (UniqueName: \"kubernetes.io/projected/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-kube-api-access-ngttm\") pod \"collect-profiles-29416260-ftnbt\" (UID: \"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.396519 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-secret-volume\") pod 
\"collect-profiles-29416260-ftnbt\" (UID: \"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.404399 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-secret-volume\") pod \"collect-profiles-29416260-ftnbt\" (UID: \"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.424282 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngttm\" (UniqueName: \"kubernetes.io/projected/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-kube-api-access-ngttm\") pod \"collect-profiles-29416260-ftnbt\" (UID: \"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.504339 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" Dec 05 23:00:00 crc kubenswrapper[4747]: I1205 23:00:00.974020 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt"] Dec 05 23:00:00 crc kubenswrapper[4747]: W1205 23:00:00.982019 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13ab5f79_2c5c_4bdf_a82e_3aa50f99bbd2.slice/crio-4b512df1e6e27386b92b4aca5b4114ef727cc39229f0aa9aa6c4c70388b633ed WatchSource:0}: Error finding container 4b512df1e6e27386b92b4aca5b4114ef727cc39229f0aa9aa6c4c70388b633ed: Status 404 returned error can't find the container with id 4b512df1e6e27386b92b4aca5b4114ef727cc39229f0aa9aa6c4c70388b633ed Dec 05 23:00:01 crc kubenswrapper[4747]: I1205 23:00:01.990867 4747 generic.go:334] "Generic (PLEG): container finished" podID="13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2" containerID="ef02589ed7d525e2a2a1f2b72dd1d6fdcb97e11f0039a5282db11cb01f5c928f" exitCode=0 Dec 05 23:00:01 crc kubenswrapper[4747]: I1205 23:00:01.990918 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" event={"ID":"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2","Type":"ContainerDied","Data":"ef02589ed7d525e2a2a1f2b72dd1d6fdcb97e11f0039a5282db11cb01f5c928f"} Dec 05 23:00:01 crc kubenswrapper[4747]: I1205 23:00:01.990953 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" event={"ID":"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2","Type":"ContainerStarted","Data":"4b512df1e6e27386b92b4aca5b4114ef727cc39229f0aa9aa6c4c70388b633ed"} Dec 05 23:00:03 crc kubenswrapper[4747]: I1205 23:00:03.372061 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" Dec 05 23:00:03 crc kubenswrapper[4747]: I1205 23:00:03.478128 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngttm\" (UniqueName: \"kubernetes.io/projected/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-kube-api-access-ngttm\") pod \"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2\" (UID: \"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2\") " Dec 05 23:00:03 crc kubenswrapper[4747]: I1205 23:00:03.478871 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-config-volume\") pod \"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2\" (UID: \"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2\") " Dec 05 23:00:03 crc kubenswrapper[4747]: I1205 23:00:03.478922 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-secret-volume\") pod \"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2\" (UID: \"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2\") " Dec 05 23:00:03 crc kubenswrapper[4747]: I1205 23:00:03.479678 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-config-volume" (OuterVolumeSpecName: "config-volume") pod "13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2" (UID: "13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:00:03 crc kubenswrapper[4747]: I1205 23:00:03.483954 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2" (UID: "13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:00:03 crc kubenswrapper[4747]: I1205 23:00:03.484100 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-kube-api-access-ngttm" (OuterVolumeSpecName: "kube-api-access-ngttm") pod "13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2" (UID: "13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2"). InnerVolumeSpecName "kube-api-access-ngttm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:00:03 crc kubenswrapper[4747]: I1205 23:00:03.581234 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 23:00:03 crc kubenswrapper[4747]: I1205 23:00:03.581268 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 23:00:03 crc kubenswrapper[4747]: I1205 23:00:03.581277 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngttm\" (UniqueName: \"kubernetes.io/projected/13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2-kube-api-access-ngttm\") on node \"crc\" DevicePath \"\"" Dec 05 23:00:04 crc kubenswrapper[4747]: I1205 23:00:04.011411 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" event={"ID":"13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2","Type":"ContainerDied","Data":"4b512df1e6e27386b92b4aca5b4114ef727cc39229f0aa9aa6c4c70388b633ed"} Dec 05 23:00:04 crc kubenswrapper[4747]: I1205 23:00:04.011464 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b512df1e6e27386b92b4aca5b4114ef727cc39229f0aa9aa6c4c70388b633ed" Dec 05 23:00:04 crc kubenswrapper[4747]: I1205 23:00:04.011528 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416260-ftnbt" Dec 05 23:00:04 crc kubenswrapper[4747]: I1205 23:00:04.461108 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg"] Dec 05 23:00:04 crc kubenswrapper[4747]: I1205 23:00:04.471827 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416215-sc9tg"] Dec 05 23:00:05 crc kubenswrapper[4747]: I1205 23:00:05.866277 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e852ab93-76f8-4d63-83fd-527104dd25df" path="/var/lib/kubelet/pods/e852ab93-76f8-4d63-83fd-527104dd25df/volumes" Dec 05 23:00:06 crc kubenswrapper[4747]: I1205 23:00:06.222164 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:00:06 crc kubenswrapper[4747]: I1205 23:00:06.222480 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:00:06 crc kubenswrapper[4747]: I1205 23:00:06.222531 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 23:00:06 crc kubenswrapper[4747]: I1205 23:00:06.223478 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"33d0eb4a0ade5e2fba6f646419e1773ce8bfc6d315803141dfef56a51338e0fc"} 
pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:00:06 crc kubenswrapper[4747]: I1205 23:00:06.223594 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://33d0eb4a0ade5e2fba6f646419e1773ce8bfc6d315803141dfef56a51338e0fc" gracePeriod=600 Dec 05 23:00:07 crc kubenswrapper[4747]: I1205 23:00:07.048371 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="33d0eb4a0ade5e2fba6f646419e1773ce8bfc6d315803141dfef56a51338e0fc" exitCode=0 Dec 05 23:00:07 crc kubenswrapper[4747]: I1205 23:00:07.048450 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"33d0eb4a0ade5e2fba6f646419e1773ce8bfc6d315803141dfef56a51338e0fc"} Dec 05 23:00:07 crc kubenswrapper[4747]: I1205 23:00:07.048941 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2"} Dec 05 23:00:07 crc kubenswrapper[4747]: I1205 23:00:07.048970 4747 scope.go:117] "RemoveContainer" containerID="877c9883417b1154a746cbd73ecd4529a1deb45eb2f193381678cd8d31c5fb86" Dec 05 23:00:38 crc kubenswrapper[4747]: I1205 23:00:38.686862 4747 scope.go:117] "RemoveContainer" containerID="c72c42b9237c2a4676afef7888bc32df25cf4f82b9c184b879f95370e2bfec6c" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.159145 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29416261-kkkjb"] Dec 05 23:01:00 crc kubenswrapper[4747]: E1205 23:01:00.160644 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2" containerName="collect-profiles" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.160662 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2" containerName="collect-profiles" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.161157 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ab5f79-2c5c-4bdf-a82e-3aa50f99bbd2" containerName="collect-profiles" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.162528 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.173315 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416261-kkkjb"] Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.287463 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br5zp\" (UniqueName: \"kubernetes.io/projected/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-kube-api-access-br5zp\") pod \"keystone-cron-29416261-kkkjb\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.287509 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-fernet-keys\") pod \"keystone-cron-29416261-kkkjb\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.287565 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-combined-ca-bundle\") pod \"keystone-cron-29416261-kkkjb\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.287631 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-config-data\") pod \"keystone-cron-29416261-kkkjb\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.389154 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-combined-ca-bundle\") pod \"keystone-cron-29416261-kkkjb\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.389268 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-config-data\") pod \"keystone-cron-29416261-kkkjb\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.389430 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br5zp\" (UniqueName: \"kubernetes.io/projected/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-kube-api-access-br5zp\") pod \"keystone-cron-29416261-kkkjb\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.389476 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-fernet-keys\") pod \"keystone-cron-29416261-kkkjb\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.396274 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-config-data\") pod \"keystone-cron-29416261-kkkjb\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.396739 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-combined-ca-bundle\") pod \"keystone-cron-29416261-kkkjb\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.399782 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-fernet-keys\") pod \"keystone-cron-29416261-kkkjb\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.406306 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br5zp\" (UniqueName: \"kubernetes.io/projected/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-kube-api-access-br5zp\") pod \"keystone-cron-29416261-kkkjb\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.503662 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:00 crc kubenswrapper[4747]: I1205 23:01:00.928697 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416261-kkkjb"] Dec 05 23:01:01 crc kubenswrapper[4747]: I1205 23:01:01.636702 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416261-kkkjb" event={"ID":"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3","Type":"ContainerStarted","Data":"ac47f6a53b1bc2e726ae9cd9142ab3adfa0e763b644747ec03f83ad2164f1915"} Dec 05 23:01:01 crc kubenswrapper[4747]: I1205 23:01:01.637045 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416261-kkkjb" event={"ID":"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3","Type":"ContainerStarted","Data":"550df0584995b11043b178d4bd3ed57b1f2347c7e89669cf3e76d21c452a9164"} Dec 05 23:01:01 crc kubenswrapper[4747]: I1205 23:01:01.654007 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29416261-kkkjb" podStartSLOduration=1.653989946 podStartE2EDuration="1.653989946s" podCreationTimestamp="2025-12-05 23:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:01:01.649478747 +0000 UTC m=+8332.116786225" watchObservedRunningTime="2025-12-05 23:01:01.653989946 +0000 UTC m=+8332.121297434" Dec 05 23:01:03 crc kubenswrapper[4747]: I1205 23:01:03.656130 4747 generic.go:334] "Generic (PLEG): container finished" podID="600ea9b0-d6cc-48a4-84ce-ba78b2013bb3" containerID="ac47f6a53b1bc2e726ae9cd9142ab3adfa0e763b644747ec03f83ad2164f1915" exitCode=0 Dec 05 23:01:03 crc kubenswrapper[4747]: I1205 23:01:03.656236 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416261-kkkjb" event={"ID":"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3","Type":"ContainerDied","Data":"ac47f6a53b1bc2e726ae9cd9142ab3adfa0e763b644747ec03f83ad2164f1915"} Dec 05 23:01:05 crc kubenswrapper[4747]: 
I1205 23:01:05.065006 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:05 crc kubenswrapper[4747]: I1205 23:01:05.228319 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-fernet-keys\") pod \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " Dec 05 23:01:05 crc kubenswrapper[4747]: I1205 23:01:05.228390 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-config-data\") pod \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " Dec 05 23:01:05 crc kubenswrapper[4747]: I1205 23:01:05.228450 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br5zp\" (UniqueName: \"kubernetes.io/projected/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-kube-api-access-br5zp\") pod \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " Dec 05 23:01:05 crc kubenswrapper[4747]: I1205 23:01:05.228751 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-combined-ca-bundle\") pod \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\" (UID: \"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3\") " Dec 05 23:01:05 crc kubenswrapper[4747]: I1205 23:01:05.233711 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "600ea9b0-d6cc-48a4-84ce-ba78b2013bb3" (UID: "600ea9b0-d6cc-48a4-84ce-ba78b2013bb3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:01:05 crc kubenswrapper[4747]: I1205 23:01:05.234738 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-kube-api-access-br5zp" (OuterVolumeSpecName: "kube-api-access-br5zp") pod "600ea9b0-d6cc-48a4-84ce-ba78b2013bb3" (UID: "600ea9b0-d6cc-48a4-84ce-ba78b2013bb3"). InnerVolumeSpecName "kube-api-access-br5zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:01:05 crc kubenswrapper[4747]: I1205 23:01:05.265266 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "600ea9b0-d6cc-48a4-84ce-ba78b2013bb3" (UID: "600ea9b0-d6cc-48a4-84ce-ba78b2013bb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:01:05 crc kubenswrapper[4747]: I1205 23:01:05.282889 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-config-data" (OuterVolumeSpecName: "config-data") pod "600ea9b0-d6cc-48a4-84ce-ba78b2013bb3" (UID: "600ea9b0-d6cc-48a4-84ce-ba78b2013bb3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:01:05 crc kubenswrapper[4747]: I1205 23:01:05.331975 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:01:05 crc kubenswrapper[4747]: I1205 23:01:05.332066 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 23:01:05 crc kubenswrapper[4747]: I1205 23:01:05.332085 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:01:05 crc kubenswrapper[4747]: I1205 23:01:05.332103 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br5zp\" (UniqueName: \"kubernetes.io/projected/600ea9b0-d6cc-48a4-84ce-ba78b2013bb3-kube-api-access-br5zp\") on node \"crc\" DevicePath \"\"" Dec 05 23:01:05 crc kubenswrapper[4747]: I1205 23:01:05.685931 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416261-kkkjb" event={"ID":"600ea9b0-d6cc-48a4-84ce-ba78b2013bb3","Type":"ContainerDied","Data":"550df0584995b11043b178d4bd3ed57b1f2347c7e89669cf3e76d21c452a9164"} Dec 05 23:01:05 crc kubenswrapper[4747]: I1205 23:01:05.686028 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="550df0584995b11043b178d4bd3ed57b1f2347c7e89669cf3e76d21c452a9164" Dec 05 23:01:05 crc kubenswrapper[4747]: I1205 23:01:05.686117 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416261-kkkjb" Dec 05 23:01:38 crc kubenswrapper[4747]: I1205 23:01:38.938990 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2nbbg"] Dec 05 23:01:38 crc kubenswrapper[4747]: E1205 23:01:38.939998 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="600ea9b0-d6cc-48a4-84ce-ba78b2013bb3" containerName="keystone-cron" Dec 05 23:01:38 crc kubenswrapper[4747]: I1205 23:01:38.940017 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="600ea9b0-d6cc-48a4-84ce-ba78b2013bb3" containerName="keystone-cron" Dec 05 23:01:38 crc kubenswrapper[4747]: I1205 23:01:38.940214 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="600ea9b0-d6cc-48a4-84ce-ba78b2013bb3" containerName="keystone-cron" Dec 05 23:01:38 crc kubenswrapper[4747]: I1205 23:01:38.941902 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:38 crc kubenswrapper[4747]: I1205 23:01:38.963821 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2nbbg"] Dec 05 23:01:39 crc kubenswrapper[4747]: I1205 23:01:39.062387 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clp4x\" (UniqueName: \"kubernetes.io/projected/45d592de-9c2d-4f10-bf0b-1c005788f305-kube-api-access-clp4x\") pod \"certified-operators-2nbbg\" (UID: \"45d592de-9c2d-4f10-bf0b-1c005788f305\") " pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:39 crc kubenswrapper[4747]: I1205 23:01:39.062440 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d592de-9c2d-4f10-bf0b-1c005788f305-utilities\") pod \"certified-operators-2nbbg\" (UID: \"45d592de-9c2d-4f10-bf0b-1c005788f305\") " pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:39 crc kubenswrapper[4747]: I1205 23:01:39.062467 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d592de-9c2d-4f10-bf0b-1c005788f305-catalog-content\") pod \"certified-operators-2nbbg\" (UID: \"45d592de-9c2d-4f10-bf0b-1c005788f305\") " pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:39 crc kubenswrapper[4747]: I1205 23:01:39.164244 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clp4x\" (UniqueName: \"kubernetes.io/projected/45d592de-9c2d-4f10-bf0b-1c005788f305-kube-api-access-clp4x\") pod \"certified-operators-2nbbg\" (UID: \"45d592de-9c2d-4f10-bf0b-1c005788f305\") " pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:39 crc kubenswrapper[4747]: I1205 23:01:39.164316 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d592de-9c2d-4f10-bf0b-1c005788f305-utilities\") pod \"certified-operators-2nbbg\" (UID: \"45d592de-9c2d-4f10-bf0b-1c005788f305\") " pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:39 crc kubenswrapper[4747]: I1205 23:01:39.164344 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d592de-9c2d-4f10-bf0b-1c005788f305-catalog-content\") pod \"certified-operators-2nbbg\" (UID: \"45d592de-9c2d-4f10-bf0b-1c005788f305\") " pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:39 crc kubenswrapper[4747]: I1205 23:01:39.164966 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d592de-9c2d-4f10-bf0b-1c005788f305-utilities\") pod \"certified-operators-2nbbg\" (UID: \"45d592de-9c2d-4f10-bf0b-1c005788f305\") " pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:39 crc kubenswrapper[4747]: I1205 23:01:39.165004 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d592de-9c2d-4f10-bf0b-1c005788f305-catalog-content\") pod \"certified-operators-2nbbg\" (UID: \"45d592de-9c2d-4f10-bf0b-1c005788f305\") " pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:39 crc kubenswrapper[4747]: I1205 23:01:39.191236 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-clp4x\" (UniqueName: \"kubernetes.io/projected/45d592de-9c2d-4f10-bf0b-1c005788f305-kube-api-access-clp4x\") pod \"certified-operators-2nbbg\" (UID: \"45d592de-9c2d-4f10-bf0b-1c005788f305\") " pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:39 crc kubenswrapper[4747]: I1205 23:01:39.261808 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:39 crc kubenswrapper[4747]: I1205 23:01:39.813526 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2nbbg"] Dec 05 23:01:40 crc kubenswrapper[4747]: I1205 23:01:40.578072 4747 generic.go:334] "Generic (PLEG): container finished" podID="45d592de-9c2d-4f10-bf0b-1c005788f305" containerID="b1e6dfd7f4c0b3f5c85951feb23c20c667ed7251a12eb6759dcdfbc403e25046" exitCode=0 Dec 05 23:01:40 crc kubenswrapper[4747]: I1205 23:01:40.578187 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nbbg" event={"ID":"45d592de-9c2d-4f10-bf0b-1c005788f305","Type":"ContainerDied","Data":"b1e6dfd7f4c0b3f5c85951feb23c20c667ed7251a12eb6759dcdfbc403e25046"} Dec 05 23:01:40 crc kubenswrapper[4747]: I1205 23:01:40.578576 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nbbg" event={"ID":"45d592de-9c2d-4f10-bf0b-1c005788f305","Type":"ContainerStarted","Data":"23b541fd2335834fe1341f699e249f1eca55d45ef5db806fe0a06e7539f69858"} Dec 05 23:01:40 crc kubenswrapper[4747]: I1205 23:01:40.583409 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 23:01:41 crc kubenswrapper[4747]: I1205 23:01:41.589984 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nbbg" event={"ID":"45d592de-9c2d-4f10-bf0b-1c005788f305","Type":"ContainerStarted","Data":"1c00ca8fa1fecf7ca174e42315db8c59f0d27029d07030fe500f39fa3427b981"} Dec 05 23:01:42 crc kubenswrapper[4747]: I1205 23:01:42.603540 4747 generic.go:334] "Generic (PLEG): container finished" podID="45d592de-9c2d-4f10-bf0b-1c005788f305" containerID="1c00ca8fa1fecf7ca174e42315db8c59f0d27029d07030fe500f39fa3427b981" exitCode=0 Dec 05 23:01:42 crc kubenswrapper[4747]: I1205 23:01:42.603629 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nbbg" event={"ID":"45d592de-9c2d-4f10-bf0b-1c005788f305","Type":"ContainerDied","Data":"1c00ca8fa1fecf7ca174e42315db8c59f0d27029d07030fe500f39fa3427b981"} Dec 05 23:01:43 crc kubenswrapper[4747]: I1205 23:01:43.619657 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nbbg" event={"ID":"45d592de-9c2d-4f10-bf0b-1c005788f305","Type":"ContainerStarted","Data":"a8974e81447a620e91b96df575a74e01bd12507674253cefe288792cda7670a4"} Dec 05 23:01:43 crc kubenswrapper[4747]: I1205 23:01:43.644684 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2nbbg" podStartSLOduration=3.203769536 podStartE2EDuration="5.644663933s" podCreationTimestamp="2025-12-05 23:01:38 +0000 UTC" firstStartedPulling="2025-12-05 23:01:40.582417577 +0000 UTC m=+8371.049725065" lastFinishedPulling="2025-12-05 23:01:43.023311934 +0000 UTC m=+8373.490619462" observedRunningTime="2025-12-05 23:01:43.63877186 +0000 UTC m=+8374.106079368" watchObservedRunningTime="2025-12-05 
23:01:43.644663933 +0000 UTC m=+8374.111971431" Dec 05 23:01:49 crc kubenswrapper[4747]: I1205 23:01:49.262872 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:49 crc kubenswrapper[4747]: I1205 23:01:49.263637 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:49 crc kubenswrapper[4747]: I1205 23:01:49.359601 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:49 crc kubenswrapper[4747]: I1205 23:01:49.801827 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:49 crc kubenswrapper[4747]: I1205 23:01:49.875869 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2nbbg"] Dec 05 23:01:51 crc kubenswrapper[4747]: I1205 23:01:51.732912 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2nbbg" podUID="45d592de-9c2d-4f10-bf0b-1c005788f305" containerName="registry-server" containerID="cri-o://a8974e81447a620e91b96df575a74e01bd12507674253cefe288792cda7670a4" gracePeriod=2 Dec 05 23:01:52 crc kubenswrapper[4747]: I1205 23:01:52.769456 4747 generic.go:334] "Generic (PLEG): container finished" podID="45d592de-9c2d-4f10-bf0b-1c005788f305" containerID="a8974e81447a620e91b96df575a74e01bd12507674253cefe288792cda7670a4" exitCode=0 Dec 05 23:01:52 crc kubenswrapper[4747]: I1205 23:01:52.769544 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nbbg" event={"ID":"45d592de-9c2d-4f10-bf0b-1c005788f305","Type":"ContainerDied","Data":"a8974e81447a620e91b96df575a74e01bd12507674253cefe288792cda7670a4"} Dec 05 23:01:52 crc kubenswrapper[4747]: I1205 23:01:52.770367 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nbbg" event={"ID":"45d592de-9c2d-4f10-bf0b-1c005788f305","Type":"ContainerDied","Data":"23b541fd2335834fe1341f699e249f1eca55d45ef5db806fe0a06e7539f69858"} Dec 05 23:01:52 crc kubenswrapper[4747]: I1205 23:01:52.770392 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23b541fd2335834fe1341f699e249f1eca55d45ef5db806fe0a06e7539f69858" Dec 05 23:01:52 crc kubenswrapper[4747]: I1205 23:01:52.798778 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:52 crc kubenswrapper[4747]: I1205 23:01:52.907912 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clp4x\" (UniqueName: \"kubernetes.io/projected/45d592de-9c2d-4f10-bf0b-1c005788f305-kube-api-access-clp4x\") pod \"45d592de-9c2d-4f10-bf0b-1c005788f305\" (UID: \"45d592de-9c2d-4f10-bf0b-1c005788f305\") " Dec 05 23:01:52 crc kubenswrapper[4747]: I1205 23:01:52.908205 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d592de-9c2d-4f10-bf0b-1c005788f305-utilities\") pod \"45d592de-9c2d-4f10-bf0b-1c005788f305\" (UID: \"45d592de-9c2d-4f10-bf0b-1c005788f305\") " Dec 05 23:01:52 crc kubenswrapper[4747]: I1205 23:01:52.908278 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d592de-9c2d-4f10-bf0b-1c005788f305-catalog-content\") pod \"45d592de-9c2d-4f10-bf0b-1c005788f305\" (UID: \"45d592de-9c2d-4f10-bf0b-1c005788f305\") " Dec 05 23:01:52 crc kubenswrapper[4747]: I1205 23:01:52.909008 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45d592de-9c2d-4f10-bf0b-1c005788f305-utilities" (OuterVolumeSpecName: "utilities") pod "45d592de-9c2d-4f10-bf0b-1c005788f305" (UID: "45d592de-9c2d-4f10-bf0b-1c005788f305"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:01:52 crc kubenswrapper[4747]: I1205 23:01:52.909331 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d592de-9c2d-4f10-bf0b-1c005788f305-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:01:52 crc kubenswrapper[4747]: I1205 23:01:52.915204 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d592de-9c2d-4f10-bf0b-1c005788f305-kube-api-access-clp4x" (OuterVolumeSpecName: "kube-api-access-clp4x") pod "45d592de-9c2d-4f10-bf0b-1c005788f305" (UID: "45d592de-9c2d-4f10-bf0b-1c005788f305"). InnerVolumeSpecName "kube-api-access-clp4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:01:52 crc kubenswrapper[4747]: I1205 23:01:52.957285 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45d592de-9c2d-4f10-bf0b-1c005788f305-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45d592de-9c2d-4f10-bf0b-1c005788f305" (UID: "45d592de-9c2d-4f10-bf0b-1c005788f305"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:01:53 crc kubenswrapper[4747]: I1205 23:01:53.011981 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clp4x\" (UniqueName: \"kubernetes.io/projected/45d592de-9c2d-4f10-bf0b-1c005788f305-kube-api-access-clp4x\") on node \"crc\" DevicePath \"\"" Dec 05 23:01:53 crc kubenswrapper[4747]: I1205 23:01:53.012015 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d592de-9c2d-4f10-bf0b-1c005788f305-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:01:53 crc kubenswrapper[4747]: I1205 23:01:53.791657 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2nbbg" Dec 05 23:01:53 crc kubenswrapper[4747]: I1205 23:01:53.835999 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2nbbg"] Dec 05 23:01:53 crc kubenswrapper[4747]: I1205 23:01:53.854191 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2nbbg"] Dec 05 23:01:55 crc kubenswrapper[4747]: I1205 23:01:55.858866 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d592de-9c2d-4f10-bf0b-1c005788f305" path="/var/lib/kubelet/pods/45d592de-9c2d-4f10-bf0b-1c005788f305/volumes" Dec 05 23:02:06 crc kubenswrapper[4747]: I1205 23:02:06.241912 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:02:06 crc kubenswrapper[4747]: I1205 23:02:06.242828 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:02:36 crc kubenswrapper[4747]: I1205 23:02:36.222280 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:02:36 crc kubenswrapper[4747]: I1205 23:02:36.223127 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:02:58 crc kubenswrapper[4747]: I1205 23:02:58.622052 4747 generic.go:334] "Generic (PLEG): container finished" podID="0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5" containerID="7f11852b1f5056af3673af3dcc1cc36fce1516229d2f6ef67fd4b36dec2810fa" exitCode=0 Dec 05 23:02:58 crc kubenswrapper[4747]: I1205 23:02:58.622196 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" event={"ID":"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5","Type":"ContainerDied","Data":"7f11852b1f5056af3673af3dcc1cc36fce1516229d2f6ef67fd4b36dec2810fa"} Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.088254 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.216453 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-combined-ca-bundle\") pod \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.216912 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-ssh-key\") pod \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.217065 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-migration-ssh-key-0\") pod \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.217222 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cells-global-config-0\") pod \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.217293 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-compute-config-0\") pod \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.217337 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-migration-ssh-key-1\") pod \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.217424 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnqlt\" (UniqueName: \"kubernetes.io/projected/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-kube-api-access-tnqlt\") pod \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.217471 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-inventory\") pod \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.217541 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-compute-config-1\") pod \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\" (UID: \"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5\") " Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.225429 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5" (UID: "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.228797 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-kube-api-access-tnqlt" (OuterVolumeSpecName: "kube-api-access-tnqlt") pod "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5" (UID: "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5"). InnerVolumeSpecName "kube-api-access-tnqlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.248450 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5" (UID: "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.251064 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-inventory" (OuterVolumeSpecName: "inventory") pod "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5" (UID: "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.253830 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5" (UID: "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.255469 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5" (UID: "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.256202 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5" (UID: "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.258072 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5" (UID: "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.268219 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5" (UID: "0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.320687 4747 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.320742 4747 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.320754 4747 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.320764 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnqlt\" (UniqueName: \"kubernetes.io/projected/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-kube-api-access-tnqlt\") on node \"crc\" DevicePath \"\"" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.320774 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.320785 4747 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.320795 4747 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.320804 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.320814 4747 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.647482 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" event={"ID":"0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5","Type":"ContainerDied","Data":"54a15570ff7888bcb82254b14502f187c6de69e797902f942264ef1947167d41"} Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.647523 4747 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="54a15570ff7888bcb82254b14502f187c6de69e797902f942264ef1947167d41" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.647573 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-cx4rm" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.748533 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-znq8d"] Dec 05 23:03:00 crc kubenswrapper[4747]: E1205 23:03:00.748954 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d592de-9c2d-4f10-bf0b-1c005788f305" containerName="extract-content" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.748969 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d592de-9c2d-4f10-bf0b-1c005788f305" containerName="extract-content" Dec 05 23:03:00 crc kubenswrapper[4747]: E1205 23:03:00.748991 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d592de-9c2d-4f10-bf0b-1c005788f305" containerName="registry-server" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.748997 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d592de-9c2d-4f10-bf0b-1c005788f305" containerName="registry-server" Dec 05 23:03:00 crc kubenswrapper[4747]: E1205 23:03:00.749009 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d592de-9c2d-4f10-bf0b-1c005788f305" containerName="extract-utilities" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.749015 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d592de-9c2d-4f10-bf0b-1c005788f305" containerName="extract-utilities" Dec 05 23:03:00 crc kubenswrapper[4747]: E1205 23:03:00.749033 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5" containerName="nova-cell1-openstack-openstack-cell1" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.749042 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5" containerName="nova-cell1-openstack-openstack-cell1" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.749255 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5" containerName="nova-cell1-openstack-openstack-cell1" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.749273 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d592de-9c2d-4f10-bf0b-1c005788f305" containerName="registry-server" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.750112 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.752264 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.752366 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.756938 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.756960 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.757068 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.767068 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-znq8d"] Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.934359 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ssh-key\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.934436 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-inventory\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.934468 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.934497 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.934635 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j55mz\" (UniqueName: \"kubernetes.io/projected/b61e75e7-0701-45f0-a70e-8e660be43224-kube-api-access-j55mz\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.934723 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" 
(UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:00 crc kubenswrapper[4747]: I1205 23:03:00.934802 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.036783 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ssh-key\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.037127 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-inventory\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.037151 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.037663 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.037699 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j55mz\" (UniqueName: \"kubernetes.io/projected/b61e75e7-0701-45f0-a70e-8e660be43224-kube-api-access-j55mz\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.037753 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.037810 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.041526 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ssh-key\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.041537 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-inventory\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.041607 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.042414 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.045413 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.050103 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.060521 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j55mz\" (UniqueName: \"kubernetes.io/projected/b61e75e7-0701-45f0-a70e-8e660be43224-kube-api-access-j55mz\") pod \"telemetry-openstack-openstack-cell1-znq8d\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.068095 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:03:01 crc kubenswrapper[4747]: I1205 23:03:01.672524 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-znq8d"] Dec 05 23:03:02 crc kubenswrapper[4747]: I1205 23:03:02.672446 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-znq8d" event={"ID":"b61e75e7-0701-45f0-a70e-8e660be43224","Type":"ContainerStarted","Data":"2cc123a67f86b9dce7d08c9b805c845b3575977c2425f42bc3524e591d0970d1"} Dec 05 23:03:02 crc kubenswrapper[4747]: I1205 23:03:02.673418 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-znq8d" event={"ID":"b61e75e7-0701-45f0-a70e-8e660be43224","Type":"ContainerStarted","Data":"2bdc37681dfe101b496f7483b29333d332ecfbbdec638e022aaf880fdca7841f"} Dec 05 23:03:02 crc kubenswrapper[4747]: I1205 23:03:02.705225 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-znq8d" podStartSLOduration=2.221675157 podStartE2EDuration="2.705205231s" podCreationTimestamp="2025-12-05 23:03:00 +0000 UTC" firstStartedPulling="2025-12-05 23:03:01.688539915 +0000 UTC m=+8452.155847403" lastFinishedPulling="2025-12-05 23:03:02.172069979 +0000 UTC m=+8452.639377477" observedRunningTime="2025-12-05 23:03:02.695273715 +0000 UTC m=+8453.162581223" watchObservedRunningTime="2025-12-05 23:03:02.705205231 +0000 UTC m=+8453.172512739" Dec 05 23:03:06 crc kubenswrapper[4747]: I1205 23:03:06.222069 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:03:06 crc kubenswrapper[4747]: I1205 23:03:06.222723 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:03:06 crc kubenswrapper[4747]: I1205 23:03:06.222800 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 23:03:06 crc kubenswrapper[4747]: I1205 23:03:06.223622 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:03:06 crc kubenswrapper[4747]: I1205 23:03:06.223688 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" gracePeriod=600 Dec 05 23:03:06 crc kubenswrapper[4747]: E1205 23:03:06.362664 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:03:06 crc kubenswrapper[4747]: I1205 23:03:06.715223 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" exitCode=0 Dec 05 23:03:06 crc kubenswrapper[4747]: I1205 23:03:06.715282 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2"} Dec 05 23:03:06 crc kubenswrapper[4747]: I1205 23:03:06.715319 4747 scope.go:117] "RemoveContainer" containerID="33d0eb4a0ade5e2fba6f646419e1773ce8bfc6d315803141dfef56a51338e0fc" Dec 05 23:03:06 crc kubenswrapper[4747]: I1205 23:03:06.716042 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:03:06 crc kubenswrapper[4747]: E1205 23:03:06.716335 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:03:19 crc kubenswrapper[4747]: I1205 23:03:19.853914 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:03:19 crc kubenswrapper[4747]: E1205 23:03:19.855168 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:03:34 crc kubenswrapper[4747]: I1205 23:03:34.840296 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:03:34 crc kubenswrapper[4747]: E1205 23:03:34.841263 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:03:47 crc kubenswrapper[4747]: I1205 23:03:47.839670 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:03:47 crc kubenswrapper[4747]: E1205 23:03:47.840431 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:04:01 crc kubenswrapper[4747]: I1205 23:04:01.840950 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:04:01 crc kubenswrapper[4747]: E1205 23:04:01.841748 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:04:15 crc kubenswrapper[4747]: I1205 23:04:15.841474 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:04:15 crc kubenswrapper[4747]: E1205 23:04:15.842955 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:04:27 crc kubenswrapper[4747]: I1205 23:04:27.842341 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:04:27 crc kubenswrapper[4747]: E1205 23:04:27.844181 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:04:39 crc kubenswrapper[4747]: I1205 23:04:39.859437 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:04:39 crc kubenswrapper[4747]: E1205 23:04:39.860446 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:04:52 crc kubenswrapper[4747]: I1205 23:04:52.839615 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:04:52 crc kubenswrapper[4747]: E1205 23:04:52.840397 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" 
podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:05:05 crc kubenswrapper[4747]: I1205 23:05:05.839870 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:05:05 crc kubenswrapper[4747]: E1205 23:05:05.841141 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:05:18 crc kubenswrapper[4747]: I1205 23:05:18.840341 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:05:18 crc kubenswrapper[4747]: E1205 23:05:18.841063 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:05:29 crc kubenswrapper[4747]: I1205 23:05:29.856931 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:05:29 crc kubenswrapper[4747]: E1205 23:05:29.858115 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:05:43 crc kubenswrapper[4747]: I1205 23:05:43.843681 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:05:43 crc kubenswrapper[4747]: E1205 23:05:43.844961 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:05:55 crc kubenswrapper[4747]: I1205 23:05:55.840691 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:05:55 crc kubenswrapper[4747]: E1205 23:05:55.842885 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:06:07 crc kubenswrapper[4747]: I1205 23:06:07.840697 4747 scope.go:117] "RemoveContainer" 
containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:06:07 crc kubenswrapper[4747]: E1205 23:06:07.842138 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:06:21 crc kubenswrapper[4747]: I1205 23:06:21.840112 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:06:21 crc kubenswrapper[4747]: E1205 23:06:21.841303 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:06:35 crc kubenswrapper[4747]: I1205 23:06:35.840385 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:06:35 crc kubenswrapper[4747]: E1205 23:06:35.841095 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:06:47 crc kubenswrapper[4747]: I1205 23:06:47.839870 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:06:47 crc kubenswrapper[4747]: E1205 23:06:47.840643 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:07:01 crc kubenswrapper[4747]: I1205 23:07:01.840047 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:07:01 crc kubenswrapper[4747]: E1205 23:07:01.840920 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:07:16 crc kubenswrapper[4747]: I1205 23:07:16.840782 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:07:16 crc kubenswrapper[4747]: E1205 23:07:16.843081 4747 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:07:21 crc kubenswrapper[4747]: I1205 23:07:21.796949 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vf5vr"] Dec 05 23:07:21 crc kubenswrapper[4747]: I1205 23:07:21.800313 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:21 crc kubenswrapper[4747]: I1205 23:07:21.817989 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vf5vr"] Dec 05 23:07:21 crc kubenswrapper[4747]: I1205 23:07:21.840546 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np7ct\" (UniqueName: \"kubernetes.io/projected/cdc10e5d-5074-4846-b03b-86e4d3c94812-kube-api-access-np7ct\") pod \"community-operators-vf5vr\" (UID: \"cdc10e5d-5074-4846-b03b-86e4d3c94812\") " pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:21 crc kubenswrapper[4747]: I1205 23:07:21.840954 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc10e5d-5074-4846-b03b-86e4d3c94812-catalog-content\") pod \"community-operators-vf5vr\" (UID: \"cdc10e5d-5074-4846-b03b-86e4d3c94812\") " pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:21 crc kubenswrapper[4747]: I1205 23:07:21.841229 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc10e5d-5074-4846-b03b-86e4d3c94812-utilities\") pod \"community-operators-vf5vr\" (UID: \"cdc10e5d-5074-4846-b03b-86e4d3c94812\") " pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:21 crc kubenswrapper[4747]: I1205 23:07:21.944159 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np7ct\" (UniqueName: \"kubernetes.io/projected/cdc10e5d-5074-4846-b03b-86e4d3c94812-kube-api-access-np7ct\") pod \"community-operators-vf5vr\" (UID: \"cdc10e5d-5074-4846-b03b-86e4d3c94812\") " pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:21 crc kubenswrapper[4747]: I1205 23:07:21.944327 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc10e5d-5074-4846-b03b-86e4d3c94812-catalog-content\") pod \"community-operators-vf5vr\" (UID: \"cdc10e5d-5074-4846-b03b-86e4d3c94812\") " pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:21 crc kubenswrapper[4747]: I1205 23:07:21.944389 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc10e5d-5074-4846-b03b-86e4d3c94812-utilities\") pod \"community-operators-vf5vr\" (UID: \"cdc10e5d-5074-4846-b03b-86e4d3c94812\") " pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:21 crc kubenswrapper[4747]: I1205 23:07:21.944941 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cdc10e5d-5074-4846-b03b-86e4d3c94812-utilities\") pod \"community-operators-vf5vr\" (UID: \"cdc10e5d-5074-4846-b03b-86e4d3c94812\") " pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:21 crc kubenswrapper[4747]: I1205 23:07:21.945181 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc10e5d-5074-4846-b03b-86e4d3c94812-catalog-content\") pod \"community-operators-vf5vr\" (UID: \"cdc10e5d-5074-4846-b03b-86e4d3c94812\") " pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:21 crc kubenswrapper[4747]: I1205 23:07:21.963912 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np7ct\" (UniqueName: \"kubernetes.io/projected/cdc10e5d-5074-4846-b03b-86e4d3c94812-kube-api-access-np7ct\") pod \"community-operators-vf5vr\" (UID: \"cdc10e5d-5074-4846-b03b-86e4d3c94812\") " pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:22 crc kubenswrapper[4747]: I1205 23:07:22.135063 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:22 crc kubenswrapper[4747]: I1205 23:07:22.704259 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vf5vr"] Dec 05 23:07:22 crc kubenswrapper[4747]: W1205 23:07:22.713064 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdc10e5d_5074_4846_b03b_86e4d3c94812.slice/crio-71709bb1ed3b04204c410af23cf6b40d4c41107cdad5eaf8e52fafae6e659d34 WatchSource:0}: Error finding container 71709bb1ed3b04204c410af23cf6b40d4c41107cdad5eaf8e52fafae6e659d34: Status 404 returned error can't find the container with id 71709bb1ed3b04204c410af23cf6b40d4c41107cdad5eaf8e52fafae6e659d34 Dec 05 23:07:23 crc kubenswrapper[4747]: I1205 23:07:23.452922 4747 generic.go:334] "Generic (PLEG): container finished" podID="cdc10e5d-5074-4846-b03b-86e4d3c94812" containerID="bb9aeecb1c5473174f4951b84cc8376bc5e0e926adb7a5efe9fcaf46705a74ca" exitCode=0 Dec 05 23:07:23 crc kubenswrapper[4747]: I1205 23:07:23.453363 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vf5vr" event={"ID":"cdc10e5d-5074-4846-b03b-86e4d3c94812","Type":"ContainerDied","Data":"bb9aeecb1c5473174f4951b84cc8376bc5e0e926adb7a5efe9fcaf46705a74ca"} Dec 05 23:07:23 crc kubenswrapper[4747]: I1205 23:07:23.453521 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vf5vr" event={"ID":"cdc10e5d-5074-4846-b03b-86e4d3c94812","Type":"ContainerStarted","Data":"71709bb1ed3b04204c410af23cf6b40d4c41107cdad5eaf8e52fafae6e659d34"} Dec 05 23:07:23 crc kubenswrapper[4747]: I1205 23:07:23.458433 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 23:07:24 crc kubenswrapper[4747]: I1205 23:07:24.465178 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vf5vr" event={"ID":"cdc10e5d-5074-4846-b03b-86e4d3c94812","Type":"ContainerStarted","Data":"c3bad3322bfbadec82305e744f60384abca34d437c2074a597cb03c4a320d680"} Dec 05 23:07:25 crc kubenswrapper[4747]: I1205 23:07:25.474824 4747 generic.go:334] "Generic (PLEG): container finished" podID="cdc10e5d-5074-4846-b03b-86e4d3c94812" containerID="c3bad3322bfbadec82305e744f60384abca34d437c2074a597cb03c4a320d680" exitCode=0 
Dec 05 23:07:25 crc kubenswrapper[4747]: I1205 23:07:25.474942 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vf5vr" event={"ID":"cdc10e5d-5074-4846-b03b-86e4d3c94812","Type":"ContainerDied","Data":"c3bad3322bfbadec82305e744f60384abca34d437c2074a597cb03c4a320d680"} Dec 05 23:07:26 crc kubenswrapper[4747]: I1205 23:07:26.487402 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vf5vr" event={"ID":"cdc10e5d-5074-4846-b03b-86e4d3c94812","Type":"ContainerStarted","Data":"7022bed8ace2858518bb9333f1591382cbd74e86613c95f644da87ac53b00dfc"} Dec 05 23:07:26 crc kubenswrapper[4747]: I1205 23:07:26.513732 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vf5vr" podStartSLOduration=3.093122227 podStartE2EDuration="5.513713563s" podCreationTimestamp="2025-12-05 23:07:21 +0000 UTC" firstStartedPulling="2025-12-05 23:07:23.458119009 +0000 UTC m=+8713.925426507" lastFinishedPulling="2025-12-05 23:07:25.878710355 +0000 UTC m=+8716.346017843" observedRunningTime="2025-12-05 23:07:26.505611243 +0000 UTC m=+8716.972918721" watchObservedRunningTime="2025-12-05 23:07:26.513713563 +0000 UTC m=+8716.981021051" Dec 05 23:07:28 crc kubenswrapper[4747]: I1205 23:07:28.840341 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:07:28 crc kubenswrapper[4747]: E1205 23:07:28.842773 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:07:32 crc kubenswrapper[4747]: I1205 23:07:32.135418 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:32 crc kubenswrapper[4747]: I1205 23:07:32.135994 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:32 crc kubenswrapper[4747]: I1205 23:07:32.202300 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:32 crc kubenswrapper[4747]: I1205 23:07:32.644680 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:32 crc kubenswrapper[4747]: I1205 23:07:32.709288 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vf5vr"] Dec 05 23:07:34 crc kubenswrapper[4747]: I1205 23:07:34.588429 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vf5vr" podUID="cdc10e5d-5074-4846-b03b-86e4d3c94812" containerName="registry-server" containerID="cri-o://7022bed8ace2858518bb9333f1591382cbd74e86613c95f644da87ac53b00dfc" gracePeriod=2 Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.094431 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.177623 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc10e5d-5074-4846-b03b-86e4d3c94812-utilities\") pod \"cdc10e5d-5074-4846-b03b-86e4d3c94812\" (UID: \"cdc10e5d-5074-4846-b03b-86e4d3c94812\") " Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.177829 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc10e5d-5074-4846-b03b-86e4d3c94812-catalog-content\") pod \"cdc10e5d-5074-4846-b03b-86e4d3c94812\" (UID: \"cdc10e5d-5074-4846-b03b-86e4d3c94812\") " Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.177885 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np7ct\" (UniqueName: \"kubernetes.io/projected/cdc10e5d-5074-4846-b03b-86e4d3c94812-kube-api-access-np7ct\") pod \"cdc10e5d-5074-4846-b03b-86e4d3c94812\" (UID: \"cdc10e5d-5074-4846-b03b-86e4d3c94812\") " Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.178397 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdc10e5d-5074-4846-b03b-86e4d3c94812-utilities" (OuterVolumeSpecName: "utilities") pod "cdc10e5d-5074-4846-b03b-86e4d3c94812" (UID: "cdc10e5d-5074-4846-b03b-86e4d3c94812"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.185043 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdc10e5d-5074-4846-b03b-86e4d3c94812-kube-api-access-np7ct" (OuterVolumeSpecName: "kube-api-access-np7ct") pod "cdc10e5d-5074-4846-b03b-86e4d3c94812" (UID: "cdc10e5d-5074-4846-b03b-86e4d3c94812"). InnerVolumeSpecName "kube-api-access-np7ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.228803 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdc10e5d-5074-4846-b03b-86e4d3c94812-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cdc10e5d-5074-4846-b03b-86e4d3c94812" (UID: "cdc10e5d-5074-4846-b03b-86e4d3c94812"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.280119 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc10e5d-5074-4846-b03b-86e4d3c94812-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.280153 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np7ct\" (UniqueName: \"kubernetes.io/projected/cdc10e5d-5074-4846-b03b-86e4d3c94812-kube-api-access-np7ct\") on node \"crc\" DevicePath \"\"" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.280166 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc10e5d-5074-4846-b03b-86e4d3c94812-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.604571 4747 generic.go:334] "Generic (PLEG): container finished" podID="cdc10e5d-5074-4846-b03b-86e4d3c94812" containerID="7022bed8ace2858518bb9333f1591382cbd74e86613c95f644da87ac53b00dfc" exitCode=0 Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.604660 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vf5vr" event={"ID":"cdc10e5d-5074-4846-b03b-86e4d3c94812","Type":"ContainerDied","Data":"7022bed8ace2858518bb9333f1591382cbd74e86613c95f644da87ac53b00dfc"} Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.604673 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vf5vr" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.604704 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vf5vr" event={"ID":"cdc10e5d-5074-4846-b03b-86e4d3c94812","Type":"ContainerDied","Data":"71709bb1ed3b04204c410af23cf6b40d4c41107cdad5eaf8e52fafae6e659d34"} Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.604758 4747 scope.go:117] "RemoveContainer" containerID="7022bed8ace2858518bb9333f1591382cbd74e86613c95f644da87ac53b00dfc" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.662961 4747 scope.go:117] "RemoveContainer" containerID="c3bad3322bfbadec82305e744f60384abca34d437c2074a597cb03c4a320d680" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.665195 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vf5vr"] Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.677129 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vf5vr"] Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.696406 4747 scope.go:117] "RemoveContainer" containerID="bb9aeecb1c5473174f4951b84cc8376bc5e0e926adb7a5efe9fcaf46705a74ca" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.764175 4747 scope.go:117] "RemoveContainer" containerID="7022bed8ace2858518bb9333f1591382cbd74e86613c95f644da87ac53b00dfc" Dec 05 23:07:35 crc kubenswrapper[4747]: E1205 23:07:35.765354 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7022bed8ace2858518bb9333f1591382cbd74e86613c95f644da87ac53b00dfc\": container with ID starting with 7022bed8ace2858518bb9333f1591382cbd74e86613c95f644da87ac53b00dfc not found: ID does not exist" containerID="7022bed8ace2858518bb9333f1591382cbd74e86613c95f644da87ac53b00dfc" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.765411 
4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7022bed8ace2858518bb9333f1591382cbd74e86613c95f644da87ac53b00dfc"} err="failed to get container status \"7022bed8ace2858518bb9333f1591382cbd74e86613c95f644da87ac53b00dfc\": rpc error: code = NotFound desc = could not find container \"7022bed8ace2858518bb9333f1591382cbd74e86613c95f644da87ac53b00dfc\": container with ID starting with 7022bed8ace2858518bb9333f1591382cbd74e86613c95f644da87ac53b00dfc not found: ID does not exist" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.765447 4747 scope.go:117] "RemoveContainer" containerID="c3bad3322bfbadec82305e744f60384abca34d437c2074a597cb03c4a320d680" Dec 05 23:07:35 crc kubenswrapper[4747]: E1205 23:07:35.765873 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3bad3322bfbadec82305e744f60384abca34d437c2074a597cb03c4a320d680\": container with ID starting with c3bad3322bfbadec82305e744f60384abca34d437c2074a597cb03c4a320d680 not found: ID does not exist" containerID="c3bad3322bfbadec82305e744f60384abca34d437c2074a597cb03c4a320d680" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.765915 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3bad3322bfbadec82305e744f60384abca34d437c2074a597cb03c4a320d680"} err="failed to get container status \"c3bad3322bfbadec82305e744f60384abca34d437c2074a597cb03c4a320d680\": rpc error: code = NotFound desc = could not find container \"c3bad3322bfbadec82305e744f60384abca34d437c2074a597cb03c4a320d680\": container with ID starting with c3bad3322bfbadec82305e744f60384abca34d437c2074a597cb03c4a320d680 not found: ID does not exist" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.765941 4747 scope.go:117] "RemoveContainer" containerID="bb9aeecb1c5473174f4951b84cc8376bc5e0e926adb7a5efe9fcaf46705a74ca" Dec 05 23:07:35 crc kubenswrapper[4747]: E1205 23:07:35.766781 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb9aeecb1c5473174f4951b84cc8376bc5e0e926adb7a5efe9fcaf46705a74ca\": container with ID starting with bb9aeecb1c5473174f4951b84cc8376bc5e0e926adb7a5efe9fcaf46705a74ca not found: ID does not exist" containerID="bb9aeecb1c5473174f4951b84cc8376bc5e0e926adb7a5efe9fcaf46705a74ca" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.766827 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb9aeecb1c5473174f4951b84cc8376bc5e0e926adb7a5efe9fcaf46705a74ca"} err="failed to get container status \"bb9aeecb1c5473174f4951b84cc8376bc5e0e926adb7a5efe9fcaf46705a74ca\": rpc error: code = NotFound desc = could not find container \"bb9aeecb1c5473174f4951b84cc8376bc5e0e926adb7a5efe9fcaf46705a74ca\": container with ID starting with bb9aeecb1c5473174f4951b84cc8376bc5e0e926adb7a5efe9fcaf46705a74ca not found: ID does not exist" Dec 05 23:07:35 crc kubenswrapper[4747]: I1205 23:07:35.876700 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdc10e5d-5074-4846-b03b-86e4d3c94812" path="/var/lib/kubelet/pods/cdc10e5d-5074-4846-b03b-86e4d3c94812/volumes" Dec 05 23:07:36 crc kubenswrapper[4747]: I1205 23:07:36.616526 4747 generic.go:334] "Generic (PLEG): container finished" podID="b61e75e7-0701-45f0-a70e-8e660be43224" containerID="2cc123a67f86b9dce7d08c9b805c845b3575977c2425f42bc3524e591d0970d1" exitCode=0 Dec 05 23:07:36 crc kubenswrapper[4747]: 
I1205 23:07:36.616613 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-znq8d" event={"ID":"b61e75e7-0701-45f0-a70e-8e660be43224","Type":"ContainerDied","Data":"2cc123a67f86b9dce7d08c9b805c845b3575977c2425f42bc3524e591d0970d1"} Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.138768 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.252346 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-telemetry-combined-ca-bundle\") pod \"b61e75e7-0701-45f0-a70e-8e660be43224\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.252417 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ssh-key\") pod \"b61e75e7-0701-45f0-a70e-8e660be43224\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.252436 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-2\") pod \"b61e75e7-0701-45f0-a70e-8e660be43224\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.252638 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-0\") pod \"b61e75e7-0701-45f0-a70e-8e660be43224\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.252661 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j55mz\" (UniqueName: \"kubernetes.io/projected/b61e75e7-0701-45f0-a70e-8e660be43224-kube-api-access-j55mz\") pod \"b61e75e7-0701-45f0-a70e-8e660be43224\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.252678 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-inventory\") pod \"b61e75e7-0701-45f0-a70e-8e660be43224\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.252707 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-1\") pod \"b61e75e7-0701-45f0-a70e-8e660be43224\" (UID: \"b61e75e7-0701-45f0-a70e-8e660be43224\") " Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.258944 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b61e75e7-0701-45f0-a70e-8e660be43224-kube-api-access-j55mz" (OuterVolumeSpecName: "kube-api-access-j55mz") pod "b61e75e7-0701-45f0-a70e-8e660be43224" (UID: "b61e75e7-0701-45f0-a70e-8e660be43224"). InnerVolumeSpecName "kube-api-access-j55mz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.259976 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "b61e75e7-0701-45f0-a70e-8e660be43224" (UID: "b61e75e7-0701-45f0-a70e-8e660be43224"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.283868 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "b61e75e7-0701-45f0-a70e-8e660be43224" (UID: "b61e75e7-0701-45f0-a70e-8e660be43224"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.284068 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-inventory" (OuterVolumeSpecName: "inventory") pod "b61e75e7-0701-45f0-a70e-8e660be43224" (UID: "b61e75e7-0701-45f0-a70e-8e660be43224"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.298814 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b61e75e7-0701-45f0-a70e-8e660be43224" (UID: "b61e75e7-0701-45f0-a70e-8e660be43224"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.303709 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "b61e75e7-0701-45f0-a70e-8e660be43224" (UID: "b61e75e7-0701-45f0-a70e-8e660be43224"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.304609 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "b61e75e7-0701-45f0-a70e-8e660be43224" (UID: "b61e75e7-0701-45f0-a70e-8e660be43224"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.355157 4747 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.355196 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j55mz\" (UniqueName: \"kubernetes.io/projected/b61e75e7-0701-45f0-a70e-8e660be43224-kube-api-access-j55mz\") on node \"crc\" DevicePath \"\"" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.355216 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.355256 4747 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.355274 4747 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.355291 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.355307 4747 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b61e75e7-0701-45f0-a70e-8e660be43224-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.641485 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-znq8d" event={"ID":"b61e75e7-0701-45f0-a70e-8e660be43224","Type":"ContainerDied","Data":"2bdc37681dfe101b496f7483b29333d332ecfbbdec638e022aaf880fdca7841f"} Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.641826 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bdc37681dfe101b496f7483b29333d332ecfbbdec638e022aaf880fdca7841f" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.641560 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-znq8d" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.839799 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-fbvl9"] Dec 05 23:07:38 crc kubenswrapper[4747]: E1205 23:07:38.840284 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc10e5d-5074-4846-b03b-86e4d3c94812" containerName="extract-content" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.840299 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc10e5d-5074-4846-b03b-86e4d3c94812" containerName="extract-content" Dec 05 23:07:38 crc kubenswrapper[4747]: E1205 23:07:38.840335 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc10e5d-5074-4846-b03b-86e4d3c94812" containerName="extract-utilities" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.840344 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc10e5d-5074-4846-b03b-86e4d3c94812" containerName="extract-utilities" Dec 05 23:07:38 crc kubenswrapper[4747]: E1205 23:07:38.840359 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc10e5d-5074-4846-b03b-86e4d3c94812" containerName="registry-server" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.840367 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc10e5d-5074-4846-b03b-86e4d3c94812" containerName="registry-server" Dec 05 23:07:38 crc kubenswrapper[4747]: E1205 23:07:38.840389 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61e75e7-0701-45f0-a70e-8e660be43224" containerName="telemetry-openstack-openstack-cell1" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.840397 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61e75e7-0701-45f0-a70e-8e660be43224" containerName="telemetry-openstack-openstack-cell1" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.840639 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdc10e5d-5074-4846-b03b-86e4d3c94812" containerName="registry-server" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.840654 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b61e75e7-0701-45f0-a70e-8e660be43224" containerName="telemetry-openstack-openstack-cell1" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.841546 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.850181 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.850211 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.850513 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.850559 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.850634 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.852688 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-fbvl9"] Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.974373 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-fbvl9\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.974446 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-fbvl9\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.974494 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-fbvl9\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.974530 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsj47\" (UniqueName: \"kubernetes.io/projected/24ace32a-9918-4ef5-92e7-a5ef6419e908-kube-api-access-rsj47\") pod \"neutron-sriov-openstack-openstack-cell1-fbvl9\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:38 crc kubenswrapper[4747]: I1205 23:07:38.974653 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-fbvl9\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:39 crc kubenswrapper[4747]: I1205 23:07:39.077113 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-fbvl9\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:39 crc kubenswrapper[4747]: I1205 23:07:39.077205 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-fbvl9\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:39 crc kubenswrapper[4747]: I1205 23:07:39.077263 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-fbvl9\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:39 crc kubenswrapper[4747]: I1205 23:07:39.077293 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsj47\" (UniqueName: \"kubernetes.io/projected/24ace32a-9918-4ef5-92e7-a5ef6419e908-kube-api-access-rsj47\") pod \"neutron-sriov-openstack-openstack-cell1-fbvl9\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:39 crc kubenswrapper[4747]: I1205 23:07:39.077384 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-fbvl9\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:39 crc kubenswrapper[4747]: I1205 23:07:39.082526 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-fbvl9\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:39 crc kubenswrapper[4747]: I1205 23:07:39.083601 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-fbvl9\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:39 crc kubenswrapper[4747]: I1205 23:07:39.084098 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-fbvl9\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:39 crc kubenswrapper[4747]: I1205 23:07:39.085405 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-fbvl9\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:39 crc kubenswrapper[4747]: I1205 23:07:39.101944 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsj47\" (UniqueName: \"kubernetes.io/projected/24ace32a-9918-4ef5-92e7-a5ef6419e908-kube-api-access-rsj47\") pod \"neutron-sriov-openstack-openstack-cell1-fbvl9\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:39 crc kubenswrapper[4747]: I1205 23:07:39.173773 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:07:39 crc kubenswrapper[4747]: I1205 23:07:39.708332 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-fbvl9"] Dec 05 23:07:40 crc kubenswrapper[4747]: I1205 23:07:40.660360 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" event={"ID":"24ace32a-9918-4ef5-92e7-a5ef6419e908","Type":"ContainerStarted","Data":"62cad61894d782d9fdc375c67faa0b935fa9ed3c18f66cf4999d43b31c02d1c7"} Dec 05 23:07:40 crc kubenswrapper[4747]: I1205 23:07:40.660699 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" event={"ID":"24ace32a-9918-4ef5-92e7-a5ef6419e908","Type":"ContainerStarted","Data":"80cc34c4af3f4d12f55d0bccade8e58339a89c97ebb66666f78671b818a0d0c9"} Dec 05 23:07:40 crc kubenswrapper[4747]: I1205 23:07:40.684795 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" podStartSLOduration=2.133957867 podStartE2EDuration="2.684770185s" podCreationTimestamp="2025-12-05 23:07:38 +0000 UTC" firstStartedPulling="2025-12-05 23:07:39.714750359 +0000 UTC m=+8730.182057847" lastFinishedPulling="2025-12-05 23:07:40.265562647 +0000 UTC m=+8730.732870165" observedRunningTime="2025-12-05 23:07:40.674338817 +0000 UTC m=+8731.141646295" watchObservedRunningTime="2025-12-05 23:07:40.684770185 +0000 UTC m=+8731.152077673" Dec 05 23:07:40 crc kubenswrapper[4747]: I1205 23:07:40.840348 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:07:40 crc kubenswrapper[4747]: E1205 23:07:40.841009 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:07:53 crc kubenswrapper[4747]: I1205 23:07:53.852552 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:07:53 crc kubenswrapper[4747]: E1205 23:07:53.855878 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:08:06 crc kubenswrapper[4747]: I1205 23:08:06.841934 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:08:07 crc kubenswrapper[4747]: I1205 23:08:07.994924 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"ed426bd4bec4c9eaca98b78abec81641719b640a69f6f4b4896fb2b9e80d6d22"} Dec 05 23:08:09 crc kubenswrapper[4747]: I1205 23:08:09.490852 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kwmzf"] Dec 05 23:08:09 crc kubenswrapper[4747]: I1205 23:08:09.493656 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:09 crc kubenswrapper[4747]: I1205 23:08:09.513026 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwmzf"] Dec 05 23:08:09 crc kubenswrapper[4747]: I1205 23:08:09.629513 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19596389-9888-427f-b46d-6eb2d8380c7a-utilities\") pod \"redhat-marketplace-kwmzf\" (UID: \"19596389-9888-427f-b46d-6eb2d8380c7a\") " pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:09 crc kubenswrapper[4747]: I1205 23:08:09.629645 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19596389-9888-427f-b46d-6eb2d8380c7a-catalog-content\") pod \"redhat-marketplace-kwmzf\" (UID: \"19596389-9888-427f-b46d-6eb2d8380c7a\") " pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:09 crc kubenswrapper[4747]: I1205 23:08:09.629734 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r62vg\" (UniqueName: \"kubernetes.io/projected/19596389-9888-427f-b46d-6eb2d8380c7a-kube-api-access-r62vg\") pod \"redhat-marketplace-kwmzf\" (UID: \"19596389-9888-427f-b46d-6eb2d8380c7a\") " pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:09 crc kubenswrapper[4747]: I1205 23:08:09.732839 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19596389-9888-427f-b46d-6eb2d8380c7a-utilities\") pod \"redhat-marketplace-kwmzf\" (UID: \"19596389-9888-427f-b46d-6eb2d8380c7a\") " pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:09 crc kubenswrapper[4747]: I1205 23:08:09.732901 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19596389-9888-427f-b46d-6eb2d8380c7a-catalog-content\") pod \"redhat-marketplace-kwmzf\" (UID: \"19596389-9888-427f-b46d-6eb2d8380c7a\") " pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:09 crc kubenswrapper[4747]: I1205 23:08:09.732967 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r62vg\" (UniqueName: 
\"kubernetes.io/projected/19596389-9888-427f-b46d-6eb2d8380c7a-kube-api-access-r62vg\") pod \"redhat-marketplace-kwmzf\" (UID: \"19596389-9888-427f-b46d-6eb2d8380c7a\") " pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:09 crc kubenswrapper[4747]: I1205 23:08:09.734067 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19596389-9888-427f-b46d-6eb2d8380c7a-utilities\") pod \"redhat-marketplace-kwmzf\" (UID: \"19596389-9888-427f-b46d-6eb2d8380c7a\") " pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:09 crc kubenswrapper[4747]: I1205 23:08:09.734372 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19596389-9888-427f-b46d-6eb2d8380c7a-catalog-content\") pod \"redhat-marketplace-kwmzf\" (UID: \"19596389-9888-427f-b46d-6eb2d8380c7a\") " pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:09 crc kubenswrapper[4747]: I1205 23:08:09.767438 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r62vg\" (UniqueName: \"kubernetes.io/projected/19596389-9888-427f-b46d-6eb2d8380c7a-kube-api-access-r62vg\") pod \"redhat-marketplace-kwmzf\" (UID: \"19596389-9888-427f-b46d-6eb2d8380c7a\") " pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:09 crc kubenswrapper[4747]: I1205 23:08:09.818980 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:10 crc kubenswrapper[4747]: I1205 23:08:10.354517 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwmzf"] Dec 05 23:08:11 crc kubenswrapper[4747]: W1205 23:08:11.285793 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19596389_9888_427f_b46d_6eb2d8380c7a.slice/crio-961218f52a46ccefa239a7730dc58580ef32b387cac8be7b665d9a836bd305a0 WatchSource:0}: Error finding container 961218f52a46ccefa239a7730dc58580ef32b387cac8be7b665d9a836bd305a0: Status 404 returned error can't find the container with id 961218f52a46ccefa239a7730dc58580ef32b387cac8be7b665d9a836bd305a0 Dec 05 23:08:12 crc kubenswrapper[4747]: I1205 23:08:12.061230 4747 generic.go:334] "Generic (PLEG): container finished" podID="19596389-9888-427f-b46d-6eb2d8380c7a" containerID="ac4185f250b342ba75a9490d66e0212a0051902166e574d4b1c4aab8204de634" exitCode=0 Dec 05 23:08:12 crc kubenswrapper[4747]: I1205 23:08:12.061293 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwmzf" event={"ID":"19596389-9888-427f-b46d-6eb2d8380c7a","Type":"ContainerDied","Data":"ac4185f250b342ba75a9490d66e0212a0051902166e574d4b1c4aab8204de634"} Dec 05 23:08:12 crc kubenswrapper[4747]: I1205 23:08:12.061782 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwmzf" event={"ID":"19596389-9888-427f-b46d-6eb2d8380c7a","Type":"ContainerStarted","Data":"961218f52a46ccefa239a7730dc58580ef32b387cac8be7b665d9a836bd305a0"} Dec 05 23:08:14 crc kubenswrapper[4747]: I1205 23:08:14.097290 4747 generic.go:334] "Generic (PLEG): container finished" podID="19596389-9888-427f-b46d-6eb2d8380c7a" containerID="c2f8fba7389f91ee9160a660e2853ddf88565fc53737c8eac70fb153944d962c" exitCode=0 Dec 05 23:08:14 crc kubenswrapper[4747]: I1205 23:08:14.097641 4747 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-kwmzf" event={"ID":"19596389-9888-427f-b46d-6eb2d8380c7a","Type":"ContainerDied","Data":"c2f8fba7389f91ee9160a660e2853ddf88565fc53737c8eac70fb153944d962c"} Dec 05 23:08:15 crc kubenswrapper[4747]: I1205 23:08:15.108184 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwmzf" event={"ID":"19596389-9888-427f-b46d-6eb2d8380c7a","Type":"ContainerStarted","Data":"f36bbf19a441d8aee26edc2f115a20f982c4bfd62b4958b445f67d1e3ef4d1fb"} Dec 05 23:08:15 crc kubenswrapper[4747]: I1205 23:08:15.132981 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kwmzf" podStartSLOduration=3.712132397 podStartE2EDuration="6.132958679s" podCreationTimestamp="2025-12-05 23:08:09 +0000 UTC" firstStartedPulling="2025-12-05 23:08:12.063559203 +0000 UTC m=+8762.530866701" lastFinishedPulling="2025-12-05 23:08:14.484385495 +0000 UTC m=+8764.951692983" observedRunningTime="2025-12-05 23:08:15.122313716 +0000 UTC m=+8765.589621214" watchObservedRunningTime="2025-12-05 23:08:15.132958679 +0000 UTC m=+8765.600266167" Dec 05 23:08:19 crc kubenswrapper[4747]: I1205 23:08:19.820078 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:19 crc kubenswrapper[4747]: I1205 23:08:19.820781 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:19 crc kubenswrapper[4747]: I1205 23:08:19.905304 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:20 crc kubenswrapper[4747]: I1205 23:08:20.208780 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:20 crc kubenswrapper[4747]: I1205 23:08:20.257932 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwmzf"] Dec 05 23:08:22 crc kubenswrapper[4747]: I1205 23:08:22.189879 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kwmzf" podUID="19596389-9888-427f-b46d-6eb2d8380c7a" containerName="registry-server" containerID="cri-o://f36bbf19a441d8aee26edc2f115a20f982c4bfd62b4958b445f67d1e3ef4d1fb" gracePeriod=2 Dec 05 23:08:22 crc kubenswrapper[4747]: I1205 23:08:22.798038 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:22 crc kubenswrapper[4747]: I1205 23:08:22.871028 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r62vg\" (UniqueName: \"kubernetes.io/projected/19596389-9888-427f-b46d-6eb2d8380c7a-kube-api-access-r62vg\") pod \"19596389-9888-427f-b46d-6eb2d8380c7a\" (UID: \"19596389-9888-427f-b46d-6eb2d8380c7a\") " Dec 05 23:08:22 crc kubenswrapper[4747]: I1205 23:08:22.871284 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19596389-9888-427f-b46d-6eb2d8380c7a-utilities\") pod \"19596389-9888-427f-b46d-6eb2d8380c7a\" (UID: \"19596389-9888-427f-b46d-6eb2d8380c7a\") " Dec 05 23:08:22 crc kubenswrapper[4747]: I1205 23:08:22.871492 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19596389-9888-427f-b46d-6eb2d8380c7a-catalog-content\") pod \"19596389-9888-427f-b46d-6eb2d8380c7a\" (UID: \"19596389-9888-427f-b46d-6eb2d8380c7a\") " Dec 05 23:08:22 crc kubenswrapper[4747]: I1205 23:08:22.873989 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19596389-9888-427f-b46d-6eb2d8380c7a-utilities" (OuterVolumeSpecName: "utilities") pod "19596389-9888-427f-b46d-6eb2d8380c7a" (UID: "19596389-9888-427f-b46d-6eb2d8380c7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:08:22 crc kubenswrapper[4747]: I1205 23:08:22.882800 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19596389-9888-427f-b46d-6eb2d8380c7a-kube-api-access-r62vg" (OuterVolumeSpecName: "kube-api-access-r62vg") pod "19596389-9888-427f-b46d-6eb2d8380c7a" (UID: "19596389-9888-427f-b46d-6eb2d8380c7a"). InnerVolumeSpecName "kube-api-access-r62vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:08:22 crc kubenswrapper[4747]: I1205 23:08:22.890426 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19596389-9888-427f-b46d-6eb2d8380c7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19596389-9888-427f-b46d-6eb2d8380c7a" (UID: "19596389-9888-427f-b46d-6eb2d8380c7a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:08:22 crc kubenswrapper[4747]: I1205 23:08:22.974446 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r62vg\" (UniqueName: \"kubernetes.io/projected/19596389-9888-427f-b46d-6eb2d8380c7a-kube-api-access-r62vg\") on node \"crc\" DevicePath \"\"" Dec 05 23:08:22 crc kubenswrapper[4747]: I1205 23:08:22.974492 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19596389-9888-427f-b46d-6eb2d8380c7a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:08:22 crc kubenswrapper[4747]: I1205 23:08:22.974505 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19596389-9888-427f-b46d-6eb2d8380c7a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.207356 4747 generic.go:334] "Generic (PLEG): container finished" podID="19596389-9888-427f-b46d-6eb2d8380c7a" containerID="f36bbf19a441d8aee26edc2f115a20f982c4bfd62b4958b445f67d1e3ef4d1fb" exitCode=0 Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.207441 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwmzf" event={"ID":"19596389-9888-427f-b46d-6eb2d8380c7a","Type":"ContainerDied","Data":"f36bbf19a441d8aee26edc2f115a20f982c4bfd62b4958b445f67d1e3ef4d1fb"} Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.207729 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwmzf" event={"ID":"19596389-9888-427f-b46d-6eb2d8380c7a","Type":"ContainerDied","Data":"961218f52a46ccefa239a7730dc58580ef32b387cac8be7b665d9a836bd305a0"} Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.207456 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwmzf" Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.207756 4747 scope.go:117] "RemoveContainer" containerID="f36bbf19a441d8aee26edc2f115a20f982c4bfd62b4958b445f67d1e3ef4d1fb" Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.253894 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwmzf"] Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.267813 4747 scope.go:117] "RemoveContainer" containerID="c2f8fba7389f91ee9160a660e2853ddf88565fc53737c8eac70fb153944d962c" Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.267838 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwmzf"] Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.293789 4747 scope.go:117] "RemoveContainer" containerID="ac4185f250b342ba75a9490d66e0212a0051902166e574d4b1c4aab8204de634" Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.345139 4747 scope.go:117] "RemoveContainer" containerID="f36bbf19a441d8aee26edc2f115a20f982c4bfd62b4958b445f67d1e3ef4d1fb" Dec 05 23:08:23 crc kubenswrapper[4747]: E1205 23:08:23.347024 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f36bbf19a441d8aee26edc2f115a20f982c4bfd62b4958b445f67d1e3ef4d1fb\": container with ID starting with f36bbf19a441d8aee26edc2f115a20f982c4bfd62b4958b445f67d1e3ef4d1fb not found: ID does not exist" containerID="f36bbf19a441d8aee26edc2f115a20f982c4bfd62b4958b445f67d1e3ef4d1fb" Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.347122 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f36bbf19a441d8aee26edc2f115a20f982c4bfd62b4958b445f67d1e3ef4d1fb"} err="failed to get container status \"f36bbf19a441d8aee26edc2f115a20f982c4bfd62b4958b445f67d1e3ef4d1fb\": rpc error: code = NotFound desc = could not find container \"f36bbf19a441d8aee26edc2f115a20f982c4bfd62b4958b445f67d1e3ef4d1fb\": container with ID starting with f36bbf19a441d8aee26edc2f115a20f982c4bfd62b4958b445f67d1e3ef4d1fb not found: ID does not exist" Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.347191 4747 scope.go:117] "RemoveContainer" containerID="c2f8fba7389f91ee9160a660e2853ddf88565fc53737c8eac70fb153944d962c" Dec 05 23:08:23 crc kubenswrapper[4747]: E1205 23:08:23.349790 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2f8fba7389f91ee9160a660e2853ddf88565fc53737c8eac70fb153944d962c\": container with ID starting with c2f8fba7389f91ee9160a660e2853ddf88565fc53737c8eac70fb153944d962c not found: ID does not exist" containerID="c2f8fba7389f91ee9160a660e2853ddf88565fc53737c8eac70fb153944d962c" Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.349842 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2f8fba7389f91ee9160a660e2853ddf88565fc53737c8eac70fb153944d962c"} err="failed to get container status \"c2f8fba7389f91ee9160a660e2853ddf88565fc53737c8eac70fb153944d962c\": rpc error: code = NotFound desc = could not find container \"c2f8fba7389f91ee9160a660e2853ddf88565fc53737c8eac70fb153944d962c\": container with ID starting with c2f8fba7389f91ee9160a660e2853ddf88565fc53737c8eac70fb153944d962c not found: ID does not exist" Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.349864 4747 scope.go:117] "RemoveContainer" 
containerID="ac4185f250b342ba75a9490d66e0212a0051902166e574d4b1c4aab8204de634" Dec 05 23:08:23 crc kubenswrapper[4747]: E1205 23:08:23.350125 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac4185f250b342ba75a9490d66e0212a0051902166e574d4b1c4aab8204de634\": container with ID starting with ac4185f250b342ba75a9490d66e0212a0051902166e574d4b1c4aab8204de634 not found: ID does not exist" containerID="ac4185f250b342ba75a9490d66e0212a0051902166e574d4b1c4aab8204de634" Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.350151 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac4185f250b342ba75a9490d66e0212a0051902166e574d4b1c4aab8204de634"} err="failed to get container status \"ac4185f250b342ba75a9490d66e0212a0051902166e574d4b1c4aab8204de634\": rpc error: code = NotFound desc = could not find container \"ac4185f250b342ba75a9490d66e0212a0051902166e574d4b1c4aab8204de634\": container with ID starting with ac4185f250b342ba75a9490d66e0212a0051902166e574d4b1c4aab8204de634 not found: ID does not exist" Dec 05 23:08:23 crc kubenswrapper[4747]: I1205 23:08:23.858272 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19596389-9888-427f-b46d-6eb2d8380c7a" path="/var/lib/kubelet/pods/19596389-9888-427f-b46d-6eb2d8380c7a/volumes" Dec 05 23:08:38 crc kubenswrapper[4747]: I1205 23:08:38.969146 4747 scope.go:117] "RemoveContainer" containerID="a8974e81447a620e91b96df575a74e01bd12507674253cefe288792cda7670a4" Dec 05 23:08:39 crc kubenswrapper[4747]: I1205 23:08:39.005459 4747 scope.go:117] "RemoveContainer" containerID="b1e6dfd7f4c0b3f5c85951feb23c20c667ed7251a12eb6759dcdfbc403e25046" Dec 05 23:08:39 crc kubenswrapper[4747]: I1205 23:08:39.027268 4747 scope.go:117] "RemoveContainer" containerID="1c00ca8fa1fecf7ca174e42315db8c59f0d27029d07030fe500f39fa3427b981" Dec 05 23:09:19 crc kubenswrapper[4747]: I1205 23:09:19.922457 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d6vp2"] Dec 05 23:09:19 crc kubenswrapper[4747]: E1205 23:09:19.923455 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19596389-9888-427f-b46d-6eb2d8380c7a" containerName="registry-server" Dec 05 23:09:19 crc kubenswrapper[4747]: I1205 23:09:19.923470 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="19596389-9888-427f-b46d-6eb2d8380c7a" containerName="registry-server" Dec 05 23:09:19 crc kubenswrapper[4747]: E1205 23:09:19.923507 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19596389-9888-427f-b46d-6eb2d8380c7a" containerName="extract-utilities" Dec 05 23:09:19 crc kubenswrapper[4747]: I1205 23:09:19.923516 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="19596389-9888-427f-b46d-6eb2d8380c7a" containerName="extract-utilities" Dec 05 23:09:19 crc kubenswrapper[4747]: E1205 23:09:19.923534 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19596389-9888-427f-b46d-6eb2d8380c7a" containerName="extract-content" Dec 05 23:09:19 crc kubenswrapper[4747]: I1205 23:09:19.923542 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="19596389-9888-427f-b46d-6eb2d8380c7a" containerName="extract-content" Dec 05 23:09:19 crc kubenswrapper[4747]: I1205 23:09:19.923836 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="19596389-9888-427f-b46d-6eb2d8380c7a" containerName="registry-server" Dec 05 23:09:19 crc kubenswrapper[4747]: I1205 23:09:19.925986 
4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:19 crc kubenswrapper[4747]: I1205 23:09:19.962434 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d6vp2"] Dec 05 23:09:20 crc kubenswrapper[4747]: I1205 23:09:20.115326 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbk2z\" (UniqueName: \"kubernetes.io/projected/a9ed003a-9195-4428-9cc7-cd630cd05fe5-kube-api-access-dbk2z\") pod \"redhat-operators-d6vp2\" (UID: \"a9ed003a-9195-4428-9cc7-cd630cd05fe5\") " pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:20 crc kubenswrapper[4747]: I1205 23:09:20.115407 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9ed003a-9195-4428-9cc7-cd630cd05fe5-catalog-content\") pod \"redhat-operators-d6vp2\" (UID: \"a9ed003a-9195-4428-9cc7-cd630cd05fe5\") " pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:20 crc kubenswrapper[4747]: I1205 23:09:20.115445 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9ed003a-9195-4428-9cc7-cd630cd05fe5-utilities\") pod \"redhat-operators-d6vp2\" (UID: \"a9ed003a-9195-4428-9cc7-cd630cd05fe5\") " pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:20 crc kubenswrapper[4747]: I1205 23:09:20.217345 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbk2z\" (UniqueName: \"kubernetes.io/projected/a9ed003a-9195-4428-9cc7-cd630cd05fe5-kube-api-access-dbk2z\") pod \"redhat-operators-d6vp2\" (UID: \"a9ed003a-9195-4428-9cc7-cd630cd05fe5\") " pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:20 crc kubenswrapper[4747]: I1205 23:09:20.217416 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9ed003a-9195-4428-9cc7-cd630cd05fe5-catalog-content\") pod \"redhat-operators-d6vp2\" (UID: \"a9ed003a-9195-4428-9cc7-cd630cd05fe5\") " pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:20 crc kubenswrapper[4747]: I1205 23:09:20.217455 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9ed003a-9195-4428-9cc7-cd630cd05fe5-utilities\") pod \"redhat-operators-d6vp2\" (UID: \"a9ed003a-9195-4428-9cc7-cd630cd05fe5\") " pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:20 crc kubenswrapper[4747]: I1205 23:09:20.217937 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9ed003a-9195-4428-9cc7-cd630cd05fe5-utilities\") pod \"redhat-operators-d6vp2\" (UID: \"a9ed003a-9195-4428-9cc7-cd630cd05fe5\") " pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:20 crc kubenswrapper[4747]: I1205 23:09:20.218459 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9ed003a-9195-4428-9cc7-cd630cd05fe5-catalog-content\") pod \"redhat-operators-d6vp2\" (UID: \"a9ed003a-9195-4428-9cc7-cd630cd05fe5\") " pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:20 crc kubenswrapper[4747]: I1205 23:09:20.237780 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dbk2z\" (UniqueName: \"kubernetes.io/projected/a9ed003a-9195-4428-9cc7-cd630cd05fe5-kube-api-access-dbk2z\") pod \"redhat-operators-d6vp2\" (UID: \"a9ed003a-9195-4428-9cc7-cd630cd05fe5\") " pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:20 crc kubenswrapper[4747]: I1205 23:09:20.267285 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:20 crc kubenswrapper[4747]: I1205 23:09:20.764847 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d6vp2"] Dec 05 23:09:21 crc kubenswrapper[4747]: I1205 23:09:21.457658 4747 generic.go:334] "Generic (PLEG): container finished" podID="a9ed003a-9195-4428-9cc7-cd630cd05fe5" containerID="92f386d4fc7b918017142c4dc31a853c9e0bb93569ea3619b7e057e9ef68feff" exitCode=0 Dec 05 23:09:21 crc kubenswrapper[4747]: I1205 23:09:21.457762 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6vp2" event={"ID":"a9ed003a-9195-4428-9cc7-cd630cd05fe5","Type":"ContainerDied","Data":"92f386d4fc7b918017142c4dc31a853c9e0bb93569ea3619b7e057e9ef68feff"} Dec 05 23:09:21 crc kubenswrapper[4747]: I1205 23:09:21.457997 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6vp2" event={"ID":"a9ed003a-9195-4428-9cc7-cd630cd05fe5","Type":"ContainerStarted","Data":"a768f4a9df0a55b0c88a9be9a2d0b69b9d13391528d79a56140656b634ddff03"} Dec 05 23:09:22 crc kubenswrapper[4747]: I1205 23:09:22.471236 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6vp2" event={"ID":"a9ed003a-9195-4428-9cc7-cd630cd05fe5","Type":"ContainerStarted","Data":"8f5a14f0ddba907eaef7dffc2e5261f9547f5f71db94b8b3b2e58fb187660706"} Dec 05 23:09:24 crc kubenswrapper[4747]: I1205 23:09:24.491419 4747 generic.go:334] "Generic (PLEG): container finished" podID="a9ed003a-9195-4428-9cc7-cd630cd05fe5" containerID="8f5a14f0ddba907eaef7dffc2e5261f9547f5f71db94b8b3b2e58fb187660706" exitCode=0 Dec 05 23:09:24 crc kubenswrapper[4747]: I1205 23:09:24.491506 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6vp2" event={"ID":"a9ed003a-9195-4428-9cc7-cd630cd05fe5","Type":"ContainerDied","Data":"8f5a14f0ddba907eaef7dffc2e5261f9547f5f71db94b8b3b2e58fb187660706"} Dec 05 23:09:25 crc kubenswrapper[4747]: I1205 23:09:25.502985 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6vp2" event={"ID":"a9ed003a-9195-4428-9cc7-cd630cd05fe5","Type":"ContainerStarted","Data":"85ba1cfece01eb04f2026cc64233d2db1088524a442b8bbabe84a6771b670457"} Dec 05 23:09:25 crc kubenswrapper[4747]: I1205 23:09:25.523990 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d6vp2" podStartSLOduration=3.078548677 podStartE2EDuration="6.523974912s" podCreationTimestamp="2025-12-05 23:09:19 +0000 UTC" firstStartedPulling="2025-12-05 23:09:21.459987684 +0000 UTC m=+8831.927295182" lastFinishedPulling="2025-12-05 23:09:24.905413919 +0000 UTC m=+8835.372721417" observedRunningTime="2025-12-05 23:09:25.518091286 +0000 UTC m=+8835.985398814" watchObservedRunningTime="2025-12-05 23:09:25.523974912 +0000 UTC m=+8835.991282400" Dec 05 23:09:30 crc kubenswrapper[4747]: I1205 23:09:30.268642 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:30 crc kubenswrapper[4747]: I1205 23:09:30.269552 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:31 crc kubenswrapper[4747]: I1205 23:09:31.330427 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d6vp2" podUID="a9ed003a-9195-4428-9cc7-cd630cd05fe5" containerName="registry-server" probeResult="failure" output=< Dec 05 23:09:31 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Dec 05 23:09:31 crc kubenswrapper[4747]: > Dec 05 23:09:40 crc kubenswrapper[4747]: I1205 23:09:40.362918 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:40 crc kubenswrapper[4747]: I1205 23:09:40.445447 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:40 crc kubenswrapper[4747]: I1205 23:09:40.619792 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d6vp2"] Dec 05 23:09:41 crc kubenswrapper[4747]: I1205 23:09:41.752087 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d6vp2" podUID="a9ed003a-9195-4428-9cc7-cd630cd05fe5" containerName="registry-server" containerID="cri-o://85ba1cfece01eb04f2026cc64233d2db1088524a442b8bbabe84a6771b670457" gracePeriod=2 Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.393325 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.510750 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9ed003a-9195-4428-9cc7-cd630cd05fe5-catalog-content\") pod \"a9ed003a-9195-4428-9cc7-cd630cd05fe5\" (UID: \"a9ed003a-9195-4428-9cc7-cd630cd05fe5\") " Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.510882 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbk2z\" (UniqueName: \"kubernetes.io/projected/a9ed003a-9195-4428-9cc7-cd630cd05fe5-kube-api-access-dbk2z\") pod \"a9ed003a-9195-4428-9cc7-cd630cd05fe5\" (UID: \"a9ed003a-9195-4428-9cc7-cd630cd05fe5\") " Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.510958 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9ed003a-9195-4428-9cc7-cd630cd05fe5-utilities\") pod \"a9ed003a-9195-4428-9cc7-cd630cd05fe5\" (UID: \"a9ed003a-9195-4428-9cc7-cd630cd05fe5\") " Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.511824 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9ed003a-9195-4428-9cc7-cd630cd05fe5-utilities" (OuterVolumeSpecName: "utilities") pod "a9ed003a-9195-4428-9cc7-cd630cd05fe5" (UID: "a9ed003a-9195-4428-9cc7-cd630cd05fe5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.512336 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9ed003a-9195-4428-9cc7-cd630cd05fe5-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.520724 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ed003a-9195-4428-9cc7-cd630cd05fe5-kube-api-access-dbk2z" (OuterVolumeSpecName: "kube-api-access-dbk2z") pod "a9ed003a-9195-4428-9cc7-cd630cd05fe5" (UID: "a9ed003a-9195-4428-9cc7-cd630cd05fe5"). InnerVolumeSpecName "kube-api-access-dbk2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.622456 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbk2z\" (UniqueName: \"kubernetes.io/projected/a9ed003a-9195-4428-9cc7-cd630cd05fe5-kube-api-access-dbk2z\") on node \"crc\" DevicePath \"\"" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.642476 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9ed003a-9195-4428-9cc7-cd630cd05fe5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9ed003a-9195-4428-9cc7-cd630cd05fe5" (UID: "a9ed003a-9195-4428-9cc7-cd630cd05fe5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.724522 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9ed003a-9195-4428-9cc7-cd630cd05fe5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.768670 4747 generic.go:334] "Generic (PLEG): container finished" podID="a9ed003a-9195-4428-9cc7-cd630cd05fe5" containerID="85ba1cfece01eb04f2026cc64233d2db1088524a442b8bbabe84a6771b670457" exitCode=0 Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.768766 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d6vp2" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.768791 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6vp2" event={"ID":"a9ed003a-9195-4428-9cc7-cd630cd05fe5","Type":"ContainerDied","Data":"85ba1cfece01eb04f2026cc64233d2db1088524a442b8bbabe84a6771b670457"} Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.769164 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6vp2" event={"ID":"a9ed003a-9195-4428-9cc7-cd630cd05fe5","Type":"ContainerDied","Data":"a768f4a9df0a55b0c88a9be9a2d0b69b9d13391528d79a56140656b634ddff03"} Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.769187 4747 scope.go:117] "RemoveContainer" containerID="85ba1cfece01eb04f2026cc64233d2db1088524a442b8bbabe84a6771b670457" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.805723 4747 scope.go:117] "RemoveContainer" containerID="8f5a14f0ddba907eaef7dffc2e5261f9547f5f71db94b8b3b2e58fb187660706" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.813639 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d6vp2"] Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.822419 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d6vp2"] Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.826209 4747 scope.go:117] "RemoveContainer" containerID="92f386d4fc7b918017142c4dc31a853c9e0bb93569ea3619b7e057e9ef68feff" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.883247 4747 scope.go:117] "RemoveContainer" containerID="85ba1cfece01eb04f2026cc64233d2db1088524a442b8bbabe84a6771b670457" Dec 05 23:09:42 crc kubenswrapper[4747]: E1205 23:09:42.883775 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85ba1cfece01eb04f2026cc64233d2db1088524a442b8bbabe84a6771b670457\": container with ID starting with 85ba1cfece01eb04f2026cc64233d2db1088524a442b8bbabe84a6771b670457 not found: ID does not exist" containerID="85ba1cfece01eb04f2026cc64233d2db1088524a442b8bbabe84a6771b670457" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.883834 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ba1cfece01eb04f2026cc64233d2db1088524a442b8bbabe84a6771b670457"} err="failed to get container status \"85ba1cfece01eb04f2026cc64233d2db1088524a442b8bbabe84a6771b670457\": rpc error: code = NotFound desc = could not find container \"85ba1cfece01eb04f2026cc64233d2db1088524a442b8bbabe84a6771b670457\": container with ID starting with 85ba1cfece01eb04f2026cc64233d2db1088524a442b8bbabe84a6771b670457 not found: ID does not exist" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.883868 4747 scope.go:117] "RemoveContainer" containerID="8f5a14f0ddba907eaef7dffc2e5261f9547f5f71db94b8b3b2e58fb187660706" Dec 05 23:09:42 crc kubenswrapper[4747]: E1205 23:09:42.884329 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f5a14f0ddba907eaef7dffc2e5261f9547f5f71db94b8b3b2e58fb187660706\": container with ID starting with 8f5a14f0ddba907eaef7dffc2e5261f9547f5f71db94b8b3b2e58fb187660706 not found: ID does not exist" containerID="8f5a14f0ddba907eaef7dffc2e5261f9547f5f71db94b8b3b2e58fb187660706" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.884357 4747 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f5a14f0ddba907eaef7dffc2e5261f9547f5f71db94b8b3b2e58fb187660706"} err="failed to get container status \"8f5a14f0ddba907eaef7dffc2e5261f9547f5f71db94b8b3b2e58fb187660706\": rpc error: code = NotFound desc = could not find container \"8f5a14f0ddba907eaef7dffc2e5261f9547f5f71db94b8b3b2e58fb187660706\": container with ID starting with 8f5a14f0ddba907eaef7dffc2e5261f9547f5f71db94b8b3b2e58fb187660706 not found: ID does not exist" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.884377 4747 scope.go:117] "RemoveContainer" containerID="92f386d4fc7b918017142c4dc31a853c9e0bb93569ea3619b7e057e9ef68feff" Dec 05 23:09:42 crc kubenswrapper[4747]: E1205 23:09:42.884898 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92f386d4fc7b918017142c4dc31a853c9e0bb93569ea3619b7e057e9ef68feff\": container with ID starting with 92f386d4fc7b918017142c4dc31a853c9e0bb93569ea3619b7e057e9ef68feff not found: ID does not exist" containerID="92f386d4fc7b918017142c4dc31a853c9e0bb93569ea3619b7e057e9ef68feff" Dec 05 23:09:42 crc kubenswrapper[4747]: I1205 23:09:42.884926 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92f386d4fc7b918017142c4dc31a853c9e0bb93569ea3619b7e057e9ef68feff"} err="failed to get container status \"92f386d4fc7b918017142c4dc31a853c9e0bb93569ea3619b7e057e9ef68feff\": rpc error: code = NotFound desc = could not find container \"92f386d4fc7b918017142c4dc31a853c9e0bb93569ea3619b7e057e9ef68feff\": container with ID starting with 92f386d4fc7b918017142c4dc31a853c9e0bb93569ea3619b7e057e9ef68feff not found: ID does not exist" Dec 05 23:09:43 crc kubenswrapper[4747]: I1205 23:09:43.862949 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9ed003a-9195-4428-9cc7-cd630cd05fe5" path="/var/lib/kubelet/pods/a9ed003a-9195-4428-9cc7-cd630cd05fe5/volumes" Dec 05 23:10:36 crc kubenswrapper[4747]: I1205 23:10:36.222413 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:10:36 crc kubenswrapper[4747]: I1205 23:10:36.223269 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:11:06 crc kubenswrapper[4747]: I1205 23:11:06.222559 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:11:06 crc kubenswrapper[4747]: I1205 23:11:06.223265 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:11:36 crc kubenswrapper[4747]: I1205 
23:11:36.221677 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:11:36 crc kubenswrapper[4747]: I1205 23:11:36.222360 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:11:36 crc kubenswrapper[4747]: I1205 23:11:36.222421 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 23:11:36 crc kubenswrapper[4747]: I1205 23:11:36.223276 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed426bd4bec4c9eaca98b78abec81641719b640a69f6f4b4896fb2b9e80d6d22"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:11:36 crc kubenswrapper[4747]: I1205 23:11:36.223335 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://ed426bd4bec4c9eaca98b78abec81641719b640a69f6f4b4896fb2b9e80d6d22" gracePeriod=600 Dec 05 23:11:36 crc kubenswrapper[4747]: I1205 23:11:36.900650 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="ed426bd4bec4c9eaca98b78abec81641719b640a69f6f4b4896fb2b9e80d6d22" exitCode=0 Dec 05 23:11:36 crc kubenswrapper[4747]: I1205 23:11:36.901043 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"ed426bd4bec4c9eaca98b78abec81641719b640a69f6f4b4896fb2b9e80d6d22"} Dec 05 23:11:36 crc kubenswrapper[4747]: I1205 23:11:36.901083 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c"} Dec 05 23:11:36 crc kubenswrapper[4747]: I1205 23:11:36.901131 4747 scope.go:117] "RemoveContainer" containerID="2893e5b55617a5932ee5aa5870a559cb2a614c10ff37002f6a119814972b44e2" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.450888 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-72r4v"] Dec 05 23:11:45 crc kubenswrapper[4747]: E1205 23:11:45.451865 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ed003a-9195-4428-9cc7-cd630cd05fe5" containerName="extract-content" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.451882 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ed003a-9195-4428-9cc7-cd630cd05fe5" containerName="extract-content" Dec 05 23:11:45 crc kubenswrapper[4747]: E1205 23:11:45.451904 4747 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a9ed003a-9195-4428-9cc7-cd630cd05fe5" containerName="extract-utilities" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.451913 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ed003a-9195-4428-9cc7-cd630cd05fe5" containerName="extract-utilities" Dec 05 23:11:45 crc kubenswrapper[4747]: E1205 23:11:45.451923 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ed003a-9195-4428-9cc7-cd630cd05fe5" containerName="registry-server" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.451931 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ed003a-9195-4428-9cc7-cd630cd05fe5" containerName="registry-server" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.452211 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ed003a-9195-4428-9cc7-cd630cd05fe5" containerName="registry-server" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.454197 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.471700 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72r4v"] Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.485312 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-catalog-content\") pod \"certified-operators-72r4v\" (UID: \"d5ba0408-5f22-4fa6-93be-813ae2e70a2e\") " pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.485358 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-utilities\") pod \"certified-operators-72r4v\" (UID: \"d5ba0408-5f22-4fa6-93be-813ae2e70a2e\") " pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.485383 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9qn5\" (UniqueName: \"kubernetes.io/projected/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-kube-api-access-q9qn5\") pod \"certified-operators-72r4v\" (UID: \"d5ba0408-5f22-4fa6-93be-813ae2e70a2e\") " pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.586929 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-catalog-content\") pod \"certified-operators-72r4v\" (UID: \"d5ba0408-5f22-4fa6-93be-813ae2e70a2e\") " pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.586991 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-utilities\") pod \"certified-operators-72r4v\" (UID: \"d5ba0408-5f22-4fa6-93be-813ae2e70a2e\") " pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.587026 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qn5\" (UniqueName: \"kubernetes.io/projected/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-kube-api-access-q9qn5\") pod 
\"certified-operators-72r4v\" (UID: \"d5ba0408-5f22-4fa6-93be-813ae2e70a2e\") " pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.587949 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-catalog-content\") pod \"certified-operators-72r4v\" (UID: \"d5ba0408-5f22-4fa6-93be-813ae2e70a2e\") " pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.587994 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-utilities\") pod \"certified-operators-72r4v\" (UID: \"d5ba0408-5f22-4fa6-93be-813ae2e70a2e\") " pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.614689 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9qn5\" (UniqueName: \"kubernetes.io/projected/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-kube-api-access-q9qn5\") pod \"certified-operators-72r4v\" (UID: \"d5ba0408-5f22-4fa6-93be-813ae2e70a2e\") " pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:45 crc kubenswrapper[4747]: I1205 23:11:45.782642 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:46 crc kubenswrapper[4747]: I1205 23:11:46.331075 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72r4v"] Dec 05 23:11:47 crc kubenswrapper[4747]: I1205 23:11:47.019645 4747 generic.go:334] "Generic (PLEG): container finished" podID="d5ba0408-5f22-4fa6-93be-813ae2e70a2e" containerID="627e3426c671e0865a4dfd83c0fd74aad2215dfc11417096934539ff7e00afcd" exitCode=0 Dec 05 23:11:47 crc kubenswrapper[4747]: I1205 23:11:47.019732 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72r4v" event={"ID":"d5ba0408-5f22-4fa6-93be-813ae2e70a2e","Type":"ContainerDied","Data":"627e3426c671e0865a4dfd83c0fd74aad2215dfc11417096934539ff7e00afcd"} Dec 05 23:11:47 crc kubenswrapper[4747]: I1205 23:11:47.020066 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72r4v" event={"ID":"d5ba0408-5f22-4fa6-93be-813ae2e70a2e","Type":"ContainerStarted","Data":"36615280e0862a99ed9f064b5d32ab899e6eee0639a7860702dd1ddc279d0f7e"} Dec 05 23:11:49 crc kubenswrapper[4747]: I1205 23:11:49.053893 4747 generic.go:334] "Generic (PLEG): container finished" podID="d5ba0408-5f22-4fa6-93be-813ae2e70a2e" containerID="4e651acf2f870879c358bd3bebb9b99e8dba74408f652d0d9f03361d30809a47" exitCode=0 Dec 05 23:11:49 crc kubenswrapper[4747]: I1205 23:11:49.054633 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72r4v" event={"ID":"d5ba0408-5f22-4fa6-93be-813ae2e70a2e","Type":"ContainerDied","Data":"4e651acf2f870879c358bd3bebb9b99e8dba74408f652d0d9f03361d30809a47"} Dec 05 23:11:50 crc kubenswrapper[4747]: I1205 23:11:50.067806 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72r4v" event={"ID":"d5ba0408-5f22-4fa6-93be-813ae2e70a2e","Type":"ContainerStarted","Data":"133e0fe61aec1482395787c6c649aa838a3f0c381a118f7b8f8e5a8e91d8e487"} Dec 05 23:11:50 crc kubenswrapper[4747]: I1205 23:11:50.114960 4747 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-72r4v" podStartSLOduration=2.40471017 podStartE2EDuration="5.11492282s" podCreationTimestamp="2025-12-05 23:11:45 +0000 UTC" firstStartedPulling="2025-12-05 23:11:47.022060806 +0000 UTC m=+8977.489368294" lastFinishedPulling="2025-12-05 23:11:49.732273456 +0000 UTC m=+8980.199580944" observedRunningTime="2025-12-05 23:11:50.09548752 +0000 UTC m=+8980.562795028" watchObservedRunningTime="2025-12-05 23:11:50.11492282 +0000 UTC m=+8980.582230308" Dec 05 23:11:55 crc kubenswrapper[4747]: I1205 23:11:55.785128 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:55 crc kubenswrapper[4747]: I1205 23:11:55.786069 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:55 crc kubenswrapper[4747]: I1205 23:11:55.893049 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:56 crc kubenswrapper[4747]: I1205 23:11:56.216632 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:56 crc kubenswrapper[4747]: I1205 23:11:56.271326 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-72r4v"] Dec 05 23:11:58 crc kubenswrapper[4747]: I1205 23:11:58.164406 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-72r4v" podUID="d5ba0408-5f22-4fa6-93be-813ae2e70a2e" containerName="registry-server" containerID="cri-o://133e0fe61aec1482395787c6c649aa838a3f0c381a118f7b8f8e5a8e91d8e487" gracePeriod=2 Dec 05 23:11:58 crc kubenswrapper[4747]: I1205 23:11:58.810712 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:58 crc kubenswrapper[4747]: I1205 23:11:58.918031 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-utilities\") pod \"d5ba0408-5f22-4fa6-93be-813ae2e70a2e\" (UID: \"d5ba0408-5f22-4fa6-93be-813ae2e70a2e\") " Dec 05 23:11:58 crc kubenswrapper[4747]: I1205 23:11:58.918224 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-catalog-content\") pod \"d5ba0408-5f22-4fa6-93be-813ae2e70a2e\" (UID: \"d5ba0408-5f22-4fa6-93be-813ae2e70a2e\") " Dec 05 23:11:58 crc kubenswrapper[4747]: I1205 23:11:58.918254 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9qn5\" (UniqueName: \"kubernetes.io/projected/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-kube-api-access-q9qn5\") pod \"d5ba0408-5f22-4fa6-93be-813ae2e70a2e\" (UID: \"d5ba0408-5f22-4fa6-93be-813ae2e70a2e\") " Dec 05 23:11:58 crc kubenswrapper[4747]: I1205 23:11:58.919152 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-utilities" (OuterVolumeSpecName: "utilities") pod "d5ba0408-5f22-4fa6-93be-813ae2e70a2e" (UID: "d5ba0408-5f22-4fa6-93be-813ae2e70a2e"). InnerVolumeSpecName "utilities". 
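The pod_startup_latency_tracker entry above reports two numbers whose relationship can be checked directly from the timestamps it prints: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that same interval minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A worked check of the arithmetic follows; the formula reading is inferred from the values in the entry, not quoted from kubelet source.

```go
// Worked example of the startup-latency arithmetic in the entry above,
// using its timestamps verbatim.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	// Go's time.Time.String() layout; fractional seconds in the input
	// are accepted automatically when parsing.
	t, err := time.Parse("2006-01-02 15:04:05 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-12-05 23:11:45 +0000 UTC")
	firstPull := mustParse("2025-12-05 23:11:47.022060806 +0000 UTC")
	lastPull := mustParse("2025-12-05 23:11:49.732273456 +0000 UTC")
	observed := mustParse("2025-12-05 23:11:50.11492282 +0000 UTC")

	e2e := observed.Sub(created)          // total time from creation to running
	slo := e2e - lastPull.Sub(firstPull)  // excludes the image-pull window
	fmt.Println("podStartE2EDuration:", e2e) // 5.11492282s
	fmt.Println("podStartSLOduration:", slo) // 2.40471017s
}
```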
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:11:58 crc kubenswrapper[4747]: I1205 23:11:58.928868 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-kube-api-access-q9qn5" (OuterVolumeSpecName: "kube-api-access-q9qn5") pod "d5ba0408-5f22-4fa6-93be-813ae2e70a2e" (UID: "d5ba0408-5f22-4fa6-93be-813ae2e70a2e"). InnerVolumeSpecName "kube-api-access-q9qn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:11:58 crc kubenswrapper[4747]: I1205 23:11:58.976972 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5ba0408-5f22-4fa6-93be-813ae2e70a2e" (UID: "d5ba0408-5f22-4fa6-93be-813ae2e70a2e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.020894 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9qn5\" (UniqueName: \"kubernetes.io/projected/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-kube-api-access-q9qn5\") on node \"crc\" DevicePath \"\"" Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.020945 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.020955 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5ba0408-5f22-4fa6-93be-813ae2e70a2e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.174692 4747 generic.go:334] "Generic (PLEG): container finished" podID="d5ba0408-5f22-4fa6-93be-813ae2e70a2e" containerID="133e0fe61aec1482395787c6c649aa838a3f0c381a118f7b8f8e5a8e91d8e487" exitCode=0 Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.174737 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72r4v" event={"ID":"d5ba0408-5f22-4fa6-93be-813ae2e70a2e","Type":"ContainerDied","Data":"133e0fe61aec1482395787c6c649aa838a3f0c381a118f7b8f8e5a8e91d8e487"} Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.174752 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-72r4v" Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.174771 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72r4v" event={"ID":"d5ba0408-5f22-4fa6-93be-813ae2e70a2e","Type":"ContainerDied","Data":"36615280e0862a99ed9f064b5d32ab899e6eee0639a7860702dd1ddc279d0f7e"} Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.174794 4747 scope.go:117] "RemoveContainer" containerID="133e0fe61aec1482395787c6c649aa838a3f0c381a118f7b8f8e5a8e91d8e487" Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.196450 4747 scope.go:117] "RemoveContainer" containerID="4e651acf2f870879c358bd3bebb9b99e8dba74408f652d0d9f03361d30809a47" Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.214715 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-72r4v"] Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.225299 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-72r4v"] Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.230547 4747 scope.go:117] "RemoveContainer" containerID="627e3426c671e0865a4dfd83c0fd74aad2215dfc11417096934539ff7e00afcd" Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.266736 4747 scope.go:117] "RemoveContainer" containerID="133e0fe61aec1482395787c6c649aa838a3f0c381a118f7b8f8e5a8e91d8e487" Dec 05 23:11:59 crc kubenswrapper[4747]: E1205 23:11:59.267262 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"133e0fe61aec1482395787c6c649aa838a3f0c381a118f7b8f8e5a8e91d8e487\": container with ID starting with 133e0fe61aec1482395787c6c649aa838a3f0c381a118f7b8f8e5a8e91d8e487 not found: ID does not exist" containerID="133e0fe61aec1482395787c6c649aa838a3f0c381a118f7b8f8e5a8e91d8e487" Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.267316 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"133e0fe61aec1482395787c6c649aa838a3f0c381a118f7b8f8e5a8e91d8e487"} err="failed to get container status \"133e0fe61aec1482395787c6c649aa838a3f0c381a118f7b8f8e5a8e91d8e487\": rpc error: code = NotFound desc = could not find container \"133e0fe61aec1482395787c6c649aa838a3f0c381a118f7b8f8e5a8e91d8e487\": container with ID starting with 133e0fe61aec1482395787c6c649aa838a3f0c381a118f7b8f8e5a8e91d8e487 not found: ID does not exist" Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.267349 4747 scope.go:117] "RemoveContainer" containerID="4e651acf2f870879c358bd3bebb9b99e8dba74408f652d0d9f03361d30809a47" Dec 05 23:11:59 crc kubenswrapper[4747]: E1205 23:11:59.267667 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e651acf2f870879c358bd3bebb9b99e8dba74408f652d0d9f03361d30809a47\": container with ID starting with 4e651acf2f870879c358bd3bebb9b99e8dba74408f652d0d9f03361d30809a47 not found: ID does not exist" containerID="4e651acf2f870879c358bd3bebb9b99e8dba74408f652d0d9f03361d30809a47" Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.267707 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e651acf2f870879c358bd3bebb9b99e8dba74408f652d0d9f03361d30809a47"} err="failed to get container status \"4e651acf2f870879c358bd3bebb9b99e8dba74408f652d0d9f03361d30809a47\": rpc error: code = NotFound desc = could not find 
container \"4e651acf2f870879c358bd3bebb9b99e8dba74408f652d0d9f03361d30809a47\": container with ID starting with 4e651acf2f870879c358bd3bebb9b99e8dba74408f652d0d9f03361d30809a47 not found: ID does not exist" Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.267729 4747 scope.go:117] "RemoveContainer" containerID="627e3426c671e0865a4dfd83c0fd74aad2215dfc11417096934539ff7e00afcd" Dec 05 23:11:59 crc kubenswrapper[4747]: E1205 23:11:59.267940 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"627e3426c671e0865a4dfd83c0fd74aad2215dfc11417096934539ff7e00afcd\": container with ID starting with 627e3426c671e0865a4dfd83c0fd74aad2215dfc11417096934539ff7e00afcd not found: ID does not exist" containerID="627e3426c671e0865a4dfd83c0fd74aad2215dfc11417096934539ff7e00afcd" Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.267968 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"627e3426c671e0865a4dfd83c0fd74aad2215dfc11417096934539ff7e00afcd"} err="failed to get container status \"627e3426c671e0865a4dfd83c0fd74aad2215dfc11417096934539ff7e00afcd\": rpc error: code = NotFound desc = could not find container \"627e3426c671e0865a4dfd83c0fd74aad2215dfc11417096934539ff7e00afcd\": container with ID starting with 627e3426c671e0865a4dfd83c0fd74aad2215dfc11417096934539ff7e00afcd not found: ID does not exist" Dec 05 23:11:59 crc kubenswrapper[4747]: I1205 23:11:59.864080 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ba0408-5f22-4fa6-93be-813ae2e70a2e" path="/var/lib/kubelet/pods/d5ba0408-5f22-4fa6-93be-813ae2e70a2e/volumes" Dec 05 23:13:36 crc kubenswrapper[4747]: I1205 23:13:36.222150 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:13:36 crc kubenswrapper[4747]: I1205 23:13:36.222757 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:13:36 crc kubenswrapper[4747]: I1205 23:13:36.320889 4747 generic.go:334] "Generic (PLEG): container finished" podID="24ace32a-9918-4ef5-92e7-a5ef6419e908" containerID="62cad61894d782d9fdc375c67faa0b935fa9ed3c18f66cf4999d43b31c02d1c7" exitCode=0 Dec 05 23:13:36 crc kubenswrapper[4747]: I1205 23:13:36.321145 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" event={"ID":"24ace32a-9918-4ef5-92e7-a5ef6419e908","Type":"ContainerDied","Data":"62cad61894d782d9fdc375c67faa0b935fa9ed3c18f66cf4999d43b31c02d1c7"} Dec 05 23:13:37 crc kubenswrapper[4747]: I1205 23:13:37.829949 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.019728 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-neutron-sriov-combined-ca-bundle\") pod \"24ace32a-9918-4ef5-92e7-a5ef6419e908\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.020195 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-neutron-sriov-agent-neutron-config-0\") pod \"24ace32a-9918-4ef5-92e7-a5ef6419e908\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.020355 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-inventory\") pod \"24ace32a-9918-4ef5-92e7-a5ef6419e908\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.020432 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-ssh-key\") pod \"24ace32a-9918-4ef5-92e7-a5ef6419e908\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.020456 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsj47\" (UniqueName: \"kubernetes.io/projected/24ace32a-9918-4ef5-92e7-a5ef6419e908-kube-api-access-rsj47\") pod \"24ace32a-9918-4ef5-92e7-a5ef6419e908\" (UID: \"24ace32a-9918-4ef5-92e7-a5ef6419e908\") " Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.026781 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24ace32a-9918-4ef5-92e7-a5ef6419e908-kube-api-access-rsj47" (OuterVolumeSpecName: "kube-api-access-rsj47") pod "24ace32a-9918-4ef5-92e7-a5ef6419e908" (UID: "24ace32a-9918-4ef5-92e7-a5ef6419e908"). InnerVolumeSpecName "kube-api-access-rsj47". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.029881 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "24ace32a-9918-4ef5-92e7-a5ef6419e908" (UID: "24ace32a-9918-4ef5-92e7-a5ef6419e908"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.052482 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "24ace32a-9918-4ef5-92e7-a5ef6419e908" (UID: "24ace32a-9918-4ef5-92e7-a5ef6419e908"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.058104 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "24ace32a-9918-4ef5-92e7-a5ef6419e908" (UID: "24ace32a-9918-4ef5-92e7-a5ef6419e908"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.067844 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-inventory" (OuterVolumeSpecName: "inventory") pod "24ace32a-9918-4ef5-92e7-a5ef6419e908" (UID: "24ace32a-9918-4ef5-92e7-a5ef6419e908"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.123193 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.123446 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.123505 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsj47\" (UniqueName: \"kubernetes.io/projected/24ace32a-9918-4ef5-92e7-a5ef6419e908-kube-api-access-rsj47\") on node \"crc\" DevicePath \"\"" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.123600 4747 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.123673 4747 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/24ace32a-9918-4ef5-92e7-a5ef6419e908-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.340929 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" event={"ID":"24ace32a-9918-4ef5-92e7-a5ef6419e908","Type":"ContainerDied","Data":"80cc34c4af3f4d12f55d0bccade8e58339a89c97ebb66666f78671b818a0d0c9"} Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.341255 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80cc34c4af3f4d12f55d0bccade8e58339a89c97ebb66666f78671b818a0d0c9" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.341363 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-fbvl9" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.584042 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-9txj2"] Dec 05 23:13:38 crc kubenswrapper[4747]: E1205 23:13:38.584478 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ba0408-5f22-4fa6-93be-813ae2e70a2e" containerName="extract-utilities" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.584495 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ba0408-5f22-4fa6-93be-813ae2e70a2e" containerName="extract-utilities" Dec 05 23:13:38 crc kubenswrapper[4747]: E1205 23:13:38.584517 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24ace32a-9918-4ef5-92e7-a5ef6419e908" containerName="neutron-sriov-openstack-openstack-cell1" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.584523 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="24ace32a-9918-4ef5-92e7-a5ef6419e908" containerName="neutron-sriov-openstack-openstack-cell1" Dec 05 23:13:38 crc kubenswrapper[4747]: E1205 23:13:38.584551 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ba0408-5f22-4fa6-93be-813ae2e70a2e" containerName="registry-server" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.584558 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ba0408-5f22-4fa6-93be-813ae2e70a2e" containerName="registry-server" Dec 05 23:13:38 crc kubenswrapper[4747]: E1205 23:13:38.584573 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ba0408-5f22-4fa6-93be-813ae2e70a2e" containerName="extract-content" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.585271 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ba0408-5f22-4fa6-93be-813ae2e70a2e" containerName="extract-content" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.585593 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ba0408-5f22-4fa6-93be-813ae2e70a2e" containerName="registry-server" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.585609 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="24ace32a-9918-4ef5-92e7-a5ef6419e908" containerName="neutron-sriov-openstack-openstack-cell1" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.586651 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.589375 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.589426 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.589375 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.589710 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.589840 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.595308 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-9txj2"] Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.642824 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-9txj2\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.642942 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-9txj2\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.643007 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x8f2\" (UniqueName: \"kubernetes.io/projected/150d0c65-26e4-483f-aaf6-72d7efe808a3-kube-api-access-8x8f2\") pod \"neutron-dhcp-openstack-openstack-cell1-9txj2\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.643026 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-9txj2\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.643064 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-9txj2\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.745338 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-9txj2\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.745786 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x8f2\" (UniqueName: \"kubernetes.io/projected/150d0c65-26e4-483f-aaf6-72d7efe808a3-kube-api-access-8x8f2\") pod \"neutron-dhcp-openstack-openstack-cell1-9txj2\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.745817 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-9txj2\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.745876 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-9txj2\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.745983 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-9txj2\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.749414 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-9txj2\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.749545 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-9txj2\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.751893 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-9txj2\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.761286 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-inventory\") pod 
\"neutron-dhcp-openstack-openstack-cell1-9txj2\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.765790 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x8f2\" (UniqueName: \"kubernetes.io/projected/150d0c65-26e4-483f-aaf6-72d7efe808a3-kube-api-access-8x8f2\") pod \"neutron-dhcp-openstack-openstack-cell1-9txj2\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:38 crc kubenswrapper[4747]: I1205 23:13:38.905011 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:13:39 crc kubenswrapper[4747]: I1205 23:13:39.488571 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-9txj2"] Dec 05 23:13:39 crc kubenswrapper[4747]: I1205 23:13:39.492376 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 23:13:40 crc kubenswrapper[4747]: I1205 23:13:40.366835 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" event={"ID":"150d0c65-26e4-483f-aaf6-72d7efe808a3","Type":"ContainerStarted","Data":"7eddc0ad26ba71da0d055e9c70beccd42d8e7d336032e07beed8932b043139ae"} Dec 05 23:13:40 crc kubenswrapper[4747]: I1205 23:13:40.367374 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" event={"ID":"150d0c65-26e4-483f-aaf6-72d7efe808a3","Type":"ContainerStarted","Data":"030e2f560826740701e78ebe3e5bcfae183e47fafa0c89b59e47089ced6292ff"} Dec 05 23:14:06 crc kubenswrapper[4747]: I1205 23:14:06.222668 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:14:06 crc kubenswrapper[4747]: I1205 23:14:06.223236 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:14:36 crc kubenswrapper[4747]: I1205 23:14:36.221827 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:14:36 crc kubenswrapper[4747]: I1205 23:14:36.222496 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:14:36 crc kubenswrapper[4747]: I1205 23:14:36.222632 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 23:14:36 crc kubenswrapper[4747]: I1205 
23:14:36.223855 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:14:36 crc kubenswrapper[4747]: I1205 23:14:36.223955 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" gracePeriod=600 Dec 05 23:14:37 crc kubenswrapper[4747]: I1205 23:14:37.039739 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" exitCode=0 Dec 05 23:14:37 crc kubenswrapper[4747]: I1205 23:14:37.039892 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c"} Dec 05 23:14:37 crc kubenswrapper[4747]: I1205 23:14:37.040054 4747 scope.go:117] "RemoveContainer" containerID="ed426bd4bec4c9eaca98b78abec81641719b640a69f6f4b4896fb2b9e80d6d22" Dec 05 23:14:37 crc kubenswrapper[4747]: E1205 23:14:37.327304 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:14:38 crc kubenswrapper[4747]: I1205 23:14:38.058716 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:14:38 crc kubenswrapper[4747]: E1205 23:14:38.059843 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:14:38 crc kubenswrapper[4747]: I1205 23:14:38.087327 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" podStartSLOduration=59.531540853 podStartE2EDuration="1m0.087273033s" podCreationTimestamp="2025-12-05 23:13:38 +0000 UTC" firstStartedPulling="2025-12-05 23:13:39.492174967 +0000 UTC m=+9089.959482455" lastFinishedPulling="2025-12-05 23:13:40.047907137 +0000 UTC m=+9090.515214635" observedRunningTime="2025-12-05 23:13:40.385859737 +0000 UTC m=+9090.853167225" watchObservedRunningTime="2025-12-05 23:14:38.087273033 +0000 UTC m=+9148.554580561" Dec 05 23:14:52 crc kubenswrapper[4747]: I1205 23:14:52.839892 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:14:52 
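From this point the machine-config-daemon container is in CrashLoopBackOff: "back-off 5m0s" is the kubelet's restart back-off at its cap, and every later sync attempt is skipped until the window expires, which is why the RemoveContainer/"Error syncing pod" pair repeats for minutes below. A sketch of the doubling schedule follows; the 10s initial delay and 5m cap are the kubelet's well-known defaults, while the exact bookkeeping is internal.

```go
// Sketch of the restart back-off behind "back-off 5m0s restarting failed
// container": the delay doubles per restart until it hits the cap.
package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		initial = 10 * time.Second // kubelet's default initial back-off
		cap     = 5 * time.Minute  // the "5m0s" cap seen in the log
	)
	delay := initial
	for restart := 1; restart <= 7; restart++ {
		fmt.Printf("restart %d: back-off %s\n", restart, delay)
		delay *= 2
		if delay > cap {
			delay = cap
		}
	}
	// Prints 10s, 20s, 40s, 1m20s, 2m40s, then 5m0s repeatedly.
}
```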
Dec 05 23:14:52 crc kubenswrapper[4747]: I1205 23:14:52.839892 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c"
Dec 05 23:14:52 crc kubenswrapper[4747]: E1205 23:14:52.840832 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:15:00 crc kubenswrapper[4747]: I1205 23:15:00.177779 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416275-td2q7"]
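The Job name collect-profiles-29416275 encodes its schedule: the CronJob controller suffixes each Job it creates with the scheduled time expressed in minutes since the Unix epoch, and 29416275 decodes to the very minute this pod is added below. A quick check of that decoding:

```go
// Decode the collect-profiles Job-name suffix (minutes since the Unix
// epoch, per the CronJob controller's naming convention).
package main

import (
	"fmt"
	"time"
)

func main() {
	const suffix int64 = 29416275 // from collect-profiles-29416275-td2q7
	scheduled := time.Unix(suffix*60, 0).UTC()
	fmt.Println(scheduled) // 2025-12-05 23:15:00 +0000 UTC
}
```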
\"334974c5-1ed3-4592-a9e5-d02e3b7e276b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-td2q7" Dec 05 23:15:00 crc kubenswrapper[4747]: I1205 23:15:00.344779 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc8f6\" (UniqueName: \"kubernetes.io/projected/334974c5-1ed3-4592-a9e5-d02e3b7e276b-kube-api-access-kc8f6\") pod \"collect-profiles-29416275-td2q7\" (UID: \"334974c5-1ed3-4592-a9e5-d02e3b7e276b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-td2q7" Dec 05 23:15:00 crc kubenswrapper[4747]: I1205 23:15:00.346166 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/334974c5-1ed3-4592-a9e5-d02e3b7e276b-config-volume\") pod \"collect-profiles-29416275-td2q7\" (UID: \"334974c5-1ed3-4592-a9e5-d02e3b7e276b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-td2q7" Dec 05 23:15:00 crc kubenswrapper[4747]: I1205 23:15:00.352059 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/334974c5-1ed3-4592-a9e5-d02e3b7e276b-secret-volume\") pod \"collect-profiles-29416275-td2q7\" (UID: \"334974c5-1ed3-4592-a9e5-d02e3b7e276b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-td2q7" Dec 05 23:15:00 crc kubenswrapper[4747]: I1205 23:15:00.372287 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc8f6\" (UniqueName: \"kubernetes.io/projected/334974c5-1ed3-4592-a9e5-d02e3b7e276b-kube-api-access-kc8f6\") pod \"collect-profiles-29416275-td2q7\" (UID: \"334974c5-1ed3-4592-a9e5-d02e3b7e276b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-td2q7" Dec 05 23:15:00 crc kubenswrapper[4747]: I1205 23:15:00.519309 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-td2q7" Dec 05 23:15:01 crc kubenswrapper[4747]: I1205 23:15:01.042912 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416275-td2q7"] Dec 05 23:15:01 crc kubenswrapper[4747]: I1205 23:15:01.365215 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-td2q7" event={"ID":"334974c5-1ed3-4592-a9e5-d02e3b7e276b","Type":"ContainerStarted","Data":"31bb18946de783b57741bcaef7b1265d468f7719b2ae42fcb44e571e3dffc3a1"} Dec 05 23:15:02 crc kubenswrapper[4747]: I1205 23:15:02.378620 4747 generic.go:334] "Generic (PLEG): container finished" podID="334974c5-1ed3-4592-a9e5-d02e3b7e276b" containerID="4c8bf545717b82dd0e3d31228eaa9f45fd00b8a33a070f8ebb13085f5a56c092" exitCode=0 Dec 05 23:15:02 crc kubenswrapper[4747]: I1205 23:15:02.378715 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-td2q7" event={"ID":"334974c5-1ed3-4592-a9e5-d02e3b7e276b","Type":"ContainerDied","Data":"4c8bf545717b82dd0e3d31228eaa9f45fd00b8a33a070f8ebb13085f5a56c092"} Dec 05 23:15:03 crc kubenswrapper[4747]: I1205 23:15:03.763161 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-td2q7" Dec 05 23:15:03 crc kubenswrapper[4747]: I1205 23:15:03.832276 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/334974c5-1ed3-4592-a9e5-d02e3b7e276b-config-volume\") pod \"334974c5-1ed3-4592-a9e5-d02e3b7e276b\" (UID: \"334974c5-1ed3-4592-a9e5-d02e3b7e276b\") " Dec 05 23:15:03 crc kubenswrapper[4747]: I1205 23:15:03.832374 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc8f6\" (UniqueName: \"kubernetes.io/projected/334974c5-1ed3-4592-a9e5-d02e3b7e276b-kube-api-access-kc8f6\") pod \"334974c5-1ed3-4592-a9e5-d02e3b7e276b\" (UID: \"334974c5-1ed3-4592-a9e5-d02e3b7e276b\") " Dec 05 23:15:03 crc kubenswrapper[4747]: I1205 23:15:03.832442 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/334974c5-1ed3-4592-a9e5-d02e3b7e276b-secret-volume\") pod \"334974c5-1ed3-4592-a9e5-d02e3b7e276b\" (UID: \"334974c5-1ed3-4592-a9e5-d02e3b7e276b\") " Dec 05 23:15:03 crc kubenswrapper[4747]: I1205 23:15:03.834220 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/334974c5-1ed3-4592-a9e5-d02e3b7e276b-config-volume" (OuterVolumeSpecName: "config-volume") pod "334974c5-1ed3-4592-a9e5-d02e3b7e276b" (UID: "334974c5-1ed3-4592-a9e5-d02e3b7e276b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:15:03 crc kubenswrapper[4747]: I1205 23:15:03.839324 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/334974c5-1ed3-4592-a9e5-d02e3b7e276b-kube-api-access-kc8f6" (OuterVolumeSpecName: "kube-api-access-kc8f6") pod "334974c5-1ed3-4592-a9e5-d02e3b7e276b" (UID: "334974c5-1ed3-4592-a9e5-d02e3b7e276b"). InnerVolumeSpecName "kube-api-access-kc8f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:15:03 crc kubenswrapper[4747]: I1205 23:15:03.842120 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/334974c5-1ed3-4592-a9e5-d02e3b7e276b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "334974c5-1ed3-4592-a9e5-d02e3b7e276b" (UID: "334974c5-1ed3-4592-a9e5-d02e3b7e276b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:15:03 crc kubenswrapper[4747]: I1205 23:15:03.934796 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/334974c5-1ed3-4592-a9e5-d02e3b7e276b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 23:15:03 crc kubenswrapper[4747]: I1205 23:15:03.934823 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc8f6\" (UniqueName: \"kubernetes.io/projected/334974c5-1ed3-4592-a9e5-d02e3b7e276b-kube-api-access-kc8f6\") on node \"crc\" DevicePath \"\"" Dec 05 23:15:03 crc kubenswrapper[4747]: I1205 23:15:03.934833 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/334974c5-1ed3-4592-a9e5-d02e3b7e276b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 23:15:04 crc kubenswrapper[4747]: I1205 23:15:04.406553 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-td2q7" event={"ID":"334974c5-1ed3-4592-a9e5-d02e3b7e276b","Type":"ContainerDied","Data":"31bb18946de783b57741bcaef7b1265d468f7719b2ae42fcb44e571e3dffc3a1"} Dec 05 23:15:04 crc kubenswrapper[4747]: I1205 23:15:04.407203 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31bb18946de783b57741bcaef7b1265d468f7719b2ae42fcb44e571e3dffc3a1" Dec 05 23:15:04 crc kubenswrapper[4747]: I1205 23:15:04.406756 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-td2q7" Dec 05 23:15:04 crc kubenswrapper[4747]: I1205 23:15:04.858842 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd"] Dec 05 23:15:04 crc kubenswrapper[4747]: I1205 23:15:04.870756 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416230-2kqqd"] Dec 05 23:15:05 crc kubenswrapper[4747]: I1205 23:15:05.852437 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3033d749-826d-47fc-ace8-c06f765415a7" path="/var/lib/kubelet/pods/3033d749-826d-47fc-ace8-c06f765415a7/volumes" Dec 05 23:15:07 crc kubenswrapper[4747]: I1205 23:15:07.839779 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:15:07 crc kubenswrapper[4747]: E1205 23:15:07.840264 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:15:21 crc kubenswrapper[4747]: I1205 23:15:21.844820 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:15:21 crc kubenswrapper[4747]: E1205 23:15:21.845703 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:15:32 crc kubenswrapper[4747]: I1205 23:15:32.840166 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:15:32 crc kubenswrapper[4747]: E1205 23:15:32.841030 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:15:39 crc kubenswrapper[4747]: I1205 23:15:39.387803 4747 scope.go:117] "RemoveContainer" containerID="defc4b565a59578fd9e755a9b56a1fae2fc79e6c9f2cd377d7a9c44523202eff" Dec 05 23:15:47 crc kubenswrapper[4747]: I1205 23:15:47.840705 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:15:47 crc kubenswrapper[4747]: E1205 23:15:47.841757 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:16:01 crc kubenswrapper[4747]: I1205 23:16:01.840403 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:16:01 crc kubenswrapper[4747]: E1205 23:16:01.841260 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:16:16 crc kubenswrapper[4747]: I1205 23:16:16.841823 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:16:16 crc kubenswrapper[4747]: E1205 23:16:16.842880 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:16:28 crc kubenswrapper[4747]: I1205 23:16:28.840401 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:16:28 crc kubenswrapper[4747]: E1205 23:16:28.842566 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:16:41 crc kubenswrapper[4747]: I1205 23:16:41.840363 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:16:41 crc kubenswrapper[4747]: E1205 23:16:41.843432 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:16:56 crc kubenswrapper[4747]: I1205 23:16:56.840483 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:16:56 crc kubenswrapper[4747]: E1205 23:16:56.841336 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:17:09 crc kubenswrapper[4747]: I1205 23:17:09.866540 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:17:09 crc kubenswrapper[4747]: E1205 23:17:09.867908 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:17:23 crc kubenswrapper[4747]: I1205 23:17:23.840392 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:17:23 crc kubenswrapper[4747]: E1205 23:17:23.841252 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:17:35 crc kubenswrapper[4747]: I1205 23:17:35.843426 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:17:35 crc kubenswrapper[4747]: E1205 23:17:35.844742 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:17:46 crc kubenswrapper[4747]: I1205 23:17:46.840923 4747 
scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:17:46 crc kubenswrapper[4747]: E1205 23:17:46.841680 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:17:58 crc kubenswrapper[4747]: I1205 23:17:58.839738 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:17:58 crc kubenswrapper[4747]: E1205 23:17:58.840697 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:18:09 crc kubenswrapper[4747]: I1205 23:18:09.851310 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:18:09 crc kubenswrapper[4747]: E1205 23:18:09.852141 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:18:13 crc kubenswrapper[4747]: I1205 23:18:13.196893 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5v8p7"] Dec 05 23:18:13 crc kubenswrapper[4747]: E1205 23:18:13.197871 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334974c5-1ed3-4592-a9e5-d02e3b7e276b" containerName="collect-profiles" Dec 05 23:18:13 crc kubenswrapper[4747]: I1205 23:18:13.197884 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="334974c5-1ed3-4592-a9e5-d02e3b7e276b" containerName="collect-profiles" Dec 05 23:18:13 crc kubenswrapper[4747]: I1205 23:18:13.198161 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="334974c5-1ed3-4592-a9e5-d02e3b7e276b" containerName="collect-profiles" Dec 05 23:18:13 crc kubenswrapper[4747]: I1205 23:18:13.199727 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:13 crc kubenswrapper[4747]: I1205 23:18:13.206387 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5v8p7"] Dec 05 23:18:13 crc kubenswrapper[4747]: I1205 23:18:13.301011 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bda3d143-a01b-4c41-bc0a-99bf57356765-catalog-content\") pod \"community-operators-5v8p7\" (UID: \"bda3d143-a01b-4c41-bc0a-99bf57356765\") " pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:13 crc kubenswrapper[4747]: I1205 23:18:13.301379 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bda3d143-a01b-4c41-bc0a-99bf57356765-utilities\") pod \"community-operators-5v8p7\" (UID: \"bda3d143-a01b-4c41-bc0a-99bf57356765\") " pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:13 crc kubenswrapper[4747]: I1205 23:18:13.301546 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7t92\" (UniqueName: \"kubernetes.io/projected/bda3d143-a01b-4c41-bc0a-99bf57356765-kube-api-access-l7t92\") pod \"community-operators-5v8p7\" (UID: \"bda3d143-a01b-4c41-bc0a-99bf57356765\") " pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:13 crc kubenswrapper[4747]: I1205 23:18:13.404086 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7t92\" (UniqueName: \"kubernetes.io/projected/bda3d143-a01b-4c41-bc0a-99bf57356765-kube-api-access-l7t92\") pod \"community-operators-5v8p7\" (UID: \"bda3d143-a01b-4c41-bc0a-99bf57356765\") " pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:13 crc kubenswrapper[4747]: I1205 23:18:13.404263 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bda3d143-a01b-4c41-bc0a-99bf57356765-catalog-content\") pod \"community-operators-5v8p7\" (UID: \"bda3d143-a01b-4c41-bc0a-99bf57356765\") " pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:13 crc kubenswrapper[4747]: I1205 23:18:13.404322 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bda3d143-a01b-4c41-bc0a-99bf57356765-utilities\") pod \"community-operators-5v8p7\" (UID: \"bda3d143-a01b-4c41-bc0a-99bf57356765\") " pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:13 crc kubenswrapper[4747]: I1205 23:18:13.404793 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bda3d143-a01b-4c41-bc0a-99bf57356765-utilities\") pod \"community-operators-5v8p7\" (UID: \"bda3d143-a01b-4c41-bc0a-99bf57356765\") " pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:13 crc kubenswrapper[4747]: I1205 23:18:13.405175 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bda3d143-a01b-4c41-bc0a-99bf57356765-catalog-content\") pod \"community-operators-5v8p7\" (UID: \"bda3d143-a01b-4c41-bc0a-99bf57356765\") " pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:13 crc kubenswrapper[4747]: I1205 23:18:13.615305 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l7t92\" (UniqueName: \"kubernetes.io/projected/bda3d143-a01b-4c41-bc0a-99bf57356765-kube-api-access-l7t92\") pod \"community-operators-5v8p7\" (UID: \"bda3d143-a01b-4c41-bc0a-99bf57356765\") " pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:13 crc kubenswrapper[4747]: I1205 23:18:13.821375 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:14 crc kubenswrapper[4747]: I1205 23:18:14.317727 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5v8p7"] Dec 05 23:18:14 crc kubenswrapper[4747]: I1205 23:18:14.510458 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v8p7" event={"ID":"bda3d143-a01b-4c41-bc0a-99bf57356765","Type":"ContainerStarted","Data":"cec05f9f3e920adb2762823ada40d1803ef3483c4f9d0856828373a49c9f42bc"} Dec 05 23:18:15 crc kubenswrapper[4747]: I1205 23:18:15.522140 4747 generic.go:334] "Generic (PLEG): container finished" podID="bda3d143-a01b-4c41-bc0a-99bf57356765" containerID="3dd94825126af395fbbdab2d1eb5d91634288a971492426362b0640fce447b4b" exitCode=0 Dec 05 23:18:15 crc kubenswrapper[4747]: I1205 23:18:15.522214 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v8p7" event={"ID":"bda3d143-a01b-4c41-bc0a-99bf57356765","Type":"ContainerDied","Data":"3dd94825126af395fbbdab2d1eb5d91634288a971492426362b0640fce447b4b"} Dec 05 23:18:16 crc kubenswrapper[4747]: I1205 23:18:16.550182 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v8p7" event={"ID":"bda3d143-a01b-4c41-bc0a-99bf57356765","Type":"ContainerStarted","Data":"63720eb4e93165aaee1780b86231e55f4cb5bb574cbd7e5f98c6e4aa1c9489e6"} Dec 05 23:18:18 crc kubenswrapper[4747]: I1205 23:18:18.577131 4747 generic.go:334] "Generic (PLEG): container finished" podID="bda3d143-a01b-4c41-bc0a-99bf57356765" containerID="63720eb4e93165aaee1780b86231e55f4cb5bb574cbd7e5f98c6e4aa1c9489e6" exitCode=0 Dec 05 23:18:18 crc kubenswrapper[4747]: I1205 23:18:18.577221 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v8p7" event={"ID":"bda3d143-a01b-4c41-bc0a-99bf57356765","Type":"ContainerDied","Data":"63720eb4e93165aaee1780b86231e55f4cb5bb574cbd7e5f98c6e4aa1c9489e6"} Dec 05 23:18:19 crc kubenswrapper[4747]: I1205 23:18:19.589869 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v8p7" event={"ID":"bda3d143-a01b-4c41-bc0a-99bf57356765","Type":"ContainerStarted","Data":"6da0a0f762bb8349fd324b2efef221d964cc97fd45ec55006e5bcb97151c69db"} Dec 05 23:18:19 crc kubenswrapper[4747]: I1205 23:18:19.616710 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5v8p7" podStartSLOduration=3.058102912 podStartE2EDuration="6.616680713s" podCreationTimestamp="2025-12-05 23:18:13 +0000 UTC" firstStartedPulling="2025-12-05 23:18:15.524529277 +0000 UTC m=+9365.991836765" lastFinishedPulling="2025-12-05 23:18:19.083107078 +0000 UTC m=+9369.550414566" observedRunningTime="2025-12-05 23:18:19.608298485 +0000 UTC m=+9370.075605993" watchObservedRunningTime="2025-12-05 23:18:19.616680713 +0000 UTC m=+9370.083988221" Dec 05 23:18:23 crc kubenswrapper[4747]: I1205 23:18:23.822535 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:23 crc kubenswrapper[4747]: I1205 23:18:23.823298 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:23 crc kubenswrapper[4747]: I1205 23:18:23.840728 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:18:23 crc kubenswrapper[4747]: E1205 23:18:23.841205 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:18:23 crc kubenswrapper[4747]: I1205 23:18:23.893181 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:24 crc kubenswrapper[4747]: I1205 23:18:24.719710 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:25 crc kubenswrapper[4747]: I1205 23:18:25.799705 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5v8p7"] Dec 05 23:18:26 crc kubenswrapper[4747]: I1205 23:18:26.669125 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5v8p7" podUID="bda3d143-a01b-4c41-bc0a-99bf57356765" containerName="registry-server" containerID="cri-o://6da0a0f762bb8349fd324b2efef221d964cc97fd45ec55006e5bcb97151c69db" gracePeriod=2 Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.191070 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.342339 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7t92\" (UniqueName: \"kubernetes.io/projected/bda3d143-a01b-4c41-bc0a-99bf57356765-kube-api-access-l7t92\") pod \"bda3d143-a01b-4c41-bc0a-99bf57356765\" (UID: \"bda3d143-a01b-4c41-bc0a-99bf57356765\") " Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.342471 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bda3d143-a01b-4c41-bc0a-99bf57356765-utilities\") pod \"bda3d143-a01b-4c41-bc0a-99bf57356765\" (UID: \"bda3d143-a01b-4c41-bc0a-99bf57356765\") " Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.342790 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bda3d143-a01b-4c41-bc0a-99bf57356765-catalog-content\") pod \"bda3d143-a01b-4c41-bc0a-99bf57356765\" (UID: \"bda3d143-a01b-4c41-bc0a-99bf57356765\") " Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.345167 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bda3d143-a01b-4c41-bc0a-99bf57356765-utilities" (OuterVolumeSpecName: "utilities") pod "bda3d143-a01b-4c41-bc0a-99bf57356765" (UID: "bda3d143-a01b-4c41-bc0a-99bf57356765"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.354915 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda3d143-a01b-4c41-bc0a-99bf57356765-kube-api-access-l7t92" (OuterVolumeSpecName: "kube-api-access-l7t92") pod "bda3d143-a01b-4c41-bc0a-99bf57356765" (UID: "bda3d143-a01b-4c41-bc0a-99bf57356765"). InnerVolumeSpecName "kube-api-access-l7t92". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.416191 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bda3d143-a01b-4c41-bc0a-99bf57356765-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bda3d143-a01b-4c41-bc0a-99bf57356765" (UID: "bda3d143-a01b-4c41-bc0a-99bf57356765"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.445788 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bda3d143-a01b-4c41-bc0a-99bf57356765-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.445823 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bda3d143-a01b-4c41-bc0a-99bf57356765-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.445835 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7t92\" (UniqueName: \"kubernetes.io/projected/bda3d143-a01b-4c41-bc0a-99bf57356765-kube-api-access-l7t92\") on node \"crc\" DevicePath \"\"" Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.685625 4747 generic.go:334] "Generic (PLEG): container finished" podID="bda3d143-a01b-4c41-bc0a-99bf57356765" containerID="6da0a0f762bb8349fd324b2efef221d964cc97fd45ec55006e5bcb97151c69db" exitCode=0 Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.685703 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v8p7" event={"ID":"bda3d143-a01b-4c41-bc0a-99bf57356765","Type":"ContainerDied","Data":"6da0a0f762bb8349fd324b2efef221d964cc97fd45ec55006e5bcb97151c69db"} Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.685762 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v8p7" event={"ID":"bda3d143-a01b-4c41-bc0a-99bf57356765","Type":"ContainerDied","Data":"cec05f9f3e920adb2762823ada40d1803ef3483c4f9d0856828373a49c9f42bc"} Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.685799 4747 scope.go:117] "RemoveContainer" containerID="6da0a0f762bb8349fd324b2efef221d964cc97fd45ec55006e5bcb97151c69db" Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.685723 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5v8p7" Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.708065 4747 scope.go:117] "RemoveContainer" containerID="63720eb4e93165aaee1780b86231e55f4cb5bb574cbd7e5f98c6e4aa1c9489e6" Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.733768 4747 scope.go:117] "RemoveContainer" containerID="3dd94825126af395fbbdab2d1eb5d91634288a971492426362b0640fce447b4b" Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.811217 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5v8p7"] Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.820998 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5v8p7"] Dec 05 23:18:27 crc kubenswrapper[4747]: I1205 23:18:27.851026 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bda3d143-a01b-4c41-bc0a-99bf57356765" path="/var/lib/kubelet/pods/bda3d143-a01b-4c41-bc0a-99bf57356765/volumes" Dec 05 23:18:28 crc kubenswrapper[4747]: I1205 23:18:28.620002 4747 scope.go:117] "RemoveContainer" containerID="6da0a0f762bb8349fd324b2efef221d964cc97fd45ec55006e5bcb97151c69db" Dec 05 23:18:28 crc kubenswrapper[4747]: E1205 23:18:28.624518 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6da0a0f762bb8349fd324b2efef221d964cc97fd45ec55006e5bcb97151c69db\": container with ID starting with 6da0a0f762bb8349fd324b2efef221d964cc97fd45ec55006e5bcb97151c69db not found: ID does not exist" containerID="6da0a0f762bb8349fd324b2efef221d964cc97fd45ec55006e5bcb97151c69db" Dec 05 23:18:28 crc kubenswrapper[4747]: I1205 23:18:28.624597 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da0a0f762bb8349fd324b2efef221d964cc97fd45ec55006e5bcb97151c69db"} err="failed to get container status \"6da0a0f762bb8349fd324b2efef221d964cc97fd45ec55006e5bcb97151c69db\": rpc error: code = NotFound desc = could not find container \"6da0a0f762bb8349fd324b2efef221d964cc97fd45ec55006e5bcb97151c69db\": container with ID starting with 6da0a0f762bb8349fd324b2efef221d964cc97fd45ec55006e5bcb97151c69db not found: ID does not exist" Dec 05 23:18:28 crc kubenswrapper[4747]: I1205 23:18:28.624633 4747 scope.go:117] "RemoveContainer" containerID="63720eb4e93165aaee1780b86231e55f4cb5bb574cbd7e5f98c6e4aa1c9489e6" Dec 05 23:18:28 crc kubenswrapper[4747]: E1205 23:18:28.634774 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63720eb4e93165aaee1780b86231e55f4cb5bb574cbd7e5f98c6e4aa1c9489e6\": container with ID starting with 63720eb4e93165aaee1780b86231e55f4cb5bb574cbd7e5f98c6e4aa1c9489e6 not found: ID does not exist" containerID="63720eb4e93165aaee1780b86231e55f4cb5bb574cbd7e5f98c6e4aa1c9489e6" Dec 05 23:18:28 crc kubenswrapper[4747]: I1205 23:18:28.634835 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63720eb4e93165aaee1780b86231e55f4cb5bb574cbd7e5f98c6e4aa1c9489e6"} err="failed to get container status \"63720eb4e93165aaee1780b86231e55f4cb5bb574cbd7e5f98c6e4aa1c9489e6\": rpc error: code = NotFound desc = could not find container \"63720eb4e93165aaee1780b86231e55f4cb5bb574cbd7e5f98c6e4aa1c9489e6\": container with ID starting with 63720eb4e93165aaee1780b86231e55f4cb5bb574cbd7e5f98c6e4aa1c9489e6 not found: ID does not exist" Dec 05 23:18:28 crc kubenswrapper[4747]: I1205 
23:18:28.634866 4747 scope.go:117] "RemoveContainer" containerID="3dd94825126af395fbbdab2d1eb5d91634288a971492426362b0640fce447b4b" Dec 05 23:18:28 crc kubenswrapper[4747]: E1205 23:18:28.639731 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dd94825126af395fbbdab2d1eb5d91634288a971492426362b0640fce447b4b\": container with ID starting with 3dd94825126af395fbbdab2d1eb5d91634288a971492426362b0640fce447b4b not found: ID does not exist" containerID="3dd94825126af395fbbdab2d1eb5d91634288a971492426362b0640fce447b4b" Dec 05 23:18:28 crc kubenswrapper[4747]: I1205 23:18:28.639777 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dd94825126af395fbbdab2d1eb5d91634288a971492426362b0640fce447b4b"} err="failed to get container status \"3dd94825126af395fbbdab2d1eb5d91634288a971492426362b0640fce447b4b\": rpc error: code = NotFound desc = could not find container \"3dd94825126af395fbbdab2d1eb5d91634288a971492426362b0640fce447b4b\": container with ID starting with 3dd94825126af395fbbdab2d1eb5d91634288a971492426362b0640fce447b4b not found: ID does not exist" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.601243 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gmvwz"] Dec 05 23:18:33 crc kubenswrapper[4747]: E1205 23:18:33.602795 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda3d143-a01b-4c41-bc0a-99bf57356765" containerName="registry-server" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.602822 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda3d143-a01b-4c41-bc0a-99bf57356765" containerName="registry-server" Dec 05 23:18:33 crc kubenswrapper[4747]: E1205 23:18:33.602866 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda3d143-a01b-4c41-bc0a-99bf57356765" containerName="extract-utilities" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.602879 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda3d143-a01b-4c41-bc0a-99bf57356765" containerName="extract-utilities" Dec 05 23:18:33 crc kubenswrapper[4747]: E1205 23:18:33.602935 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda3d143-a01b-4c41-bc0a-99bf57356765" containerName="extract-content" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.602948 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda3d143-a01b-4c41-bc0a-99bf57356765" containerName="extract-content" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.603304 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="bda3d143-a01b-4c41-bc0a-99bf57356765" containerName="registry-server" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.607184 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.674318 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmvwz"] Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.695782 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b048d7d8-161e-44b7-a58e-19efea534815-catalog-content\") pod \"redhat-marketplace-gmvwz\" (UID: \"b048d7d8-161e-44b7-a58e-19efea534815\") " pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.695934 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w5q7\" (UniqueName: \"kubernetes.io/projected/b048d7d8-161e-44b7-a58e-19efea534815-kube-api-access-4w5q7\") pod \"redhat-marketplace-gmvwz\" (UID: \"b048d7d8-161e-44b7-a58e-19efea534815\") " pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.696021 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b048d7d8-161e-44b7-a58e-19efea534815-utilities\") pod \"redhat-marketplace-gmvwz\" (UID: \"b048d7d8-161e-44b7-a58e-19efea534815\") " pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.798943 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b048d7d8-161e-44b7-a58e-19efea534815-catalog-content\") pod \"redhat-marketplace-gmvwz\" (UID: \"b048d7d8-161e-44b7-a58e-19efea534815\") " pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.799103 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w5q7\" (UniqueName: \"kubernetes.io/projected/b048d7d8-161e-44b7-a58e-19efea534815-kube-api-access-4w5q7\") pod \"redhat-marketplace-gmvwz\" (UID: \"b048d7d8-161e-44b7-a58e-19efea534815\") " pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.799179 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b048d7d8-161e-44b7-a58e-19efea534815-utilities\") pod \"redhat-marketplace-gmvwz\" (UID: \"b048d7d8-161e-44b7-a58e-19efea534815\") " pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.799612 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b048d7d8-161e-44b7-a58e-19efea534815-catalog-content\") pod \"redhat-marketplace-gmvwz\" (UID: \"b048d7d8-161e-44b7-a58e-19efea534815\") " pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.799897 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b048d7d8-161e-44b7-a58e-19efea534815-utilities\") pod \"redhat-marketplace-gmvwz\" (UID: \"b048d7d8-161e-44b7-a58e-19efea534815\") " pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.818927 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4w5q7\" (UniqueName: \"kubernetes.io/projected/b048d7d8-161e-44b7-a58e-19efea534815-kube-api-access-4w5q7\") pod \"redhat-marketplace-gmvwz\" (UID: \"b048d7d8-161e-44b7-a58e-19efea534815\") " pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:33 crc kubenswrapper[4747]: I1205 23:18:33.981957 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:34 crc kubenswrapper[4747]: I1205 23:18:34.531527 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmvwz"] Dec 05 23:18:34 crc kubenswrapper[4747]: I1205 23:18:34.804537 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmvwz" event={"ID":"b048d7d8-161e-44b7-a58e-19efea534815","Type":"ContainerStarted","Data":"dde7b8cef56aba2527beb460a143be5d7af4f8a17da36760eb85f8e2f23c98f5"} Dec 05 23:18:34 crc kubenswrapper[4747]: I1205 23:18:34.804596 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmvwz" event={"ID":"b048d7d8-161e-44b7-a58e-19efea534815","Type":"ContainerStarted","Data":"d605f2cd44850a8b524c05eaf3267943b30bc0628d4550992d9597b3f3ee9008"} Dec 05 23:18:34 crc kubenswrapper[4747]: I1205 23:18:34.839931 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:18:34 crc kubenswrapper[4747]: E1205 23:18:34.840271 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:18:35 crc kubenswrapper[4747]: I1205 23:18:35.829799 4747 generic.go:334] "Generic (PLEG): container finished" podID="b048d7d8-161e-44b7-a58e-19efea534815" containerID="dde7b8cef56aba2527beb460a143be5d7af4f8a17da36760eb85f8e2f23c98f5" exitCode=0 Dec 05 23:18:35 crc kubenswrapper[4747]: I1205 23:18:35.830050 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmvwz" event={"ID":"b048d7d8-161e-44b7-a58e-19efea534815","Type":"ContainerDied","Data":"dde7b8cef56aba2527beb460a143be5d7af4f8a17da36760eb85f8e2f23c98f5"} Dec 05 23:18:35 crc kubenswrapper[4747]: I1205 23:18:35.830320 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmvwz" event={"ID":"b048d7d8-161e-44b7-a58e-19efea534815","Type":"ContainerStarted","Data":"7a0560305fb958ee3390f4a7fdefff874c85bb2e8cef70b7b03894b6b4b7aa4d"} Dec 05 23:18:36 crc kubenswrapper[4747]: I1205 23:18:36.847876 4747 generic.go:334] "Generic (PLEG): container finished" podID="b048d7d8-161e-44b7-a58e-19efea534815" containerID="7a0560305fb958ee3390f4a7fdefff874c85bb2e8cef70b7b03894b6b4b7aa4d" exitCode=0 Dec 05 23:18:36 crc kubenswrapper[4747]: I1205 23:18:36.848109 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmvwz" event={"ID":"b048d7d8-161e-44b7-a58e-19efea534815","Type":"ContainerDied","Data":"7a0560305fb958ee3390f4a7fdefff874c85bb2e8cef70b7b03894b6b4b7aa4d"} Dec 05 23:18:37 crc kubenswrapper[4747]: I1205 23:18:37.857980 4747 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-gmvwz" event={"ID":"b048d7d8-161e-44b7-a58e-19efea534815","Type":"ContainerStarted","Data":"4cebc7e5e90bb364d414e0e5cb746589b8420fff94ff5837790040eb1cbb7d69"} Dec 05 23:18:37 crc kubenswrapper[4747]: I1205 23:18:37.878703 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gmvwz" podStartSLOduration=2.433746942 podStartE2EDuration="4.878685188s" podCreationTimestamp="2025-12-05 23:18:33 +0000 UTC" firstStartedPulling="2025-12-05 23:18:34.80754714 +0000 UTC m=+9385.274854628" lastFinishedPulling="2025-12-05 23:18:37.252485376 +0000 UTC m=+9387.719792874" observedRunningTime="2025-12-05 23:18:37.875620182 +0000 UTC m=+9388.342927680" watchObservedRunningTime="2025-12-05 23:18:37.878685188 +0000 UTC m=+9388.345992676" Dec 05 23:18:43 crc kubenswrapper[4747]: I1205 23:18:43.982739 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:43 crc kubenswrapper[4747]: I1205 23:18:43.983235 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:44 crc kubenswrapper[4747]: I1205 23:18:44.038498 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:45 crc kubenswrapper[4747]: I1205 23:18:45.000754 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:45 crc kubenswrapper[4747]: I1205 23:18:45.059893 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmvwz"] Dec 05 23:18:46 crc kubenswrapper[4747]: I1205 23:18:46.957531 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gmvwz" podUID="b048d7d8-161e-44b7-a58e-19efea534815" containerName="registry-server" containerID="cri-o://4cebc7e5e90bb364d414e0e5cb746589b8420fff94ff5837790040eb1cbb7d69" gracePeriod=2 Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.400964 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.523686 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b048d7d8-161e-44b7-a58e-19efea534815-utilities\") pod \"b048d7d8-161e-44b7-a58e-19efea534815\" (UID: \"b048d7d8-161e-44b7-a58e-19efea534815\") " Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.523735 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w5q7\" (UniqueName: \"kubernetes.io/projected/b048d7d8-161e-44b7-a58e-19efea534815-kube-api-access-4w5q7\") pod \"b048d7d8-161e-44b7-a58e-19efea534815\" (UID: \"b048d7d8-161e-44b7-a58e-19efea534815\") " Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.523822 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b048d7d8-161e-44b7-a58e-19efea534815-catalog-content\") pod \"b048d7d8-161e-44b7-a58e-19efea534815\" (UID: \"b048d7d8-161e-44b7-a58e-19efea534815\") " Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.524656 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b048d7d8-161e-44b7-a58e-19efea534815-utilities" (OuterVolumeSpecName: "utilities") pod "b048d7d8-161e-44b7-a58e-19efea534815" (UID: "b048d7d8-161e-44b7-a58e-19efea534815"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.531106 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b048d7d8-161e-44b7-a58e-19efea534815-kube-api-access-4w5q7" (OuterVolumeSpecName: "kube-api-access-4w5q7") pod "b048d7d8-161e-44b7-a58e-19efea534815" (UID: "b048d7d8-161e-44b7-a58e-19efea534815"). InnerVolumeSpecName "kube-api-access-4w5q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.543778 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b048d7d8-161e-44b7-a58e-19efea534815-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b048d7d8-161e-44b7-a58e-19efea534815" (UID: "b048d7d8-161e-44b7-a58e-19efea534815"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.626548 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b048d7d8-161e-44b7-a58e-19efea534815-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.626585 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w5q7\" (UniqueName: \"kubernetes.io/projected/b048d7d8-161e-44b7-a58e-19efea534815-kube-api-access-4w5q7\") on node \"crc\" DevicePath \"\"" Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.626618 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b048d7d8-161e-44b7-a58e-19efea534815-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.839667 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:18:47 crc kubenswrapper[4747]: E1205 23:18:47.840023 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.972694 4747 generic.go:334] "Generic (PLEG): container finished" podID="b048d7d8-161e-44b7-a58e-19efea534815" containerID="4cebc7e5e90bb364d414e0e5cb746589b8420fff94ff5837790040eb1cbb7d69" exitCode=0 Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.972755 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmvwz" Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.972765 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmvwz" event={"ID":"b048d7d8-161e-44b7-a58e-19efea534815","Type":"ContainerDied","Data":"4cebc7e5e90bb364d414e0e5cb746589b8420fff94ff5837790040eb1cbb7d69"} Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.972817 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmvwz" event={"ID":"b048d7d8-161e-44b7-a58e-19efea534815","Type":"ContainerDied","Data":"d605f2cd44850a8b524c05eaf3267943b30bc0628d4550992d9597b3f3ee9008"} Dec 05 23:18:47 crc kubenswrapper[4747]: I1205 23:18:47.972849 4747 scope.go:117] "RemoveContainer" containerID="4cebc7e5e90bb364d414e0e5cb746589b8420fff94ff5837790040eb1cbb7d69" Dec 05 23:18:48 crc kubenswrapper[4747]: I1205 23:18:48.005433 4747 scope.go:117] "RemoveContainer" containerID="7a0560305fb958ee3390f4a7fdefff874c85bb2e8cef70b7b03894b6b4b7aa4d" Dec 05 23:18:48 crc kubenswrapper[4747]: I1205 23:18:48.010256 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmvwz"] Dec 05 23:18:48 crc kubenswrapper[4747]: I1205 23:18:48.027770 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmvwz"] Dec 05 23:18:48 crc kubenswrapper[4747]: I1205 23:18:48.035292 4747 scope.go:117] "RemoveContainer" containerID="dde7b8cef56aba2527beb460a143be5d7af4f8a17da36760eb85f8e2f23c98f5" Dec 05 23:18:48 crc kubenswrapper[4747]: I1205 23:18:48.116784 4747 scope.go:117] "RemoveContainer" containerID="4cebc7e5e90bb364d414e0e5cb746589b8420fff94ff5837790040eb1cbb7d69" Dec 05 23:18:48 crc kubenswrapper[4747]: E1205 23:18:48.117659 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cebc7e5e90bb364d414e0e5cb746589b8420fff94ff5837790040eb1cbb7d69\": container with ID starting with 4cebc7e5e90bb364d414e0e5cb746589b8420fff94ff5837790040eb1cbb7d69 not found: ID does not exist" containerID="4cebc7e5e90bb364d414e0e5cb746589b8420fff94ff5837790040eb1cbb7d69" Dec 05 23:18:48 crc kubenswrapper[4747]: I1205 23:18:48.117705 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cebc7e5e90bb364d414e0e5cb746589b8420fff94ff5837790040eb1cbb7d69"} err="failed to get container status \"4cebc7e5e90bb364d414e0e5cb746589b8420fff94ff5837790040eb1cbb7d69\": rpc error: code = NotFound desc = could not find container \"4cebc7e5e90bb364d414e0e5cb746589b8420fff94ff5837790040eb1cbb7d69\": container with ID starting with 4cebc7e5e90bb364d414e0e5cb746589b8420fff94ff5837790040eb1cbb7d69 not found: ID does not exist" Dec 05 23:18:48 crc kubenswrapper[4747]: I1205 23:18:48.117731 4747 scope.go:117] "RemoveContainer" containerID="7a0560305fb958ee3390f4a7fdefff874c85bb2e8cef70b7b03894b6b4b7aa4d" Dec 05 23:18:48 crc kubenswrapper[4747]: E1205 23:18:48.118561 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a0560305fb958ee3390f4a7fdefff874c85bb2e8cef70b7b03894b6b4b7aa4d\": container with ID starting with 7a0560305fb958ee3390f4a7fdefff874c85bb2e8cef70b7b03894b6b4b7aa4d not found: ID does not exist" containerID="7a0560305fb958ee3390f4a7fdefff874c85bb2e8cef70b7b03894b6b4b7aa4d" Dec 05 23:18:48 crc kubenswrapper[4747]: I1205 23:18:48.118610 4747 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a0560305fb958ee3390f4a7fdefff874c85bb2e8cef70b7b03894b6b4b7aa4d"} err="failed to get container status \"7a0560305fb958ee3390f4a7fdefff874c85bb2e8cef70b7b03894b6b4b7aa4d\": rpc error: code = NotFound desc = could not find container \"7a0560305fb958ee3390f4a7fdefff874c85bb2e8cef70b7b03894b6b4b7aa4d\": container with ID starting with 7a0560305fb958ee3390f4a7fdefff874c85bb2e8cef70b7b03894b6b4b7aa4d not found: ID does not exist" Dec 05 23:18:48 crc kubenswrapper[4747]: I1205 23:18:48.118633 4747 scope.go:117] "RemoveContainer" containerID="dde7b8cef56aba2527beb460a143be5d7af4f8a17da36760eb85f8e2f23c98f5" Dec 05 23:18:48 crc kubenswrapper[4747]: E1205 23:18:48.121909 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde7b8cef56aba2527beb460a143be5d7af4f8a17da36760eb85f8e2f23c98f5\": container with ID starting with dde7b8cef56aba2527beb460a143be5d7af4f8a17da36760eb85f8e2f23c98f5 not found: ID does not exist" containerID="dde7b8cef56aba2527beb460a143be5d7af4f8a17da36760eb85f8e2f23c98f5" Dec 05 23:18:48 crc kubenswrapper[4747]: I1205 23:18:48.122127 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde7b8cef56aba2527beb460a143be5d7af4f8a17da36760eb85f8e2f23c98f5"} err="failed to get container status \"dde7b8cef56aba2527beb460a143be5d7af4f8a17da36760eb85f8e2f23c98f5\": rpc error: code = NotFound desc = could not find container \"dde7b8cef56aba2527beb460a143be5d7af4f8a17da36760eb85f8e2f23c98f5\": container with ID starting with dde7b8cef56aba2527beb460a143be5d7af4f8a17da36760eb85f8e2f23c98f5 not found: ID does not exist" Dec 05 23:18:49 crc kubenswrapper[4747]: I1205 23:18:49.869479 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b048d7d8-161e-44b7-a58e-19efea534815" path="/var/lib/kubelet/pods/b048d7d8-161e-44b7-a58e-19efea534815/volumes" Dec 05 23:19:01 crc kubenswrapper[4747]: I1205 23:19:01.839790 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:19:01 crc kubenswrapper[4747]: E1205 23:19:01.840606 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:19:12 crc kubenswrapper[4747]: I1205 23:19:12.839761 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:19:12 crc kubenswrapper[4747]: E1205 23:19:12.840569 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:19:25 crc kubenswrapper[4747]: I1205 23:19:25.839846 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:19:25 crc 
kubenswrapper[4747]: E1205 23:19:25.840502 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:19:39 crc kubenswrapper[4747]: I1205 23:19:39.857058 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c" Dec 05 23:19:40 crc kubenswrapper[4747]: I1205 23:19:40.615887 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"70f1a4978d91903a09990a1a3faa7c6b55da64f90b48d275073c095c42aaf7bd"} Dec 05 23:21:19 crc kubenswrapper[4747]: I1205 23:21:19.714835 4747 generic.go:334] "Generic (PLEG): container finished" podID="150d0c65-26e4-483f-aaf6-72d7efe808a3" containerID="7eddc0ad26ba71da0d055e9c70beccd42d8e7d336032e07beed8932b043139ae" exitCode=0 Dec 05 23:21:19 crc kubenswrapper[4747]: I1205 23:21:19.714902 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" event={"ID":"150d0c65-26e4-483f-aaf6-72d7efe808a3","Type":"ContainerDied","Data":"7eddc0ad26ba71da0d055e9c70beccd42d8e7d336032e07beed8932b043139ae"} Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.254865 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.385282 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x8f2\" (UniqueName: \"kubernetes.io/projected/150d0c65-26e4-483f-aaf6-72d7efe808a3-kube-api-access-8x8f2\") pod \"150d0c65-26e4-483f-aaf6-72d7efe808a3\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.385681 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-neutron-dhcp-combined-ca-bundle\") pod \"150d0c65-26e4-483f-aaf6-72d7efe808a3\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.385710 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-inventory\") pod \"150d0c65-26e4-483f-aaf6-72d7efe808a3\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.385750 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-ssh-key\") pod \"150d0c65-26e4-483f-aaf6-72d7efe808a3\" (UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.385836 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-neutron-dhcp-agent-neutron-config-0\") pod \"150d0c65-26e4-483f-aaf6-72d7efe808a3\" 
(UID: \"150d0c65-26e4-483f-aaf6-72d7efe808a3\") " Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.390603 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "150d0c65-26e4-483f-aaf6-72d7efe808a3" (UID: "150d0c65-26e4-483f-aaf6-72d7efe808a3"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.391180 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/150d0c65-26e4-483f-aaf6-72d7efe808a3-kube-api-access-8x8f2" (OuterVolumeSpecName: "kube-api-access-8x8f2") pod "150d0c65-26e4-483f-aaf6-72d7efe808a3" (UID: "150d0c65-26e4-483f-aaf6-72d7efe808a3"). InnerVolumeSpecName "kube-api-access-8x8f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.416442 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-inventory" (OuterVolumeSpecName: "inventory") pod "150d0c65-26e4-483f-aaf6-72d7efe808a3" (UID: "150d0c65-26e4-483f-aaf6-72d7efe808a3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.419861 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "150d0c65-26e4-483f-aaf6-72d7efe808a3" (UID: "150d0c65-26e4-483f-aaf6-72d7efe808a3"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.442525 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "150d0c65-26e4-483f-aaf6-72d7efe808a3" (UID: "150d0c65-26e4-483f-aaf6-72d7efe808a3"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.488022 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x8f2\" (UniqueName: \"kubernetes.io/projected/150d0c65-26e4-483f-aaf6-72d7efe808a3-kube-api-access-8x8f2\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.488058 4747 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.488070 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.488079 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.488087 4747 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/150d0c65-26e4-483f-aaf6-72d7efe808a3-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.741633 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" event={"ID":"150d0c65-26e4-483f-aaf6-72d7efe808a3","Type":"ContainerDied","Data":"030e2f560826740701e78ebe3e5bcfae183e47fafa0c89b59e47089ced6292ff"} Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.741716 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="030e2f560826740701e78ebe3e5bcfae183e47fafa0c89b59e47089ced6292ff" Dec 05 23:21:21 crc kubenswrapper[4747]: I1205 23:21:21.741800 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-9txj2" Dec 05 23:21:40 crc kubenswrapper[4747]: I1205 23:21:40.280668 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 23:21:40 crc kubenswrapper[4747]: I1205 23:21:40.281488 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="46ffb070-be50-4f11-a1a6-ab2c958f1fb1" containerName="nova-cell0-conductor-conductor" containerID="cri-o://f54319f260021d4cfd8b47e01a0e119217532abc6e284ee4505758961ce2bea9" gracePeriod=30 Dec 05 23:21:40 crc kubenswrapper[4747]: I1205 23:21:40.904479 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 23:21:40 crc kubenswrapper[4747]: I1205 23:21:40.904910 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="2d56e61d-3374-4b9b-8062-98ece9f4cb96" containerName="nova-cell1-conductor-conductor" containerID="cri-o://dc552af861c97729dd8b2077f62a76f515c5f5c22ec0458cd9cd5b65f640c825" gracePeriod=30 Dec 05 23:21:40 crc kubenswrapper[4747]: I1205 23:21:40.953609 4747 generic.go:334] "Generic (PLEG): container finished" podID="46ffb070-be50-4f11-a1a6-ab2c958f1fb1" containerID="f54319f260021d4cfd8b47e01a0e119217532abc6e284ee4505758961ce2bea9" exitCode=0 Dec 05 23:21:40 crc kubenswrapper[4747]: I1205 23:21:40.953656 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"46ffb070-be50-4f11-a1a6-ab2c958f1fb1","Type":"ContainerDied","Data":"f54319f260021d4cfd8b47e01a0e119217532abc6e284ee4505758961ce2bea9"} Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.056178 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.056407 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6566b8ad-86e6-4bf4-bce0-eb43d671d2c5" containerName="nova-scheduler-scheduler" containerID="cri-o://45d6e9122d67d75705df7d99aa588243cbc4d1db236aee879c920c49c5932d95" gracePeriod=30 Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.072933 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.073298 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8c37dfb7-6f76-43d5-b699-320372d1c35b" containerName="nova-api-api" containerID="cri-o://e8b9709959f131e7489b8cd63e9a52cd8973922d616ca8018fa61fe10fefa266" gracePeriod=30 Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.073258 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8c37dfb7-6f76-43d5-b699-320372d1c35b" containerName="nova-api-log" containerID="cri-o://3213ee748a4ef5c991530470015f6722cab8f4de8d1ad6487e1358fe41b171f0" gracePeriod=30 Dec 05 23:21:41 crc kubenswrapper[4747]: E1205 23:21:41.139429 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dc552af861c97729dd8b2077f62a76f515c5f5c22ec0458cd9cd5b65f640c825" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 23:21:41 crc kubenswrapper[4747]: E1205 23:21:41.140544 4747 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dc552af861c97729dd8b2077f62a76f515c5f5c22ec0458cd9cd5b65f640c825" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 23:21:41 crc kubenswrapper[4747]: E1205 23:21:41.141681 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dc552af861c97729dd8b2077f62a76f515c5f5c22ec0458cd9cd5b65f640c825" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 23:21:41 crc kubenswrapper[4747]: E1205 23:21:41.141725 4747 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="2d56e61d-3374-4b9b-8062-98ece9f4cb96" containerName="nova-cell1-conductor-conductor" Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.154239 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.154486 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="69a28134-11bc-411d-b294-87782bf28560" containerName="nova-metadata-log" containerID="cri-o://2480e7522c51b23c99de4b385425e590f4aaadfe63180019d1de74e63234aaf2" gracePeriod=30 Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.154547 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="69a28134-11bc-411d-b294-87782bf28560" containerName="nova-metadata-metadata" containerID="cri-o://f77a9f91a96faba41eb76d1bf1fcdf75944f7141e569bb9e721b2ef8d9b12447" gracePeriod=30 Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.359129 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.473413 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-combined-ca-bundle\") pod \"46ffb070-be50-4f11-a1a6-ab2c958f1fb1\" (UID: \"46ffb070-be50-4f11-a1a6-ab2c958f1fb1\") " Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.473798 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-config-data\") pod \"46ffb070-be50-4f11-a1a6-ab2c958f1fb1\" (UID: \"46ffb070-be50-4f11-a1a6-ab2c958f1fb1\") " Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.474040 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvmjg\" (UniqueName: \"kubernetes.io/projected/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-kube-api-access-fvmjg\") pod \"46ffb070-be50-4f11-a1a6-ab2c958f1fb1\" (UID: \"46ffb070-be50-4f11-a1a6-ab2c958f1fb1\") " Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.491339 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-kube-api-access-fvmjg" (OuterVolumeSpecName: "kube-api-access-fvmjg") pod "46ffb070-be50-4f11-a1a6-ab2c958f1fb1" (UID: "46ffb070-be50-4f11-a1a6-ab2c958f1fb1"). 
InnerVolumeSpecName "kube-api-access-fvmjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.515680 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46ffb070-be50-4f11-a1a6-ab2c958f1fb1" (UID: "46ffb070-be50-4f11-a1a6-ab2c958f1fb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.525714 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-config-data" (OuterVolumeSpecName: "config-data") pod "46ffb070-be50-4f11-a1a6-ab2c958f1fb1" (UID: "46ffb070-be50-4f11-a1a6-ab2c958f1fb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.576092 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.576128 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.576138 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvmjg\" (UniqueName: \"kubernetes.io/projected/46ffb070-be50-4f11-a1a6-ab2c958f1fb1-kube-api-access-fvmjg\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.988949 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.989421 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"46ffb070-be50-4f11-a1a6-ab2c958f1fb1","Type":"ContainerDied","Data":"2240df827285bdd5668868ad97c3418945219f11ed96dc942bd3d01efee0f121"} Dec 05 23:21:41 crc kubenswrapper[4747]: I1205 23:21:41.989468 4747 scope.go:117] "RemoveContainer" containerID="f54319f260021d4cfd8b47e01a0e119217532abc6e284ee4505758961ce2bea9" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.014259 4747 generic.go:334] "Generic (PLEG): container finished" podID="69a28134-11bc-411d-b294-87782bf28560" containerID="2480e7522c51b23c99de4b385425e590f4aaadfe63180019d1de74e63234aaf2" exitCode=143 Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.014601 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69a28134-11bc-411d-b294-87782bf28560","Type":"ContainerDied","Data":"2480e7522c51b23c99de4b385425e590f4aaadfe63180019d1de74e63234aaf2"} Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.016949 4747 generic.go:334] "Generic (PLEG): container finished" podID="2d56e61d-3374-4b9b-8062-98ece9f4cb96" containerID="dc552af861c97729dd8b2077f62a76f515c5f5c22ec0458cd9cd5b65f640c825" exitCode=0 Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.017009 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2d56e61d-3374-4b9b-8062-98ece9f4cb96","Type":"ContainerDied","Data":"dc552af861c97729dd8b2077f62a76f515c5f5c22ec0458cd9cd5b65f640c825"} Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.021063 4747 generic.go:334] "Generic (PLEG): container finished" podID="8c37dfb7-6f76-43d5-b699-320372d1c35b" containerID="3213ee748a4ef5c991530470015f6722cab8f4de8d1ad6487e1358fe41b171f0" exitCode=143 Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.021298 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c37dfb7-6f76-43d5-b699-320372d1c35b","Type":"ContainerDied","Data":"3213ee748a4ef5c991530470015f6722cab8f4de8d1ad6487e1358fe41b171f0"} Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.025903 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.035609 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.060568 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 23:21:42 crc kubenswrapper[4747]: E1205 23:21:42.061044 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b048d7d8-161e-44b7-a58e-19efea534815" containerName="extract-content" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.061063 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b048d7d8-161e-44b7-a58e-19efea534815" containerName="extract-content" Dec 05 23:21:42 crc kubenswrapper[4747]: E1205 23:21:42.061083 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b048d7d8-161e-44b7-a58e-19efea534815" containerName="registry-server" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.061090 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b048d7d8-161e-44b7-a58e-19efea534815" containerName="registry-server" Dec 05 23:21:42 crc kubenswrapper[4747]: E1205 23:21:42.061121 4747 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="150d0c65-26e4-483f-aaf6-72d7efe808a3" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.061128 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="150d0c65-26e4-483f-aaf6-72d7efe808a3" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 05 23:21:42 crc kubenswrapper[4747]: E1205 23:21:42.061142 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46ffb070-be50-4f11-a1a6-ab2c958f1fb1" containerName="nova-cell0-conductor-conductor" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.061148 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="46ffb070-be50-4f11-a1a6-ab2c958f1fb1" containerName="nova-cell0-conductor-conductor" Dec 05 23:21:42 crc kubenswrapper[4747]: E1205 23:21:42.061159 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b048d7d8-161e-44b7-a58e-19efea534815" containerName="extract-utilities" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.061164 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b048d7d8-161e-44b7-a58e-19efea534815" containerName="extract-utilities" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.061379 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="46ffb070-be50-4f11-a1a6-ab2c958f1fb1" containerName="nova-cell0-conductor-conductor" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.061389 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="150d0c65-26e4-483f-aaf6-72d7efe808a3" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.061420 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b048d7d8-161e-44b7-a58e-19efea534815" containerName="registry-server" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.062159 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.064176 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.072821 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.194765 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b9536e5-5379-4d1e-acd6-92259c28a784-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4b9536e5-5379-4d1e-acd6-92259c28a784\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.194820 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4qj7\" (UniqueName: \"kubernetes.io/projected/4b9536e5-5379-4d1e-acd6-92259c28a784-kube-api-access-q4qj7\") pod \"nova-cell0-conductor-0\" (UID: \"4b9536e5-5379-4d1e-acd6-92259c28a784\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.194845 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b9536e5-5379-4d1e-acd6-92259c28a784-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4b9536e5-5379-4d1e-acd6-92259c28a784\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.305127 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b9536e5-5379-4d1e-acd6-92259c28a784-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4b9536e5-5379-4d1e-acd6-92259c28a784\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.305195 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4qj7\" (UniqueName: \"kubernetes.io/projected/4b9536e5-5379-4d1e-acd6-92259c28a784-kube-api-access-q4qj7\") pod \"nova-cell0-conductor-0\" (UID: \"4b9536e5-5379-4d1e-acd6-92259c28a784\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.305222 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b9536e5-5379-4d1e-acd6-92259c28a784-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4b9536e5-5379-4d1e-acd6-92259c28a784\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.312321 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b9536e5-5379-4d1e-acd6-92259c28a784-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4b9536e5-5379-4d1e-acd6-92259c28a784\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.313001 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b9536e5-5379-4d1e-acd6-92259c28a784-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4b9536e5-5379-4d1e-acd6-92259c28a784\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.321326 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4qj7\" (UniqueName: \"kubernetes.io/projected/4b9536e5-5379-4d1e-acd6-92259c28a784-kube-api-access-q4qj7\") pod \"nova-cell0-conductor-0\" (UID: \"4b9536e5-5379-4d1e-acd6-92259c28a784\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.437427 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.503630 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.611560 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d56e61d-3374-4b9b-8062-98ece9f4cb96-combined-ca-bundle\") pod \"2d56e61d-3374-4b9b-8062-98ece9f4cb96\" (UID: \"2d56e61d-3374-4b9b-8062-98ece9f4cb96\") " Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.611654 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d56e61d-3374-4b9b-8062-98ece9f4cb96-config-data\") pod \"2d56e61d-3374-4b9b-8062-98ece9f4cb96\" (UID: \"2d56e61d-3374-4b9b-8062-98ece9f4cb96\") " Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.611773 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45rtp\" (UniqueName: \"kubernetes.io/projected/2d56e61d-3374-4b9b-8062-98ece9f4cb96-kube-api-access-45rtp\") pod \"2d56e61d-3374-4b9b-8062-98ece9f4cb96\" (UID: \"2d56e61d-3374-4b9b-8062-98ece9f4cb96\") " Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.619815 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d56e61d-3374-4b9b-8062-98ece9f4cb96-kube-api-access-45rtp" (OuterVolumeSpecName: "kube-api-access-45rtp") pod "2d56e61d-3374-4b9b-8062-98ece9f4cb96" (UID: "2d56e61d-3374-4b9b-8062-98ece9f4cb96"). InnerVolumeSpecName "kube-api-access-45rtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.647901 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d56e61d-3374-4b9b-8062-98ece9f4cb96-config-data" (OuterVolumeSpecName: "config-data") pod "2d56e61d-3374-4b9b-8062-98ece9f4cb96" (UID: "2d56e61d-3374-4b9b-8062-98ece9f4cb96"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.651220 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d56e61d-3374-4b9b-8062-98ece9f4cb96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d56e61d-3374-4b9b-8062-98ece9f4cb96" (UID: "2d56e61d-3374-4b9b-8062-98ece9f4cb96"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.713882 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45rtp\" (UniqueName: \"kubernetes.io/projected/2d56e61d-3374-4b9b-8062-98ece9f4cb96-kube-api-access-45rtp\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.713915 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d56e61d-3374-4b9b-8062-98ece9f4cb96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.713924 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d56e61d-3374-4b9b-8062-98ece9f4cb96-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.786660 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.916798 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk5rs\" (UniqueName: \"kubernetes.io/projected/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-kube-api-access-lk5rs\") pod \"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5\" (UID: \"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5\") " Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.917079 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-combined-ca-bundle\") pod \"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5\" (UID: \"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5\") " Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.917136 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-config-data\") pod \"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5\" (UID: \"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5\") " Dec 05 23:21:42 crc kubenswrapper[4747]: I1205 23:21:42.992466 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.032426 4747 generic.go:334] "Generic (PLEG): container finished" podID="6566b8ad-86e6-4bf4-bce0-eb43d671d2c5" containerID="45d6e9122d67d75705df7d99aa588243cbc4d1db236aee879c920c49c5932d95" exitCode=0 Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.032519 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.032547 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5","Type":"ContainerDied","Data":"45d6e9122d67d75705df7d99aa588243cbc4d1db236aee879c920c49c5932d95"} Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.033498 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6566b8ad-86e6-4bf4-bce0-eb43d671d2c5","Type":"ContainerDied","Data":"1d297ddff21c1d19b996116684468cb70488e6a428c231f1f3a131f1a5737b8e"} Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.033555 4747 scope.go:117] "RemoveContainer" containerID="45d6e9122d67d75705df7d99aa588243cbc4d1db236aee879c920c49c5932d95" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.035906 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2d56e61d-3374-4b9b-8062-98ece9f4cb96","Type":"ContainerDied","Data":"582837c99b3b5df5c9c0d3cda74b5facfd46862decac69e120f70096282b2107"} Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.035956 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.483980 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-kube-api-access-lk5rs" (OuterVolumeSpecName: "kube-api-access-lk5rs") pod "6566b8ad-86e6-4bf4-bce0-eb43d671d2c5" (UID: "6566b8ad-86e6-4bf4-bce0-eb43d671d2c5"). InnerVolumeSpecName "kube-api-access-lk5rs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.536898 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk5rs\" (UniqueName: \"kubernetes.io/projected/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-kube-api-access-lk5rs\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.539860 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6566b8ad-86e6-4bf4-bce0-eb43d671d2c5" (UID: "6566b8ad-86e6-4bf4-bce0-eb43d671d2c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.542100 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-config-data" (OuterVolumeSpecName: "config-data") pod "6566b8ad-86e6-4bf4-bce0-eb43d671d2c5" (UID: "6566b8ad-86e6-4bf4-bce0-eb43d671d2c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.639999 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.640028 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.752705 4747 scope.go:117] "RemoveContainer" containerID="45d6e9122d67d75705df7d99aa588243cbc4d1db236aee879c920c49c5932d95" Dec 05 23:21:43 crc kubenswrapper[4747]: E1205 23:21:43.753946 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45d6e9122d67d75705df7d99aa588243cbc4d1db236aee879c920c49c5932d95\": container with ID starting with 45d6e9122d67d75705df7d99aa588243cbc4d1db236aee879c920c49c5932d95 not found: ID does not exist" containerID="45d6e9122d67d75705df7d99aa588243cbc4d1db236aee879c920c49c5932d95" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.753980 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d6e9122d67d75705df7d99aa588243cbc4d1db236aee879c920c49c5932d95"} err="failed to get container status \"45d6e9122d67d75705df7d99aa588243cbc4d1db236aee879c920c49c5932d95\": rpc error: code = NotFound desc = could not find container \"45d6e9122d67d75705df7d99aa588243cbc4d1db236aee879c920c49c5932d95\": container with ID starting with 45d6e9122d67d75705df7d99aa588243cbc4d1db236aee879c920c49c5932d95 not found: ID does not exist" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.754004 4747 scope.go:117] "RemoveContainer" containerID="dc552af861c97729dd8b2077f62a76f515c5f5c22ec0458cd9cd5b65f640c825" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.782621 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.816670 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.826427 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.882538 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46ffb070-be50-4f11-a1a6-ab2c958f1fb1" path="/var/lib/kubelet/pods/46ffb070-be50-4f11-a1a6-ab2c958f1fb1/volumes" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.883090 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6566b8ad-86e6-4bf4-bce0-eb43d671d2c5" path="/var/lib/kubelet/pods/6566b8ad-86e6-4bf4-bce0-eb43d671d2c5/volumes" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.883952 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.884492 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:21:43 crc kubenswrapper[4747]: E1205 23:21:43.885335 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d56e61d-3374-4b9b-8062-98ece9f4cb96" containerName="nova-cell1-conductor-conductor" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.885436 
4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d56e61d-3374-4b9b-8062-98ece9f4cb96" containerName="nova-cell1-conductor-conductor" Dec 05 23:21:43 crc kubenswrapper[4747]: E1205 23:21:43.885526 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6566b8ad-86e6-4bf4-bce0-eb43d671d2c5" containerName="nova-scheduler-scheduler" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.885599 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6566b8ad-86e6-4bf4-bce0-eb43d671d2c5" containerName="nova-scheduler-scheduler" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.885895 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d56e61d-3374-4b9b-8062-98ece9f4cb96" containerName="nova-cell1-conductor-conductor" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.885980 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6566b8ad-86e6-4bf4-bce0-eb43d671d2c5" containerName="nova-scheduler-scheduler" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.886742 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.888555 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.896538 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.902413 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.906812 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.908768 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:21:43 crc kubenswrapper[4747]: I1205 23:21:43.922217 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.047559 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4b9536e5-5379-4d1e-acd6-92259c28a784","Type":"ContainerStarted","Data":"fe48a316062e5901df8efac00b248b25ae15203e6d7ba2562da20cb7709a8d88"} Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.047618 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4b9536e5-5379-4d1e-acd6-92259c28a784","Type":"ContainerStarted","Data":"fd4099302d9214d2343a8273aacfa4a05f38d5c4dc93079966fcce891d00a036"} Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.048948 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.068914 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjsg6\" (UniqueName: \"kubernetes.io/projected/151ab70a-f3ce-4993-9c10-d0674b8208eb-kube-api-access-kjsg6\") pod \"nova-cell1-conductor-0\" (UID: \"151ab70a-f3ce-4993-9c10-d0674b8208eb\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.068980 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/151ab70a-f3ce-4993-9c10-d0674b8208eb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"151ab70a-f3ce-4993-9c10-d0674b8208eb\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.069021 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf4h8\" (UniqueName: \"kubernetes.io/projected/ab9ee7da-b527-4208-a7c5-51a6b1a2db01-kube-api-access-vf4h8\") pod \"nova-scheduler-0\" (UID: \"ab9ee7da-b527-4208-a7c5-51a6b1a2db01\") " pod="openstack/nova-scheduler-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.069049 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9ee7da-b527-4208-a7c5-51a6b1a2db01-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ab9ee7da-b527-4208-a7c5-51a6b1a2db01\") " pod="openstack/nova-scheduler-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.069106 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9ee7da-b527-4208-a7c5-51a6b1a2db01-config-data\") pod \"nova-scheduler-0\" (UID: \"ab9ee7da-b527-4208-a7c5-51a6b1a2db01\") " pod="openstack/nova-scheduler-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.069131 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151ab70a-f3ce-4993-9c10-d0674b8208eb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"151ab70a-f3ce-4993-9c10-d0674b8208eb\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.071625 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.071603327 podStartE2EDuration="2.071603327s" podCreationTimestamp="2025-12-05 23:21:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:44.069434003 +0000 UTC m=+9574.536741491" watchObservedRunningTime="2025-12-05 23:21:44.071603327 +0000 UTC m=+9574.538910825" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.171664 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9ee7da-b527-4208-a7c5-51a6b1a2db01-config-data\") pod \"nova-scheduler-0\" (UID: \"ab9ee7da-b527-4208-a7c5-51a6b1a2db01\") " pod="openstack/nova-scheduler-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.171751 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151ab70a-f3ce-4993-9c10-d0674b8208eb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"151ab70a-f3ce-4993-9c10-d0674b8208eb\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.171938 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjsg6\" (UniqueName: \"kubernetes.io/projected/151ab70a-f3ce-4993-9c10-d0674b8208eb-kube-api-access-kjsg6\") pod \"nova-cell1-conductor-0\" (UID: \"151ab70a-f3ce-4993-9c10-d0674b8208eb\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.173156 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151ab70a-f3ce-4993-9c10-d0674b8208eb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"151ab70a-f3ce-4993-9c10-d0674b8208eb\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.174361 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf4h8\" (UniqueName: \"kubernetes.io/projected/ab9ee7da-b527-4208-a7c5-51a6b1a2db01-kube-api-access-vf4h8\") pod \"nova-scheduler-0\" (UID: \"ab9ee7da-b527-4208-a7c5-51a6b1a2db01\") " pod="openstack/nova-scheduler-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.174443 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9ee7da-b527-4208-a7c5-51a6b1a2db01-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ab9ee7da-b527-4208-a7c5-51a6b1a2db01\") " pod="openstack/nova-scheduler-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.178519 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9ee7da-b527-4208-a7c5-51a6b1a2db01-config-data\") pod \"nova-scheduler-0\" (UID: \"ab9ee7da-b527-4208-a7c5-51a6b1a2db01\") " pod="openstack/nova-scheduler-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.179296 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151ab70a-f3ce-4993-9c10-d0674b8208eb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"151ab70a-f3ce-4993-9c10-d0674b8208eb\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.179618 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9ee7da-b527-4208-a7c5-51a6b1a2db01-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ab9ee7da-b527-4208-a7c5-51a6b1a2db01\") " pod="openstack/nova-scheduler-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.180844 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151ab70a-f3ce-4993-9c10-d0674b8208eb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"151ab70a-f3ce-4993-9c10-d0674b8208eb\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.193344 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjsg6\" (UniqueName: \"kubernetes.io/projected/151ab70a-f3ce-4993-9c10-d0674b8208eb-kube-api-access-kjsg6\") pod \"nova-cell1-conductor-0\" (UID: \"151ab70a-f3ce-4993-9c10-d0674b8208eb\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.196513 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf4h8\" (UniqueName: \"kubernetes.io/projected/ab9ee7da-b527-4208-a7c5-51a6b1a2db01-kube-api-access-vf4h8\") pod \"nova-scheduler-0\" (UID: \"ab9ee7da-b527-4208-a7c5-51a6b1a2db01\") " pod="openstack/nova-scheduler-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.220772 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.229380 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.607841 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="69a28134-11bc-411d-b294-87782bf28560" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.90:8775/\": read tcp 10.217.0.2:54164->10.217.1.90:8775: read: connection reset by peer" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.607901 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="69a28134-11bc-411d-b294-87782bf28560" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.90:8775/\": read tcp 10.217.0.2:54166->10.217.1.90:8775: read: connection reset by peer" Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.873009 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 23:21:44 crc kubenswrapper[4747]: I1205 23:21:44.940084 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.064648 4747 generic.go:334] "Generic (PLEG): container finished" podID="8c37dfb7-6f76-43d5-b699-320372d1c35b" containerID="e8b9709959f131e7489b8cd63e9a52cd8973922d616ca8018fa61fe10fefa266" exitCode=0 Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.064746 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c37dfb7-6f76-43d5-b699-320372d1c35b","Type":"ContainerDied","Data":"e8b9709959f131e7489b8cd63e9a52cd8973922d616ca8018fa61fe10fefa266"} Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.064818 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c37dfb7-6f76-43d5-b699-320372d1c35b","Type":"ContainerDied","Data":"a8e50c1820f3b8951531ef214e397397a8108506a821ba355a12720a228dd109"} Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.064840 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8e50c1820f3b8951531ef214e397397a8108506a821ba355a12720a228dd109" Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.067668 4747 generic.go:334] "Generic (PLEG): container finished" podID="69a28134-11bc-411d-b294-87782bf28560" containerID="f77a9f91a96faba41eb76d1bf1fcdf75944f7141e569bb9e721b2ef8d9b12447" exitCode=0 Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.067753 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69a28134-11bc-411d-b294-87782bf28560","Type":"ContainerDied","Data":"f77a9f91a96faba41eb76d1bf1fcdf75944f7141e569bb9e721b2ef8d9b12447"} Dec 05 23:21:45 crc kubenswrapper[4747]: W1205 23:21:45.387723 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod151ab70a_f3ce_4993_9c10_d0674b8208eb.slice/crio-ea13f1bdac16762763f11cec41b238f021eaaf4cbc319917cf77fed4993d5eea WatchSource:0}: Error finding container ea13f1bdac16762763f11cec41b238f021eaaf4cbc319917cf77fed4993d5eea: Status 404 returned error can't find the container with id ea13f1bdac16762763f11cec41b238f021eaaf4cbc319917cf77fed4993d5eea Dec 05 23:21:45 crc kubenswrapper[4747]: W1205 23:21:45.390953 4747 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab9ee7da_b527_4208_a7c5_51a6b1a2db01.slice/crio-46bfce52a19b95c99835e53b3e65b6d8878ee197320940b563913070afbfbc31 WatchSource:0}: Error finding container 46bfce52a19b95c99835e53b3e65b6d8878ee197320940b563913070afbfbc31: Status 404 returned error can't find the container with id 46bfce52a19b95c99835e53b3e65b6d8878ee197320940b563913070afbfbc31 Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.712185 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.748285 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.820238 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c37dfb7-6f76-43d5-b699-320372d1c35b-logs\") pod \"8c37dfb7-6f76-43d5-b699-320372d1c35b\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.820488 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn5sr\" (UniqueName: \"kubernetes.io/projected/8c37dfb7-6f76-43d5-b699-320372d1c35b-kube-api-access-nn5sr\") pod \"8c37dfb7-6f76-43d5-b699-320372d1c35b\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.821064 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-combined-ca-bundle\") pod \"8c37dfb7-6f76-43d5-b699-320372d1c35b\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.821213 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-internal-tls-certs\") pod \"8c37dfb7-6f76-43d5-b699-320372d1c35b\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.821371 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-public-tls-certs\") pod \"8c37dfb7-6f76-43d5-b699-320372d1c35b\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.821474 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-config-data\") pod \"8c37dfb7-6f76-43d5-b699-320372d1c35b\" (UID: \"8c37dfb7-6f76-43d5-b699-320372d1c35b\") " Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.821072 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c37dfb7-6f76-43d5-b699-320372d1c35b-logs" (OuterVolumeSpecName: "logs") pod "8c37dfb7-6f76-43d5-b699-320372d1c35b" (UID: "8c37dfb7-6f76-43d5-b699-320372d1c35b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.846872 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c37dfb7-6f76-43d5-b699-320372d1c35b-kube-api-access-nn5sr" (OuterVolumeSpecName: "kube-api-access-nn5sr") pod "8c37dfb7-6f76-43d5-b699-320372d1c35b" (UID: "8c37dfb7-6f76-43d5-b699-320372d1c35b"). InnerVolumeSpecName "kube-api-access-nn5sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.886609 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d56e61d-3374-4b9b-8062-98ece9f4cb96" path="/var/lib/kubelet/pods/2d56e61d-3374-4b9b-8062-98ece9f4cb96/volumes" Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.931839 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69a28134-11bc-411d-b294-87782bf28560-logs\") pod \"69a28134-11bc-411d-b294-87782bf28560\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.931888 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-config-data\") pod \"69a28134-11bc-411d-b294-87782bf28560\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.931952 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xdsr\" (UniqueName: \"kubernetes.io/projected/69a28134-11bc-411d-b294-87782bf28560-kube-api-access-8xdsr\") pod \"69a28134-11bc-411d-b294-87782bf28560\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.932008 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-nova-metadata-tls-certs\") pod \"69a28134-11bc-411d-b294-87782bf28560\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.932056 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-combined-ca-bundle\") pod \"69a28134-11bc-411d-b294-87782bf28560\" (UID: \"69a28134-11bc-411d-b294-87782bf28560\") " Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.932462 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c37dfb7-6f76-43d5-b699-320372d1c35b-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.932479 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn5sr\" (UniqueName: \"kubernetes.io/projected/8c37dfb7-6f76-43d5-b699-320372d1c35b-kube-api-access-nn5sr\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:45 crc kubenswrapper[4747]: I1205 23:21:45.932548 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a28134-11bc-411d-b294-87782bf28560-logs" (OuterVolumeSpecName: "logs") pod "69a28134-11bc-411d-b294-87782bf28560" (UID: "69a28134-11bc-411d-b294-87782bf28560"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.040019 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69a28134-11bc-411d-b294-87782bf28560-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.065869 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a28134-11bc-411d-b294-87782bf28560-kube-api-access-8xdsr" (OuterVolumeSpecName: "kube-api-access-8xdsr") pod "69a28134-11bc-411d-b294-87782bf28560" (UID: "69a28134-11bc-411d-b294-87782bf28560"). InnerVolumeSpecName "kube-api-access-8xdsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.085689 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"151ab70a-f3ce-4993-9c10-d0674b8208eb","Type":"ContainerStarted","Data":"ea13f1bdac16762763f11cec41b238f021eaaf4cbc319917cf77fed4993d5eea"} Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.087434 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ab9ee7da-b527-4208-a7c5-51a6b1a2db01","Type":"ContainerStarted","Data":"46bfce52a19b95c99835e53b3e65b6d8878ee197320940b563913070afbfbc31"} Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.090284 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.091194 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.091382 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69a28134-11bc-411d-b294-87782bf28560","Type":"ContainerDied","Data":"9754cd8b21a0fb1f8490cfa4c4abf3a492166e7d3c24c2131504af56a54d90c6"} Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.091414 4747 scope.go:117] "RemoveContainer" containerID="f77a9f91a96faba41eb76d1bf1fcdf75944f7141e569bb9e721b2ef8d9b12447" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.093700 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c37dfb7-6f76-43d5-b699-320372d1c35b" (UID: "8c37dfb7-6f76-43d5-b699-320372d1c35b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.101646 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-config-data" (OuterVolumeSpecName: "config-data") pod "8c37dfb7-6f76-43d5-b699-320372d1c35b" (UID: "8c37dfb7-6f76-43d5-b699-320372d1c35b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.141711 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xdsr\" (UniqueName: \"kubernetes.io/projected/69a28134-11bc-411d-b294-87782bf28560-kube-api-access-8xdsr\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.141748 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.141760 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.145120 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69a28134-11bc-411d-b294-87782bf28560" (UID: "69a28134-11bc-411d-b294-87782bf28560"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.173814 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8c37dfb7-6f76-43d5-b699-320372d1c35b" (UID: "8c37dfb7-6f76-43d5-b699-320372d1c35b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.175665 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8c37dfb7-6f76-43d5-b699-320372d1c35b" (UID: "8c37dfb7-6f76-43d5-b699-320372d1c35b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.203104 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-config-data" (OuterVolumeSpecName: "config-data") pod "69a28134-11bc-411d-b294-87782bf28560" (UID: "69a28134-11bc-411d-b294-87782bf28560"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.227224 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "69a28134-11bc-411d-b294-87782bf28560" (UID: "69a28134-11bc-411d-b294-87782bf28560"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.244294 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.244337 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.244351 4747 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.244363 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a28134-11bc-411d-b294-87782bf28560-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.244373 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c37dfb7-6f76-43d5-b699-320372d1c35b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.244826 4747 scope.go:117] "RemoveContainer" containerID="2480e7522c51b23c99de4b385425e590f4aaadfe63180019d1de74e63234aaf2" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.436908 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.447351 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.457830 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.472648 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.494200 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 23:21:46 crc kubenswrapper[4747]: E1205 23:21:46.495363 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a28134-11bc-411d-b294-87782bf28560" containerName="nova-metadata-metadata" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.495391 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a28134-11bc-411d-b294-87782bf28560" containerName="nova-metadata-metadata" Dec 05 23:21:46 crc kubenswrapper[4747]: E1205 23:21:46.495451 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a28134-11bc-411d-b294-87782bf28560" containerName="nova-metadata-log" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.495461 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a28134-11bc-411d-b294-87782bf28560" containerName="nova-metadata-log" Dec 05 23:21:46 crc kubenswrapper[4747]: E1205 23:21:46.495505 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c37dfb7-6f76-43d5-b699-320372d1c35b" containerName="nova-api-log" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.495518 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c37dfb7-6f76-43d5-b699-320372d1c35b" containerName="nova-api-log" Dec 05 23:21:46 crc 
kubenswrapper[4747]: E1205 23:21:46.495551 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c37dfb7-6f76-43d5-b699-320372d1c35b" containerName="nova-api-api" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.495560 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c37dfb7-6f76-43d5-b699-320372d1c35b" containerName="nova-api-api" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.497071 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c37dfb7-6f76-43d5-b699-320372d1c35b" containerName="nova-api-api" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.497107 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="69a28134-11bc-411d-b294-87782bf28560" containerName="nova-metadata-log" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.502819 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c37dfb7-6f76-43d5-b699-320372d1c35b" containerName="nova-api-log" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.502887 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="69a28134-11bc-411d-b294-87782bf28560" containerName="nova-metadata-metadata" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.509607 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.513476 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.514246 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.517307 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.521621 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.544782 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.547839 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.552631 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.552942 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.567113 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.657214 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.657286 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd9d55c-b859-4c21-8a3f-cdfd09e33bd4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4\") " pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.657352 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-public-tls-certs\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.657386 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd9d55c-b859-4c21-8a3f-cdfd09e33bd4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4\") " pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.657431 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-internal-tls-certs\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.657507 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-logs\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.657692 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26jzb\" (UniqueName: \"kubernetes.io/projected/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-kube-api-access-26jzb\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.657792 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abd9d55c-b859-4c21-8a3f-cdfd09e33bd4-logs\") pod \"nova-metadata-0\" (UID: \"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4\") " 
pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.657871 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd9d55c-b859-4c21-8a3f-cdfd09e33bd4-config-data\") pod \"nova-metadata-0\" (UID: \"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4\") " pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.657990 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtp5c\" (UniqueName: \"kubernetes.io/projected/abd9d55c-b859-4c21-8a3f-cdfd09e33bd4-kube-api-access-gtp5c\") pod \"nova-metadata-0\" (UID: \"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4\") " pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.658075 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-config-data\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.759657 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-logs\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.760080 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26jzb\" (UniqueName: \"kubernetes.io/projected/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-kube-api-access-26jzb\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.760133 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-logs\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.760139 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abd9d55c-b859-4c21-8a3f-cdfd09e33bd4-logs\") pod \"nova-metadata-0\" (UID: \"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4\") " pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.760262 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd9d55c-b859-4c21-8a3f-cdfd09e33bd4-config-data\") pod \"nova-metadata-0\" (UID: \"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4\") " pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.760379 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtp5c\" (UniqueName: \"kubernetes.io/projected/abd9d55c-b859-4c21-8a3f-cdfd09e33bd4-kube-api-access-gtp5c\") pod \"nova-metadata-0\" (UID: \"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4\") " pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.760450 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-config-data\") pod \"nova-api-0\" (UID: 
\"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.760514 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.760571 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd9d55c-b859-4c21-8a3f-cdfd09e33bd4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4\") " pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.760655 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-public-tls-certs\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.760736 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd9d55c-b859-4c21-8a3f-cdfd09e33bd4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4\") " pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.760783 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abd9d55c-b859-4c21-8a3f-cdfd09e33bd4-logs\") pod \"nova-metadata-0\" (UID: \"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4\") " pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.760884 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-internal-tls-certs\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.764987 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-public-tls-certs\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.765179 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.766202 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd9d55c-b859-4c21-8a3f-cdfd09e33bd4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4\") " pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.767164 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd9d55c-b859-4c21-8a3f-cdfd09e33bd4-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4\") " pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.773188 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-internal-tls-certs\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.779955 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd9d55c-b859-4c21-8a3f-cdfd09e33bd4-config-data\") pod \"nova-metadata-0\" (UID: \"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4\") " pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.780273 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-config-data\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.780898 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtp5c\" (UniqueName: \"kubernetes.io/projected/abd9d55c-b859-4c21-8a3f-cdfd09e33bd4-kube-api-access-gtp5c\") pod \"nova-metadata-0\" (UID: \"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4\") " pod="openstack/nova-metadata-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.794025 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26jzb\" (UniqueName: \"kubernetes.io/projected/924d0e7d-e7b6-4a55-8a58-87ea4a01ec38-kube-api-access-26jzb\") pod \"nova-api-0\" (UID: \"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38\") " pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.847533 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:21:46 crc kubenswrapper[4747]: I1205 23:21:46.876410 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:21:47 crc kubenswrapper[4747]: I1205 23:21:47.129668 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ab9ee7da-b527-4208-a7c5-51a6b1a2db01","Type":"ContainerStarted","Data":"5211a2d6772628cc2009b2222be4502ff08bb332b5574666e1b2b88f94185d35"} Dec 05 23:21:47 crc kubenswrapper[4747]: I1205 23:21:47.155237 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"151ab70a-f3ce-4993-9c10-d0674b8208eb","Type":"ContainerStarted","Data":"f5487b0272902275522c3aa35d54907c494909c65d07df07031d6af0d579a490"} Dec 05 23:21:47 crc kubenswrapper[4747]: I1205 23:21:47.155402 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 05 23:21:47 crc kubenswrapper[4747]: I1205 23:21:47.163718 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.163683572 podStartE2EDuration="4.163683572s" podCreationTimestamp="2025-12-05 23:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:47.151106251 +0000 UTC m=+9577.618413739" watchObservedRunningTime="2025-12-05 23:21:47.163683572 +0000 UTC m=+9577.630991060" Dec 05 23:21:47 crc kubenswrapper[4747]: I1205 23:21:47.173632 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=4.173609187 podStartE2EDuration="4.173609187s" podCreationTimestamp="2025-12-05 23:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:47.170619963 +0000 UTC m=+9577.637927471" watchObservedRunningTime="2025-12-05 23:21:47.173609187 +0000 UTC m=+9577.640916675" Dec 05 23:21:47 crc kubenswrapper[4747]: W1205 23:21:47.427261 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod924d0e7d_e7b6_4a55_8a58_87ea4a01ec38.slice/crio-b1667496e166697cc05060aaa09fc12a86ed5e49f29effa8c615ee5a6beb4da5 WatchSource:0}: Error finding container b1667496e166697cc05060aaa09fc12a86ed5e49f29effa8c615ee5a6beb4da5: Status 404 returned error can't find the container with id b1667496e166697cc05060aaa09fc12a86ed5e49f29effa8c615ee5a6beb4da5 Dec 05 23:21:47 crc kubenswrapper[4747]: I1205 23:21:47.432653 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:21:47 crc kubenswrapper[4747]: I1205 23:21:47.510082 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:21:47 crc kubenswrapper[4747]: W1205 23:21:47.521989 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabd9d55c_b859_4c21_8a3f_cdfd09e33bd4.slice/crio-405b3fb1239a57ea55461db9b8b91f0e6b029c2fe9f7687dc5f0f737c8332a0a WatchSource:0}: Error finding container 405b3fb1239a57ea55461db9b8b91f0e6b029c2fe9f7687dc5f0f737c8332a0a: Status 404 returned error can't find the container with id 405b3fb1239a57ea55461db9b8b91f0e6b029c2fe9f7687dc5f0f737c8332a0a Dec 05 23:21:47 crc kubenswrapper[4747]: I1205 23:21:47.852476 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a28134-11bc-411d-b294-87782bf28560" 
path="/var/lib/kubelet/pods/69a28134-11bc-411d-b294-87782bf28560/volumes" Dec 05 23:21:47 crc kubenswrapper[4747]: I1205 23:21:47.853718 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c37dfb7-6f76-43d5-b699-320372d1c35b" path="/var/lib/kubelet/pods/8c37dfb7-6f76-43d5-b699-320372d1c35b/volumes" Dec 05 23:21:48 crc kubenswrapper[4747]: I1205 23:21:48.167449 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38","Type":"ContainerStarted","Data":"8f3c45bda2a66812ef3197fde4f1450fcf88d2cf44096c2b469a2a238170ef9b"} Dec 05 23:21:48 crc kubenswrapper[4747]: I1205 23:21:48.167504 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38","Type":"ContainerStarted","Data":"7c0700fa968bd342b18ee8a1199977e6f203c6fca152ab77670108a36f29dcf7"} Dec 05 23:21:48 crc kubenswrapper[4747]: I1205 23:21:48.167519 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"924d0e7d-e7b6-4a55-8a58-87ea4a01ec38","Type":"ContainerStarted","Data":"b1667496e166697cc05060aaa09fc12a86ed5e49f29effa8c615ee5a6beb4da5"} Dec 05 23:21:48 crc kubenswrapper[4747]: I1205 23:21:48.172245 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4","Type":"ContainerStarted","Data":"d6f7be6d799f884232cfdeae4aad843afd9a322f47b54ce441d3e6574aba68ae"} Dec 05 23:21:48 crc kubenswrapper[4747]: I1205 23:21:48.172287 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4","Type":"ContainerStarted","Data":"be2207e45fc5dc16c53d1747fdf7533a080985742ac1c5cff432392ebcc9e4c8"} Dec 05 23:21:48 crc kubenswrapper[4747]: I1205 23:21:48.172300 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"abd9d55c-b859-4c21-8a3f-cdfd09e33bd4","Type":"ContainerStarted","Data":"405b3fb1239a57ea55461db9b8b91f0e6b029c2fe9f7687dc5f0f737c8332a0a"} Dec 05 23:21:48 crc kubenswrapper[4747]: I1205 23:21:48.192999 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.192977013 podStartE2EDuration="2.192977013s" podCreationTimestamp="2025-12-05 23:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:48.188460831 +0000 UTC m=+9578.655768329" watchObservedRunningTime="2025-12-05 23:21:48.192977013 +0000 UTC m=+9578.660284501" Dec 05 23:21:48 crc kubenswrapper[4747]: I1205 23:21:48.214560 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.214536225 podStartE2EDuration="2.214536225s" podCreationTimestamp="2025-12-05 23:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:48.209594433 +0000 UTC m=+9578.676901921" watchObservedRunningTime="2025-12-05 23:21:48.214536225 +0000 UTC m=+9578.681843713" Dec 05 23:21:49 crc kubenswrapper[4747]: I1205 23:21:49.221986 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 23:21:51 crc kubenswrapper[4747]: I1205 23:21:51.876772 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Dec 05 23:21:51 crc kubenswrapper[4747]: I1205 23:21:51.877331 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 23:21:52 crc kubenswrapper[4747]: I1205 23:21:52.483701 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 05 23:21:54 crc kubenswrapper[4747]: I1205 23:21:54.221934 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 23:21:54 crc kubenswrapper[4747]: I1205 23:21:54.264103 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 05 23:21:54 crc kubenswrapper[4747]: I1205 23:21:54.264714 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 23:21:54 crc kubenswrapper[4747]: I1205 23:21:54.319366 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 23:21:56 crc kubenswrapper[4747]: I1205 23:21:56.848670 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 23:21:56 crc kubenswrapper[4747]: I1205 23:21:56.849029 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 23:21:56 crc kubenswrapper[4747]: I1205 23:21:56.877229 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 23:21:56 crc kubenswrapper[4747]: I1205 23:21:56.877325 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 23:21:57 crc kubenswrapper[4747]: I1205 23:21:57.865182 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="924d0e7d-e7b6-4a55-8a58-87ea4a01ec38" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 23:21:57 crc kubenswrapper[4747]: I1205 23:21:57.865193 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="924d0e7d-e7b6-4a55-8a58-87ea4a01ec38" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 23:21:57 crc kubenswrapper[4747]: I1205 23:21:57.891748 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="abd9d55c-b859-4c21-8a3f-cdfd09e33bd4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 23:21:57 crc kubenswrapper[4747]: I1205 23:21:57.892276 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="abd9d55c-b859-4c21-8a3f-cdfd09e33bd4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 23:22:06 crc kubenswrapper[4747]: I1205 23:22:06.222573 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Dec 05 23:22:06 crc kubenswrapper[4747]: I1205 23:22:06.222573 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 23:22:06 crc kubenswrapper[4747]: I1205 23:22:06.223033 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 23:22:06 crc kubenswrapper[4747]: I1205 23:22:06.859341 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 05 23:22:06 crc kubenswrapper[4747]: I1205 23:22:06.860211 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 05 23:22:06 crc kubenswrapper[4747]: I1205 23:22:06.867245 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 05 23:22:06 crc kubenswrapper[4747]: I1205 23:22:06.877128 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 05 23:22:06 crc kubenswrapper[4747]: I1205 23:22:06.895411 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 05 23:22:06 crc kubenswrapper[4747]: I1205 23:22:06.903360 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 05 23:22:06 crc kubenswrapper[4747]: I1205 23:22:06.904066 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 05 23:22:07 crc kubenswrapper[4747]: I1205 23:22:07.388285 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 05 23:22:07 crc kubenswrapper[4747]: I1205 23:22:07.393665 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 05 23:22:07 crc kubenswrapper[4747]: I1205 23:22:07.394337 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.653881 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m"]
Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.655780 4747 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.658054 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.658363 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.658463 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dwdzw" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.659501 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.661759 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.661759 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.662718 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.671773 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m"] Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.722504 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.722561 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.722610 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.722722 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88sr2\" (UniqueName: \"kubernetes.io/projected/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-kube-api-access-88sr2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.722771 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.722815 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.722882 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.722974 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.723036 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.825087 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.825149 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.825171 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.825234 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88sr2\" (UniqueName: \"kubernetes.io/projected/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-kube-api-access-88sr2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.825260 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.825298 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.825346 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.825400 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.825471 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:08 crc kubenswrapper[4747]: I1205 23:22:08.826310 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:09 crc kubenswrapper[4747]: I1205 23:22:09.482316 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:09 crc kubenswrapper[4747]: I1205 23:22:09.483349 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:09 crc kubenswrapper[4747]: I1205 23:22:09.483366 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:09 crc kubenswrapper[4747]: I1205 23:22:09.483474 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:09 crc kubenswrapper[4747]: I1205 23:22:09.483962 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:09 crc kubenswrapper[4747]: I1205 23:22:09.489496 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88sr2\" (UniqueName: \"kubernetes.io/projected/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-kube-api-access-88sr2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:09 crc kubenswrapper[4747]: I1205 23:22:09.490310 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:09 crc kubenswrapper[4747]: I1205 23:22:09.490885 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:09 crc kubenswrapper[4747]: I1205 23:22:09.587014 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:22:10 crc kubenswrapper[4747]: I1205 23:22:10.133299 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m"] Dec 05 23:22:10 crc kubenswrapper[4747]: I1205 23:22:10.150935 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 23:22:10 crc kubenswrapper[4747]: I1205 23:22:10.413262 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" event={"ID":"29780f9c-1925-4bc3-9a1d-24f9723dbcb9","Type":"ContainerStarted","Data":"511be5b0d346afb6ddc327e0b44155902dec094e38023d1066ca608f3c15a943"} Dec 05 23:22:10 crc kubenswrapper[4747]: I1205 23:22:10.632785 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:22:11 crc kubenswrapper[4747]: I1205 23:22:11.428919 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" event={"ID":"29780f9c-1925-4bc3-9a1d-24f9723dbcb9","Type":"ContainerStarted","Data":"8405bb1a38771702cdee21317f26972b041a3c71ad07466ca9037a3cc41b7da9"} Dec 05 23:22:11 crc kubenswrapper[4747]: I1205 23:22:11.456823 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" podStartSLOduration=2.977938435 podStartE2EDuration="3.456797996s" podCreationTimestamp="2025-12-05 23:22:08 +0000 UTC" firstStartedPulling="2025-12-05 23:22:10.15044088 +0000 UTC m=+9600.617748378" lastFinishedPulling="2025-12-05 23:22:10.629300451 +0000 UTC m=+9601.096607939" observedRunningTime="2025-12-05 23:22:11.444449011 +0000 UTC m=+9601.911756509" watchObservedRunningTime="2025-12-05 23:22:11.456797996 +0000 UTC m=+9601.924105494" Dec 05 23:22:36 crc kubenswrapper[4747]: I1205 23:22:36.222306 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:22:36 crc kubenswrapper[4747]: I1205 23:22:36.223743 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:22:39 crc kubenswrapper[4747]: I1205 23:22:39.622688 4747 scope.go:117] "RemoveContainer" containerID="e8b9709959f131e7489b8cd63e9a52cd8973922d616ca8018fa61fe10fefa266" Dec 05 23:22:39 crc kubenswrapper[4747]: I1205 23:22:39.654685 4747 scope.go:117] "RemoveContainer" containerID="3213ee748a4ef5c991530470015f6722cab8f4de8d1ad6487e1358fe41b171f0" Dec 05 23:22:44 crc kubenswrapper[4747]: 
Dec 05 23:22:44 crc kubenswrapper[4747]: I1205 23:22:44.805550 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2lzjr"]
Dec 05 23:22:44 crc kubenswrapper[4747]: I1205 23:22:44.816894 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2lzjr"
Dec 05 23:22:44 crc kubenswrapper[4747]: I1205 23:22:44.824921 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2lzjr"]
Dec 05 23:22:44 crc kubenswrapper[4747]: I1205 23:22:44.881689 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86691a73-82d9-4cca-a54f-d81201462a76-utilities\") pod \"certified-operators-2lzjr\" (UID: \"86691a73-82d9-4cca-a54f-d81201462a76\") " pod="openshift-marketplace/certified-operators-2lzjr"
Dec 05 23:22:44 crc kubenswrapper[4747]: I1205 23:22:44.882140 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtbgw\" (UniqueName: \"kubernetes.io/projected/86691a73-82d9-4cca-a54f-d81201462a76-kube-api-access-dtbgw\") pod \"certified-operators-2lzjr\" (UID: \"86691a73-82d9-4cca-a54f-d81201462a76\") " pod="openshift-marketplace/certified-operators-2lzjr"
Dec 05 23:22:44 crc kubenswrapper[4747]: I1205 23:22:44.882208 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86691a73-82d9-4cca-a54f-d81201462a76-catalog-content\") pod \"certified-operators-2lzjr\" (UID: \"86691a73-82d9-4cca-a54f-d81201462a76\") " pod="openshift-marketplace/certified-operators-2lzjr"
Dec 05 23:22:44 crc kubenswrapper[4747]: I1205 23:22:44.985028 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86691a73-82d9-4cca-a54f-d81201462a76-utilities\") pod \"certified-operators-2lzjr\" (UID: \"86691a73-82d9-4cca-a54f-d81201462a76\") " pod="openshift-marketplace/certified-operators-2lzjr"
Dec 05 23:22:44 crc kubenswrapper[4747]: I1205 23:22:44.985102 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtbgw\" (UniqueName: \"kubernetes.io/projected/86691a73-82d9-4cca-a54f-d81201462a76-kube-api-access-dtbgw\") pod \"certified-operators-2lzjr\" (UID: \"86691a73-82d9-4cca-a54f-d81201462a76\") " pod="openshift-marketplace/certified-operators-2lzjr"
Dec 05 23:22:44 crc kubenswrapper[4747]: I1205 23:22:44.985124 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86691a73-82d9-4cca-a54f-d81201462a76-catalog-content\") pod \"certified-operators-2lzjr\" (UID: \"86691a73-82d9-4cca-a54f-d81201462a76\") " pod="openshift-marketplace/certified-operators-2lzjr"
Dec 05 23:22:44 crc kubenswrapper[4747]: I1205 23:22:44.986020 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86691a73-82d9-4cca-a54f-d81201462a76-utilities\") pod \"certified-operators-2lzjr\" (UID: \"86691a73-82d9-4cca-a54f-d81201462a76\") " pod="openshift-marketplace/certified-operators-2lzjr"
Dec 05 23:22:44 crc kubenswrapper[4747]: I1205 23:22:44.986726 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName:
\"kubernetes.io/empty-dir/86691a73-82d9-4cca-a54f-d81201462a76-catalog-content\") pod \"certified-operators-2lzjr\" (UID: \"86691a73-82d9-4cca-a54f-d81201462a76\") " pod="openshift-marketplace/certified-operators-2lzjr" Dec 05 23:22:45 crc kubenswrapper[4747]: I1205 23:22:45.382227 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtbgw\" (UniqueName: \"kubernetes.io/projected/86691a73-82d9-4cca-a54f-d81201462a76-kube-api-access-dtbgw\") pod \"certified-operators-2lzjr\" (UID: \"86691a73-82d9-4cca-a54f-d81201462a76\") " pod="openshift-marketplace/certified-operators-2lzjr" Dec 05 23:22:45 crc kubenswrapper[4747]: I1205 23:22:45.460899 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2lzjr" Dec 05 23:22:46 crc kubenswrapper[4747]: I1205 23:22:46.055406 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2lzjr"] Dec 05 23:22:46 crc kubenswrapper[4747]: I1205 23:22:46.818882 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lzjr" event={"ID":"86691a73-82d9-4cca-a54f-d81201462a76","Type":"ContainerStarted","Data":"27330e2ca5b0b935f313f87c895c7120bfe26110906cf4b645d1a32a19e1c193"} Dec 05 23:22:49 crc kubenswrapper[4747]: I1205 23:22:49.856146 4747 generic.go:334] "Generic (PLEG): container finished" podID="86691a73-82d9-4cca-a54f-d81201462a76" containerID="0f3d81db0f20783d10d2ed487aa1e8e12002c8a7fcc32467dfde6ba4682bdd6f" exitCode=0 Dec 05 23:22:49 crc kubenswrapper[4747]: I1205 23:22:49.856717 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lzjr" event={"ID":"86691a73-82d9-4cca-a54f-d81201462a76","Type":"ContainerDied","Data":"0f3d81db0f20783d10d2ed487aa1e8e12002c8a7fcc32467dfde6ba4682bdd6f"} Dec 05 23:22:56 crc kubenswrapper[4747]: I1205 23:22:56.927691 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lzjr" event={"ID":"86691a73-82d9-4cca-a54f-d81201462a76","Type":"ContainerStarted","Data":"a5e453c1cf05badff7bb6b5105e9f14ac9aa2624d1a3850b5465827fc8adb84e"} Dec 05 23:22:58 crc kubenswrapper[4747]: I1205 23:22:58.948525 4747 generic.go:334] "Generic (PLEG): container finished" podID="86691a73-82d9-4cca-a54f-d81201462a76" containerID="a5e453c1cf05badff7bb6b5105e9f14ac9aa2624d1a3850b5465827fc8adb84e" exitCode=0 Dec 05 23:22:58 crc kubenswrapper[4747]: I1205 23:22:58.948624 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lzjr" event={"ID":"86691a73-82d9-4cca-a54f-d81201462a76","Type":"ContainerDied","Data":"a5e453c1cf05badff7bb6b5105e9f14ac9aa2624d1a3850b5465827fc8adb84e"} Dec 05 23:22:59 crc kubenswrapper[4747]: I1205 23:22:59.961086 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lzjr" event={"ID":"86691a73-82d9-4cca-a54f-d81201462a76","Type":"ContainerStarted","Data":"4c7aeb2514830e0de2b1d508888a8529225b6d8a55af1e754ecd5a2d9ce0a2bd"} Dec 05 23:22:59 crc kubenswrapper[4747]: I1205 23:22:59.988570 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2lzjr" podStartSLOduration=6.228025969 podStartE2EDuration="15.98855224s" podCreationTimestamp="2025-12-05 23:22:44 +0000 UTC" firstStartedPulling="2025-12-05 23:22:49.8597563 +0000 UTC m=+9640.327063788" lastFinishedPulling="2025-12-05 23:22:59.620282571 
+0000 UTC m=+9650.087590059" observedRunningTime="2025-12-05 23:22:59.980815929 +0000 UTC m=+9650.448123417" watchObservedRunningTime="2025-12-05 23:22:59.98855224 +0000 UTC m=+9650.455859728"
Dec 05 23:23:05 crc kubenswrapper[4747]: I1205 23:23:05.461864 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2lzjr"
Dec 05 23:23:05 crc kubenswrapper[4747]: I1205 23:23:05.462502 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2lzjr"
Dec 05 23:23:05 crc kubenswrapper[4747]: I1205 23:23:05.518551 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2lzjr"
Dec 05 23:23:06 crc kubenswrapper[4747]: I1205 23:23:06.222701 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 23:23:06 crc kubenswrapper[4747]: I1205 23:23:06.223111 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 23:23:06 crc kubenswrapper[4747]: I1205 23:23:06.223202 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw"
Dec 05 23:23:06 crc kubenswrapper[4747]: I1205 23:23:06.224449 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"70f1a4978d91903a09990a1a3faa7c6b55da64f90b48d275073c095c42aaf7bd"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 23:23:06 crc kubenswrapper[4747]: I1205 23:23:06.224565 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://70f1a4978d91903a09990a1a3faa7c6b55da64f90b48d275073c095c42aaf7bd" gracePeriod=600
Dec 05 23:23:06 crc kubenswrapper[4747]: I1205 23:23:06.847886 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2lzjr"
Dec 05 23:23:06 crc kubenswrapper[4747]: I1205 23:23:06.912703 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2lzjr"]
Dec 05 23:23:08 crc kubenswrapper[4747]: I1205 23:23:08.037791 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="70f1a4978d91903a09990a1a3faa7c6b55da64f90b48d275073c095c42aaf7bd" exitCode=0
Dec 05 23:23:08 crc kubenswrapper[4747]: I1205 23:23:08.037856 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"70f1a4978d91903a09990a1a3faa7c6b55da64f90b48d275073c095c42aaf7bd"}
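
Editor's note: the sequence above is a complete liveness-probe restart. patch_prober.go and prober.go record the failed GET against 127.0.0.1:8798, the probe worker flips to "unhealthy", kuberuntime_manager.go announces "failed liveness probe, will be restarted", kuberuntime_container.go kills cri-o://70f1a497... with gracePeriod=600, and the PLEG reports the container finished with exitCode=0 (a clean exit on SIGTERM). The replacement container (5bc6c3a4..., started at 23:23:10 below) closes the loop. A rough sketch for flagging such restarts from PLEG events, under the same one-record-per-line assumption as above:

```python
#!/usr/bin/env python3
"""Flag container restarts: a PLEG ContainerDied followed by a
ContainerStarted with a different ID for the same pod, from stdin."""
import re
import sys

PLEG = re.compile(
    r'"SyncLoop \(PLEG\): event for pod" pod="(?P<pod>[^"]+)" '
    r'event=\{"ID":"[^"]+","Type":"(?P<type>\w+)","Data":"(?P<cid>[0-9a-f]{64})"\}')

died = {}  # pod name -> container ID that most recently died
for line in sys.stdin:
    m = PLEG.search(line)
    if not m:
        continue
    if m["type"] == "ContainerDied":
        died[m["pod"]] = m["cid"]
    elif m["type"] == "ContainerStarted" and m["pod"] in died:
        old = died.pop(m["pod"])
        if old != m["cid"]:
            print(f'{m["pod"]}: restarted {old[:13]} -> {m["cid"][:13]}')
```

PLEG "ContainerStarted" events also fire for pod sandboxes, so treat hits as leads rather than proof.
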
Dec 05 23:23:08 crc kubenswrapper[4747]: I1205 23:23:08.038109 4747 scope.go:117] "RemoveContainer" containerID="4bc7c29a603addd9a3720f98a66996d32504492324f70ded77dda0f64f7cf72c"
Dec 05 23:23:08 crc kubenswrapper[4747]: I1205 23:23:08.038282 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2lzjr" podUID="86691a73-82d9-4cca-a54f-d81201462a76" containerName="registry-server" containerID="cri-o://4c7aeb2514830e0de2b1d508888a8529225b6d8a55af1e754ecd5a2d9ce0a2bd" gracePeriod=2
Dec 05 23:23:09 crc kubenswrapper[4747]: I1205 23:23:09.051511 4747 generic.go:334] "Generic (PLEG): container finished" podID="86691a73-82d9-4cca-a54f-d81201462a76" containerID="4c7aeb2514830e0de2b1d508888a8529225b6d8a55af1e754ecd5a2d9ce0a2bd" exitCode=0
Dec 05 23:23:09 crc kubenswrapper[4747]: I1205 23:23:09.051576 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lzjr" event={"ID":"86691a73-82d9-4cca-a54f-d81201462a76","Type":"ContainerDied","Data":"4c7aeb2514830e0de2b1d508888a8529225b6d8a55af1e754ecd5a2d9ce0a2bd"}
Dec 05 23:23:09 crc kubenswrapper[4747]: I1205 23:23:09.230320 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2lzjr"
Dec 05 23:23:09 crc kubenswrapper[4747]: I1205 23:23:09.314251 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtbgw\" (UniqueName: \"kubernetes.io/projected/86691a73-82d9-4cca-a54f-d81201462a76-kube-api-access-dtbgw\") pod \"86691a73-82d9-4cca-a54f-d81201462a76\" (UID: \"86691a73-82d9-4cca-a54f-d81201462a76\") "
Dec 05 23:23:09 crc kubenswrapper[4747]: I1205 23:23:09.314401 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86691a73-82d9-4cca-a54f-d81201462a76-catalog-content\") pod \"86691a73-82d9-4cca-a54f-d81201462a76\" (UID: \"86691a73-82d9-4cca-a54f-d81201462a76\") "
Dec 05 23:23:09 crc kubenswrapper[4747]: I1205 23:23:09.314850 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86691a73-82d9-4cca-a54f-d81201462a76-utilities\") pod \"86691a73-82d9-4cca-a54f-d81201462a76\" (UID: \"86691a73-82d9-4cca-a54f-d81201462a76\") "
Dec 05 23:23:09 crc kubenswrapper[4747]: I1205 23:23:09.316297 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86691a73-82d9-4cca-a54f-d81201462a76-utilities" (OuterVolumeSpecName: "utilities") pod "86691a73-82d9-4cca-a54f-d81201462a76" (UID: "86691a73-82d9-4cca-a54f-d81201462a76"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 23:23:09 crc kubenswrapper[4747]: I1205 23:23:09.322098 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86691a73-82d9-4cca-a54f-d81201462a76-kube-api-access-dtbgw" (OuterVolumeSpecName: "kube-api-access-dtbgw") pod "86691a73-82d9-4cca-a54f-d81201462a76" (UID: "86691a73-82d9-4cca-a54f-d81201462a76"). InnerVolumeSpecName "kube-api-access-dtbgw".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:23:09 crc kubenswrapper[4747]: I1205 23:23:09.364506 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86691a73-82d9-4cca-a54f-d81201462a76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86691a73-82d9-4cca-a54f-d81201462a76" (UID: "86691a73-82d9-4cca-a54f-d81201462a76"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:23:09 crc kubenswrapper[4747]: I1205 23:23:09.418192 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86691a73-82d9-4cca-a54f-d81201462a76-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:09 crc kubenswrapper[4747]: I1205 23:23:09.418234 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtbgw\" (UniqueName: \"kubernetes.io/projected/86691a73-82d9-4cca-a54f-d81201462a76-kube-api-access-dtbgw\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:09 crc kubenswrapper[4747]: I1205 23:23:09.418243 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86691a73-82d9-4cca-a54f-d81201462a76-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:10 crc kubenswrapper[4747]: I1205 23:23:10.064517 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"} Dec 05 23:23:10 crc kubenswrapper[4747]: I1205 23:23:10.067355 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lzjr" event={"ID":"86691a73-82d9-4cca-a54f-d81201462a76","Type":"ContainerDied","Data":"27330e2ca5b0b935f313f87c895c7120bfe26110906cf4b645d1a32a19e1c193"} Dec 05 23:23:10 crc kubenswrapper[4747]: I1205 23:23:10.067399 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2lzjr" Dec 05 23:23:10 crc kubenswrapper[4747]: I1205 23:23:10.067406 4747 scope.go:117] "RemoveContainer" containerID="4c7aeb2514830e0de2b1d508888a8529225b6d8a55af1e754ecd5a2d9ce0a2bd" Dec 05 23:23:10 crc kubenswrapper[4747]: I1205 23:23:10.087734 4747 scope.go:117] "RemoveContainer" containerID="a5e453c1cf05badff7bb6b5105e9f14ac9aa2624d1a3850b5465827fc8adb84e" Dec 05 23:23:10 crc kubenswrapper[4747]: I1205 23:23:10.116927 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2lzjr"] Dec 05 23:23:10 crc kubenswrapper[4747]: I1205 23:23:10.125382 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2lzjr"] Dec 05 23:23:10 crc kubenswrapper[4747]: I1205 23:23:10.137440 4747 scope.go:117] "RemoveContainer" containerID="0f3d81db0f20783d10d2ed487aa1e8e12002c8a7fcc32467dfde6ba4682bdd6f" Dec 05 23:23:11 crc kubenswrapper[4747]: I1205 23:23:11.862661 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86691a73-82d9-4cca-a54f-d81201462a76" path="/var/lib/kubelet/pods/86691a73-82d9-4cca-a54f-d81201462a76/volumes" Dec 05 23:23:15 crc kubenswrapper[4747]: E1205 23:23:15.946066 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86691a73_82d9_4cca_a54f_d81201462a76.slice\": RecentStats: unable to find data in memory cache]" Dec 05 23:23:26 crc kubenswrapper[4747]: E1205 23:23:26.204101 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86691a73_82d9_4cca_a54f_d81201462a76.slice\": RecentStats: unable to find data in memory cache]" Dec 05 23:23:36 crc kubenswrapper[4747]: E1205 23:23:36.482568 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86691a73_82d9_4cca_a54f_d81201462a76.slice\": RecentStats: unable to find data in memory cache]" Dec 05 23:23:46 crc kubenswrapper[4747]: E1205 23:23:46.747302 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86691a73_82d9_4cca_a54f_d81201462a76.slice\": RecentStats: unable to find data in memory cache]" Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.673431 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tv74m"] Dec 05 23:23:56 crc kubenswrapper[4747]: E1205 23:23:56.674305 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86691a73-82d9-4cca-a54f-d81201462a76" containerName="extract-utilities" Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.674318 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="86691a73-82d9-4cca-a54f-d81201462a76" containerName="extract-utilities" Dec 05 23:23:56 crc kubenswrapper[4747]: E1205 23:23:56.674351 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86691a73-82d9-4cca-a54f-d81201462a76" containerName="extract-content" Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.674357 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="86691a73-82d9-4cca-a54f-d81201462a76" containerName="extract-content" Dec 05 23:23:56 crc 
kubenswrapper[4747]: E1205 23:23:56.674377 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86691a73-82d9-4cca-a54f-d81201462a76" containerName="registry-server" Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.674384 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="86691a73-82d9-4cca-a54f-d81201462a76" containerName="registry-server" Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.674569 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="86691a73-82d9-4cca-a54f-d81201462a76" containerName="registry-server" Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.676124 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.707574 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tv74m"] Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.854390 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62fce55e-4d10-410d-839a-ac301e5d1190-catalog-content\") pod \"redhat-operators-tv74m\" (UID: \"62fce55e-4d10-410d-839a-ac301e5d1190\") " pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.854515 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ffm8\" (UniqueName: \"kubernetes.io/projected/62fce55e-4d10-410d-839a-ac301e5d1190-kube-api-access-2ffm8\") pod \"redhat-operators-tv74m\" (UID: \"62fce55e-4d10-410d-839a-ac301e5d1190\") " pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.854597 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62fce55e-4d10-410d-839a-ac301e5d1190-utilities\") pod \"redhat-operators-tv74m\" (UID: \"62fce55e-4d10-410d-839a-ac301e5d1190\") " pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.956053 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62fce55e-4d10-410d-839a-ac301e5d1190-catalog-content\") pod \"redhat-operators-tv74m\" (UID: \"62fce55e-4d10-410d-839a-ac301e5d1190\") " pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.956210 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ffm8\" (UniqueName: \"kubernetes.io/projected/62fce55e-4d10-410d-839a-ac301e5d1190-kube-api-access-2ffm8\") pod \"redhat-operators-tv74m\" (UID: \"62fce55e-4d10-410d-839a-ac301e5d1190\") " pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.956268 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62fce55e-4d10-410d-839a-ac301e5d1190-utilities\") pod \"redhat-operators-tv74m\" (UID: \"62fce55e-4d10-410d-839a-ac301e5d1190\") " pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.956550 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/62fce55e-4d10-410d-839a-ac301e5d1190-catalog-content\") pod \"redhat-operators-tv74m\" (UID: \"62fce55e-4d10-410d-839a-ac301e5d1190\") " pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.957538 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62fce55e-4d10-410d-839a-ac301e5d1190-utilities\") pod \"redhat-operators-tv74m\" (UID: \"62fce55e-4d10-410d-839a-ac301e5d1190\") " pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:23:56 crc kubenswrapper[4747]: I1205 23:23:56.982833 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ffm8\" (UniqueName: \"kubernetes.io/projected/62fce55e-4d10-410d-839a-ac301e5d1190-kube-api-access-2ffm8\") pod \"redhat-operators-tv74m\" (UID: \"62fce55e-4d10-410d-839a-ac301e5d1190\") " pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:23:56 crc kubenswrapper[4747]: E1205 23:23:56.999715 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86691a73_82d9_4cca_a54f_d81201462a76.slice\": RecentStats: unable to find data in memory cache]" Dec 05 23:23:57 crc kubenswrapper[4747]: I1205 23:23:57.010909 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:23:57 crc kubenswrapper[4747]: I1205 23:23:57.507357 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tv74m"] Dec 05 23:23:58 crc kubenswrapper[4747]: I1205 23:23:58.153276 4747 generic.go:334] "Generic (PLEG): container finished" podID="62fce55e-4d10-410d-839a-ac301e5d1190" containerID="8b703f129343b5e14672a78c595d318f486d6503695107c48d5b8316a209f483" exitCode=0 Dec 05 23:23:58 crc kubenswrapper[4747]: I1205 23:23:58.153573 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tv74m" event={"ID":"62fce55e-4d10-410d-839a-ac301e5d1190","Type":"ContainerDied","Data":"8b703f129343b5e14672a78c595d318f486d6503695107c48d5b8316a209f483"} Dec 05 23:23:58 crc kubenswrapper[4747]: I1205 23:23:58.153620 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tv74m" event={"ID":"62fce55e-4d10-410d-839a-ac301e5d1190","Type":"ContainerStarted","Data":"51df3bf720c4a3d73ac14e4345a001524deebb69b8dfab2a42a8047ec49398fe"} Dec 05 23:24:00 crc kubenswrapper[4747]: I1205 23:24:00.183531 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tv74m" event={"ID":"62fce55e-4d10-410d-839a-ac301e5d1190","Type":"ContainerStarted","Data":"9bf5921f05a0d37a8ac1b202cf41814192255747a0dde36019da6a4d9b1d32fe"} Dec 05 23:24:07 crc kubenswrapper[4747]: I1205 23:24:07.261830 4747 generic.go:334] "Generic (PLEG): container finished" podID="62fce55e-4d10-410d-839a-ac301e5d1190" containerID="9bf5921f05a0d37a8ac1b202cf41814192255747a0dde36019da6a4d9b1d32fe" exitCode=0 Dec 05 23:24:07 crc kubenswrapper[4747]: I1205 23:24:07.261890 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tv74m" event={"ID":"62fce55e-4d10-410d-839a-ac301e5d1190","Type":"ContainerDied","Data":"9bf5921f05a0d37a8ac1b202cf41814192255747a0dde36019da6a4d9b1d32fe"} Dec 05 23:24:07 crc kubenswrapper[4747]: E1205 23:24:07.285472 4747 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86691a73_82d9_4cca_a54f_d81201462a76.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62fce55e_4d10_410d_839a_ac301e5d1190.slice/crio-9bf5921f05a0d37a8ac1b202cf41814192255747a0dde36019da6a4d9b1d32fe.scope\": RecentStats: unable to find data in memory cache]" Dec 05 23:24:08 crc kubenswrapper[4747]: I1205 23:24:08.272465 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tv74m" event={"ID":"62fce55e-4d10-410d-839a-ac301e5d1190","Type":"ContainerStarted","Data":"b91438c8075f01fbdc30fab0833a0c968f2a1d1001d269493ca13f1eab9d9cf4"} Dec 05 23:24:08 crc kubenswrapper[4747]: I1205 23:24:08.292025 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tv74m" podStartSLOduration=2.7726360960000003 podStartE2EDuration="12.29200674s" podCreationTimestamp="2025-12-05 23:23:56 +0000 UTC" firstStartedPulling="2025-12-05 23:23:58.156487115 +0000 UTC m=+9708.623794603" lastFinishedPulling="2025-12-05 23:24:07.675857739 +0000 UTC m=+9718.143165247" observedRunningTime="2025-12-05 23:24:08.288613976 +0000 UTC m=+9718.755921464" watchObservedRunningTime="2025-12-05 23:24:08.29200674 +0000 UTC m=+9718.759314228" Dec 05 23:24:17 crc kubenswrapper[4747]: I1205 23:24:17.012051 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:24:17 crc kubenswrapper[4747]: I1205 23:24:17.012731 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:24:17 crc kubenswrapper[4747]: I1205 23:24:17.071949 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:24:17 crc kubenswrapper[4747]: I1205 23:24:17.457833 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:24:17 crc kubenswrapper[4747]: I1205 23:24:17.511893 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tv74m"] Dec 05 23:24:19 crc kubenswrapper[4747]: I1205 23:24:19.403743 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tv74m" podUID="62fce55e-4d10-410d-839a-ac301e5d1190" containerName="registry-server" containerID="cri-o://b91438c8075f01fbdc30fab0833a0c968f2a1d1001d269493ca13f1eab9d9cf4" gracePeriod=2 Dec 05 23:24:20 crc kubenswrapper[4747]: I1205 23:24:20.417425 4747 generic.go:334] "Generic (PLEG): container finished" podID="62fce55e-4d10-410d-839a-ac301e5d1190" containerID="b91438c8075f01fbdc30fab0833a0c968f2a1d1001d269493ca13f1eab9d9cf4" exitCode=0 Dec 05 23:24:20 crc kubenswrapper[4747]: I1205 23:24:20.417513 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tv74m" event={"ID":"62fce55e-4d10-410d-839a-ac301e5d1190","Type":"ContainerDied","Data":"b91438c8075f01fbdc30fab0833a0c968f2a1d1001d269493ca13f1eab9d9cf4"} Dec 05 23:24:20 crc kubenswrapper[4747]: I1205 23:24:20.721911 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:24:20 crc kubenswrapper[4747]: I1205 23:24:20.838987 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62fce55e-4d10-410d-839a-ac301e5d1190-catalog-content\") pod \"62fce55e-4d10-410d-839a-ac301e5d1190\" (UID: \"62fce55e-4d10-410d-839a-ac301e5d1190\") " Dec 05 23:24:20 crc kubenswrapper[4747]: I1205 23:24:20.839040 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62fce55e-4d10-410d-839a-ac301e5d1190-utilities\") pod \"62fce55e-4d10-410d-839a-ac301e5d1190\" (UID: \"62fce55e-4d10-410d-839a-ac301e5d1190\") " Dec 05 23:24:20 crc kubenswrapper[4747]: I1205 23:24:20.839064 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ffm8\" (UniqueName: \"kubernetes.io/projected/62fce55e-4d10-410d-839a-ac301e5d1190-kube-api-access-2ffm8\") pod \"62fce55e-4d10-410d-839a-ac301e5d1190\" (UID: \"62fce55e-4d10-410d-839a-ac301e5d1190\") " Dec 05 23:24:20 crc kubenswrapper[4747]: I1205 23:24:20.841025 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62fce55e-4d10-410d-839a-ac301e5d1190-utilities" (OuterVolumeSpecName: "utilities") pod "62fce55e-4d10-410d-839a-ac301e5d1190" (UID: "62fce55e-4d10-410d-839a-ac301e5d1190"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:24:20 crc kubenswrapper[4747]: I1205 23:24:20.842189 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62fce55e-4d10-410d-839a-ac301e5d1190-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:20 crc kubenswrapper[4747]: I1205 23:24:20.845528 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62fce55e-4d10-410d-839a-ac301e5d1190-kube-api-access-2ffm8" (OuterVolumeSpecName: "kube-api-access-2ffm8") pod "62fce55e-4d10-410d-839a-ac301e5d1190" (UID: "62fce55e-4d10-410d-839a-ac301e5d1190"). InnerVolumeSpecName "kube-api-access-2ffm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:24:20 crc kubenswrapper[4747]: I1205 23:24:20.945919 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ffm8\" (UniqueName: \"kubernetes.io/projected/62fce55e-4d10-410d-839a-ac301e5d1190-kube-api-access-2ffm8\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:20 crc kubenswrapper[4747]: I1205 23:24:20.953024 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62fce55e-4d10-410d-839a-ac301e5d1190-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62fce55e-4d10-410d-839a-ac301e5d1190" (UID: "62fce55e-4d10-410d-839a-ac301e5d1190"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:24:21 crc kubenswrapper[4747]: I1205 23:24:21.048092 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62fce55e-4d10-410d-839a-ac301e5d1190-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:21 crc kubenswrapper[4747]: I1205 23:24:21.431851 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tv74m" event={"ID":"62fce55e-4d10-410d-839a-ac301e5d1190","Type":"ContainerDied","Data":"51df3bf720c4a3d73ac14e4345a001524deebb69b8dfab2a42a8047ec49398fe"} Dec 05 23:24:21 crc kubenswrapper[4747]: I1205 23:24:21.431910 4747 scope.go:117] "RemoveContainer" containerID="b91438c8075f01fbdc30fab0833a0c968f2a1d1001d269493ca13f1eab9d9cf4" Dec 05 23:24:21 crc kubenswrapper[4747]: I1205 23:24:21.431994 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tv74m" Dec 05 23:24:21 crc kubenswrapper[4747]: I1205 23:24:21.480543 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tv74m"] Dec 05 23:24:21 crc kubenswrapper[4747]: I1205 23:24:21.488315 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tv74m"] Dec 05 23:24:21 crc kubenswrapper[4747]: I1205 23:24:21.493238 4747 scope.go:117] "RemoveContainer" containerID="9bf5921f05a0d37a8ac1b202cf41814192255747a0dde36019da6a4d9b1d32fe" Dec 05 23:24:21 crc kubenswrapper[4747]: I1205 23:24:21.537884 4747 scope.go:117] "RemoveContainer" containerID="8b703f129343b5e14672a78c595d318f486d6503695107c48d5b8316a209f483" Dec 05 23:24:21 crc kubenswrapper[4747]: I1205 23:24:21.865890 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62fce55e-4d10-410d-839a-ac301e5d1190" path="/var/lib/kubelet/pods/62fce55e-4d10-410d-839a-ac301e5d1190/volumes" Dec 05 23:25:36 crc kubenswrapper[4747]: I1205 23:25:36.221661 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:25:36 crc kubenswrapper[4747]: I1205 23:25:36.223760 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:26:06 crc kubenswrapper[4747]: I1205 23:26:06.222434 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:26:06 crc kubenswrapper[4747]: I1205 23:26:06.223028 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:26:36 crc kubenswrapper[4747]: I1205 23:26:36.221927 4747 patch_prober.go:28] 
interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:26:36 crc kubenswrapper[4747]: I1205 23:26:36.222567 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:26:36 crc kubenswrapper[4747]: I1205 23:26:36.222628 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 23:26:36 crc kubenswrapper[4747]: I1205 23:26:36.223454 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:26:36 crc kubenswrapper[4747]: I1205 23:26:36.223509 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e" gracePeriod=600 Dec 05 23:26:36 crc kubenswrapper[4747]: E1205 23:26:36.359539 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:26:36 crc kubenswrapper[4747]: I1205 23:26:36.983419 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e" exitCode=0 Dec 05 23:26:36 crc kubenswrapper[4747]: I1205 23:26:36.983462 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"} Dec 05 23:26:36 crc kubenswrapper[4747]: I1205 23:26:36.983495 4747 scope.go:117] "RemoveContainer" containerID="70f1a4978d91903a09990a1a3faa7c6b55da64f90b48d275073c095c42aaf7bd" Dec 05 23:26:36 crc kubenswrapper[4747]: I1205 23:26:36.984659 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e" Dec 05 23:26:36 crc kubenswrapper[4747]: E1205 23:26:36.985238 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:26:50 crc kubenswrapper[4747]: I1205 23:26:50.839618 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e" Dec 05 23:26:50 crc kubenswrapper[4747]: E1205 23:26:50.840947 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:27:05 crc kubenswrapper[4747]: I1205 23:27:05.840417 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e" Dec 05 23:27:05 crc kubenswrapper[4747]: E1205 23:27:05.841872 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:27:20 crc kubenswrapper[4747]: I1205 23:27:20.841883 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e" Dec 05 23:27:20 crc kubenswrapper[4747]: E1205 23:27:20.843278 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:27:35 crc kubenswrapper[4747]: I1205 23:27:35.839828 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e" Dec 05 23:27:35 crc kubenswrapper[4747]: E1205 23:27:35.840752 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:27:45 crc kubenswrapper[4747]: I1205 23:27:45.747420 4747 generic.go:334] "Generic (PLEG): container finished" podID="29780f9c-1925-4bc3-9a1d-24f9723dbcb9" containerID="8405bb1a38771702cdee21317f26972b041a3c71ad07466ca9037a3cc41b7da9" exitCode=0 Dec 05 23:27:45 crc kubenswrapper[4747]: I1205 23:27:45.747534 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" event={"ID":"29780f9c-1925-4bc3-9a1d-24f9723dbcb9","Type":"ContainerDied","Data":"8405bb1a38771702cdee21317f26972b041a3c71ad07466ca9037a3cc41b7da9"} Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.306140 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.466008 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-inventory\") pod \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.466184 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-compute-config-0\") pod \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.466331 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cells-global-config-0\") pod \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.466434 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-compute-config-1\") pod \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.466550 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-migration-ssh-key-1\") pod \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.466647 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-ssh-key\") pod \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.466692 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88sr2\" (UniqueName: \"kubernetes.io/projected/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-kube-api-access-88sr2\") pod \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.466758 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-migration-ssh-key-0\") pod \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.466827 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-combined-ca-bundle\") pod \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\" (UID: \"29780f9c-1925-4bc3-9a1d-24f9723dbcb9\") " Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.474925 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-kube-api-access-88sr2" (OuterVolumeSpecName: "kube-api-access-88sr2") pod "29780f9c-1925-4bc3-9a1d-24f9723dbcb9" (UID: "29780f9c-1925-4bc3-9a1d-24f9723dbcb9"). InnerVolumeSpecName "kube-api-access-88sr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.474925 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "29780f9c-1925-4bc3-9a1d-24f9723dbcb9" (UID: "29780f9c-1925-4bc3-9a1d-24f9723dbcb9"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.497223 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "29780f9c-1925-4bc3-9a1d-24f9723dbcb9" (UID: "29780f9c-1925-4bc3-9a1d-24f9723dbcb9"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.498743 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "29780f9c-1925-4bc3-9a1d-24f9723dbcb9" (UID: "29780f9c-1925-4bc3-9a1d-24f9723dbcb9"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.500615 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "29780f9c-1925-4bc3-9a1d-24f9723dbcb9" (UID: "29780f9c-1925-4bc3-9a1d-24f9723dbcb9"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.508901 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "29780f9c-1925-4bc3-9a1d-24f9723dbcb9" (UID: "29780f9c-1925-4bc3-9a1d-24f9723dbcb9"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.512937 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "29780f9c-1925-4bc3-9a1d-24f9723dbcb9" (UID: "29780f9c-1925-4bc3-9a1d-24f9723dbcb9"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.516318 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-inventory" (OuterVolumeSpecName: "inventory") pod "29780f9c-1925-4bc3-9a1d-24f9723dbcb9" (UID: "29780f9c-1925-4bc3-9a1d-24f9723dbcb9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.519647 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "29780f9c-1925-4bc3-9a1d-24f9723dbcb9" (UID: "29780f9c-1925-4bc3-9a1d-24f9723dbcb9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.569300 4747 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.569355 4747 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.569367 4747 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.569380 4747 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.569393 4747 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.569407 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88sr2\" (UniqueName: \"kubernetes.io/projected/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-kube-api-access-88sr2\") on node \"crc\" DevicePath \"\"" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.569420 4747 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.569432 4747 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.569444 4747 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29780f9c-1925-4bc3-9a1d-24f9723dbcb9-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.766316 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" event={"ID":"29780f9c-1925-4bc3-9a1d-24f9723dbcb9","Type":"ContainerDied","Data":"511be5b0d346afb6ddc327e0b44155902dec094e38023d1066ca608f3c15a943"} Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.766355 4747 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="511be5b0d346afb6ddc327e0b44155902dec094e38023d1066ca608f3c15a943" Dec 05 23:27:47 crc kubenswrapper[4747]: I1205 23:27:47.766440 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m" Dec 05 23:27:50 crc kubenswrapper[4747]: I1205 23:27:50.840373 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e" Dec 05 23:27:50 crc kubenswrapper[4747]: E1205 23:27:50.841148 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:28:05 crc kubenswrapper[4747]: I1205 23:28:05.840724 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e" Dec 05 23:28:05 crc kubenswrapper[4747]: E1205 23:28:05.842808 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:28:20 crc kubenswrapper[4747]: I1205 23:28:20.840854 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e" Dec 05 23:28:20 crc kubenswrapper[4747]: E1205 23:28:20.841826 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:28:31 crc kubenswrapper[4747]: I1205 23:28:31.843562 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e" Dec 05 23:28:31 crc kubenswrapper[4747]: E1205 23:28:31.844363 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:28:45 crc kubenswrapper[4747]: I1205 23:28:45.840548 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e" Dec 05 23:28:45 crc kubenswrapper[4747]: E1205 23:28:45.842816 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.195617 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6k54z"] Dec 05 23:28:57 crc kubenswrapper[4747]: E1205 23:28:57.197152 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62fce55e-4d10-410d-839a-ac301e5d1190" containerName="extract-utilities" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.197182 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fce55e-4d10-410d-839a-ac301e5d1190" containerName="extract-utilities" Dec 05 23:28:57 crc kubenswrapper[4747]: E1205 23:28:57.197231 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62fce55e-4d10-410d-839a-ac301e5d1190" containerName="registry-server" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.197249 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fce55e-4d10-410d-839a-ac301e5d1190" containerName="registry-server" Dec 05 23:28:57 crc kubenswrapper[4747]: E1205 23:28:57.197280 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62fce55e-4d10-410d-839a-ac301e5d1190" containerName="extract-content" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.197294 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fce55e-4d10-410d-839a-ac301e5d1190" containerName="extract-content" Dec 05 23:28:57 crc kubenswrapper[4747]: E1205 23:28:57.197333 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29780f9c-1925-4bc3-9a1d-24f9723dbcb9" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.197348 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="29780f9c-1925-4bc3-9a1d-24f9723dbcb9" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.197781 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="62fce55e-4d10-410d-839a-ac301e5d1190" containerName="registry-server" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.197841 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="29780f9c-1925-4bc3-9a1d-24f9723dbcb9" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.201070 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6k54z" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.212941 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6k54z"] Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.242498 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20400dcb-78f6-4d79-8ced-a26762140a8e-utilities\") pod \"redhat-marketplace-6k54z\" (UID: \"20400dcb-78f6-4d79-8ced-a26762140a8e\") " pod="openshift-marketplace/redhat-marketplace-6k54z" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.243895 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20400dcb-78f6-4d79-8ced-a26762140a8e-catalog-content\") pod \"redhat-marketplace-6k54z\" (UID: \"20400dcb-78f6-4d79-8ced-a26762140a8e\") " pod="openshift-marketplace/redhat-marketplace-6k54z" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.244006 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29dcs\" (UniqueName: \"kubernetes.io/projected/20400dcb-78f6-4d79-8ced-a26762140a8e-kube-api-access-29dcs\") pod \"redhat-marketplace-6k54z\" (UID: \"20400dcb-78f6-4d79-8ced-a26762140a8e\") " pod="openshift-marketplace/redhat-marketplace-6k54z" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.345734 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20400dcb-78f6-4d79-8ced-a26762140a8e-utilities\") pod \"redhat-marketplace-6k54z\" (UID: \"20400dcb-78f6-4d79-8ced-a26762140a8e\") " pod="openshift-marketplace/redhat-marketplace-6k54z" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.345814 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20400dcb-78f6-4d79-8ced-a26762140a8e-catalog-content\") pod \"redhat-marketplace-6k54z\" (UID: \"20400dcb-78f6-4d79-8ced-a26762140a8e\") " pod="openshift-marketplace/redhat-marketplace-6k54z" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.345878 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29dcs\" (UniqueName: \"kubernetes.io/projected/20400dcb-78f6-4d79-8ced-a26762140a8e-kube-api-access-29dcs\") pod \"redhat-marketplace-6k54z\" (UID: \"20400dcb-78f6-4d79-8ced-a26762140a8e\") " pod="openshift-marketplace/redhat-marketplace-6k54z" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.346448 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20400dcb-78f6-4d79-8ced-a26762140a8e-utilities\") pod \"redhat-marketplace-6k54z\" (UID: \"20400dcb-78f6-4d79-8ced-a26762140a8e\") " pod="openshift-marketplace/redhat-marketplace-6k54z" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.346480 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20400dcb-78f6-4d79-8ced-a26762140a8e-catalog-content\") pod \"redhat-marketplace-6k54z\" (UID: \"20400dcb-78f6-4d79-8ced-a26762140a8e\") " pod="openshift-marketplace/redhat-marketplace-6k54z" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.381598 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-29dcs\" (UniqueName: \"kubernetes.io/projected/20400dcb-78f6-4d79-8ced-a26762140a8e-kube-api-access-29dcs\") pod \"redhat-marketplace-6k54z\" (UID: \"20400dcb-78f6-4d79-8ced-a26762140a8e\") " pod="openshift-marketplace/redhat-marketplace-6k54z" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.529505 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6k54z" Dec 05 23:28:57 crc kubenswrapper[4747]: I1205 23:28:57.840493 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e" Dec 05 23:28:57 crc kubenswrapper[4747]: E1205 23:28:57.840918 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:28:58 crc kubenswrapper[4747]: I1205 23:28:58.029755 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6k54z"] Dec 05 23:28:58 crc kubenswrapper[4747]: W1205 23:28:58.034758 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20400dcb_78f6_4d79_8ced_a26762140a8e.slice/crio-abc1d31b9d10c4b4c51e4a0eabbb7d5e087937eb700d0a9fbb3b7eb67474fb8c WatchSource:0}: Error finding container abc1d31b9d10c4b4c51e4a0eabbb7d5e087937eb700d0a9fbb3b7eb67474fb8c: Status 404 returned error can't find the container with id abc1d31b9d10c4b4c51e4a0eabbb7d5e087937eb700d0a9fbb3b7eb67474fb8c Dec 05 23:28:58 crc kubenswrapper[4747]: I1205 23:28:58.522610 4747 generic.go:334] "Generic (PLEG): container finished" podID="20400dcb-78f6-4d79-8ced-a26762140a8e" containerID="c9afd47bcbae299578e0e13ab3ea7f2f4b5eae310d36dcf61aea7de7589f3e38" exitCode=0 Dec 05 23:28:58 crc kubenswrapper[4747]: I1205 23:28:58.522728 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6k54z" event={"ID":"20400dcb-78f6-4d79-8ced-a26762140a8e","Type":"ContainerDied","Data":"c9afd47bcbae299578e0e13ab3ea7f2f4b5eae310d36dcf61aea7de7589f3e38"} Dec 05 23:28:58 crc kubenswrapper[4747]: I1205 23:28:58.523020 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6k54z" event={"ID":"20400dcb-78f6-4d79-8ced-a26762140a8e","Type":"ContainerStarted","Data":"abc1d31b9d10c4b4c51e4a0eabbb7d5e087937eb700d0a9fbb3b7eb67474fb8c"} Dec 05 23:28:58 crc kubenswrapper[4747]: I1205 23:28:58.526406 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 23:28:59 crc kubenswrapper[4747]: I1205 23:28:59.535992 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6k54z" event={"ID":"20400dcb-78f6-4d79-8ced-a26762140a8e","Type":"ContainerStarted","Data":"d5d82123940ecfef26236c088bf55333c6026dd64c1975c0111496c9ed76af48"} Dec 05 23:29:00 crc kubenswrapper[4747]: I1205 23:29:00.551215 4747 generic.go:334] "Generic (PLEG): container finished" podID="20400dcb-78f6-4d79-8ced-a26762140a8e" containerID="d5d82123940ecfef26236c088bf55333c6026dd64c1975c0111496c9ed76af48" exitCode=0 Dec 05 23:29:00 crc kubenswrapper[4747]: I1205 
23:29:00.551317 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6k54z" event={"ID":"20400dcb-78f6-4d79-8ced-a26762140a8e","Type":"ContainerDied","Data":"d5d82123940ecfef26236c088bf55333c6026dd64c1975c0111496c9ed76af48"}
Dec 05 23:29:01 crc kubenswrapper[4747]: I1205 23:29:01.562264 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6k54z" event={"ID":"20400dcb-78f6-4d79-8ced-a26762140a8e","Type":"ContainerStarted","Data":"d586e773e2249a234cbe54991bc80debb9f31fbe43a0b10ba508b5e47d568077"}
Dec 05 23:29:01 crc kubenswrapper[4747]: I1205 23:29:01.579936 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6k54z" podStartSLOduration=2.153171548 podStartE2EDuration="4.579921539s" podCreationTimestamp="2025-12-05 23:28:57 +0000 UTC" firstStartedPulling="2025-12-05 23:28:58.526192129 +0000 UTC m=+10008.993499617" lastFinishedPulling="2025-12-05 23:29:00.95294212 +0000 UTC m=+10011.420249608" observedRunningTime="2025-12-05 23:29:01.579266582 +0000 UTC m=+10012.046574070" watchObservedRunningTime="2025-12-05 23:29:01.579921539 +0000 UTC m=+10012.047229027"
Dec 05 23:29:07 crc kubenswrapper[4747]: I1205 23:29:07.530051 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6k54z"
Dec 05 23:29:07 crc kubenswrapper[4747]: I1205 23:29:07.530596 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6k54z"
Dec 05 23:29:07 crc kubenswrapper[4747]: I1205 23:29:07.604714 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6k54z"
Dec 05 23:29:07 crc kubenswrapper[4747]: I1205 23:29:07.678090 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6k54z"
Dec 05 23:29:07 crc kubenswrapper[4747]: I1205 23:29:07.856321 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6k54z"]
Dec 05 23:29:09 crc kubenswrapper[4747]: I1205 23:29:09.641764 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6k54z" podUID="20400dcb-78f6-4d79-8ced-a26762140a8e" containerName="registry-server" containerID="cri-o://d586e773e2249a234cbe54991bc80debb9f31fbe43a0b10ba508b5e47d568077" gracePeriod=2
Dec 05 23:29:10 crc kubenswrapper[4747]: I1205 23:29:10.653925 4747 generic.go:334] "Generic (PLEG): container finished" podID="20400dcb-78f6-4d79-8ced-a26762140a8e" containerID="d586e773e2249a234cbe54991bc80debb9f31fbe43a0b10ba508b5e47d568077" exitCode=0
Dec 05 23:29:10 crc kubenswrapper[4747]: I1205 23:29:10.654008 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6k54z" event={"ID":"20400dcb-78f6-4d79-8ced-a26762140a8e","Type":"ContainerDied","Data":"d586e773e2249a234cbe54991bc80debb9f31fbe43a0b10ba508b5e47d568077"}
Dec 05 23:29:10 crc kubenswrapper[4747]: I1205 23:29:10.654572 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6k54z" event={"ID":"20400dcb-78f6-4d79-8ced-a26762140a8e","Type":"ContainerDied","Data":"abc1d31b9d10c4b4c51e4a0eabbb7d5e087937eb700d0a9fbb3b7eb67474fb8c"}
Dec 05 23:29:10 crc kubenswrapper[4747]: I1205 23:29:10.654607 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abc1d31b9d10c4b4c51e4a0eabbb7d5e087937eb700d0a9fbb3b7eb67474fb8c"
Dec 05 23:29:10 crc kubenswrapper[4747]: I1205 23:29:10.670320 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6k54z"
Dec 05 23:29:10 crc kubenswrapper[4747]: I1205 23:29:10.838822 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20400dcb-78f6-4d79-8ced-a26762140a8e-catalog-content\") pod \"20400dcb-78f6-4d79-8ced-a26762140a8e\" (UID: \"20400dcb-78f6-4d79-8ced-a26762140a8e\") "
Dec 05 23:29:10 crc kubenswrapper[4747]: I1205 23:29:10.839246 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29dcs\" (UniqueName: \"kubernetes.io/projected/20400dcb-78f6-4d79-8ced-a26762140a8e-kube-api-access-29dcs\") pod \"20400dcb-78f6-4d79-8ced-a26762140a8e\" (UID: \"20400dcb-78f6-4d79-8ced-a26762140a8e\") "
Dec 05 23:29:10 crc kubenswrapper[4747]: I1205 23:29:10.839356 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20400dcb-78f6-4d79-8ced-a26762140a8e-utilities\") pod \"20400dcb-78f6-4d79-8ced-a26762140a8e\" (UID: \"20400dcb-78f6-4d79-8ced-a26762140a8e\") "
Dec 05 23:29:10 crc kubenswrapper[4747]: I1205 23:29:10.840109 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20400dcb-78f6-4d79-8ced-a26762140a8e-utilities" (OuterVolumeSpecName: "utilities") pod "20400dcb-78f6-4d79-8ced-a26762140a8e" (UID: "20400dcb-78f6-4d79-8ced-a26762140a8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 23:29:10 crc kubenswrapper[4747]: I1205 23:29:10.840364 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20400dcb-78f6-4d79-8ced-a26762140a8e-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 23:29:10 crc kubenswrapper[4747]: I1205 23:29:10.850969 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20400dcb-78f6-4d79-8ced-a26762140a8e-kube-api-access-29dcs" (OuterVolumeSpecName: "kube-api-access-29dcs") pod "20400dcb-78f6-4d79-8ced-a26762140a8e" (UID: "20400dcb-78f6-4d79-8ced-a26762140a8e"). InnerVolumeSpecName "kube-api-access-29dcs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 23:29:10 crc kubenswrapper[4747]: I1205 23:29:10.864050 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20400dcb-78f6-4d79-8ced-a26762140a8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20400dcb-78f6-4d79-8ced-a26762140a8e" (UID: "20400dcb-78f6-4d79-8ced-a26762140a8e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 23:29:10 crc kubenswrapper[4747]: I1205 23:29:10.943004 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20400dcb-78f6-4d79-8ced-a26762140a8e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 23:29:10 crc kubenswrapper[4747]: I1205 23:29:10.943036 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29dcs\" (UniqueName: \"kubernetes.io/projected/20400dcb-78f6-4d79-8ced-a26762140a8e-kube-api-access-29dcs\") on node \"crc\" DevicePath \"\""
Dec 05 23:29:11 crc kubenswrapper[4747]: I1205 23:29:11.664341 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6k54z"
Dec 05 23:29:11 crc kubenswrapper[4747]: I1205 23:29:11.717051 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6k54z"]
Dec 05 23:29:11 crc kubenswrapper[4747]: I1205 23:29:11.730351 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6k54z"]
Dec 05 23:29:11 crc kubenswrapper[4747]: I1205 23:29:11.839765 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"
Dec 05 23:29:11 crc kubenswrapper[4747]: E1205 23:29:11.840234 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:29:11 crc kubenswrapper[4747]: I1205 23:29:11.855296 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20400dcb-78f6-4d79-8ced-a26762140a8e" path="/var/lib/kubelet/pods/20400dcb-78f6-4d79-8ced-a26762140a8e/volumes"
Dec 05 23:29:22 crc kubenswrapper[4747]: I1205 23:29:22.839756 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"
Dec 05 23:29:22 crc kubenswrapper[4747]: E1205 23:29:22.840531 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:29:32 crc kubenswrapper[4747]: I1205 23:29:32.481904 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"]
Dec 05 23:29:32 crc kubenswrapper[4747]: I1205 23:29:32.483950 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="b7e91ce7-cd06-4b55-8544-096e74925dca" containerName="adoption" containerID="cri-o://c54f925012dec54cc1d3123accfc64c0f8edc1343eb74cf1f0f2dcde19a4071d" gracePeriod=30
Dec 05 23:29:36 crc kubenswrapper[4747]: I1205 23:29:36.840341 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"
Dec 05 23:29:36 crc kubenswrapper[4747]: E1205 23:29:36.842277 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:29:48 crc kubenswrapper[4747]: I1205 23:29:48.840128 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"
Dec 05 23:29:48 crc kubenswrapper[4747]: E1205 23:29:48.841131 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.182990 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"]
Dec 05 23:30:00 crc kubenswrapper[4747]: E1205 23:30:00.183982 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20400dcb-78f6-4d79-8ced-a26762140a8e" containerName="extract-utilities"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.183996 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="20400dcb-78f6-4d79-8ced-a26762140a8e" containerName="extract-utilities"
Dec 05 23:30:00 crc kubenswrapper[4747]: E1205 23:30:00.184019 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20400dcb-78f6-4d79-8ced-a26762140a8e" containerName="registry-server"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.184025 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="20400dcb-78f6-4d79-8ced-a26762140a8e" containerName="registry-server"
Dec 05 23:30:00 crc kubenswrapper[4747]: E1205 23:30:00.184047 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20400dcb-78f6-4d79-8ced-a26762140a8e" containerName="extract-content"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.184054 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="20400dcb-78f6-4d79-8ced-a26762140a8e" containerName="extract-content"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.184270 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="20400dcb-78f6-4d79-8ced-a26762140a8e" containerName="registry-server"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.185038 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.187709 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.187750 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.193335 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"]
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.271560 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z57bh\" (UniqueName: \"kubernetes.io/projected/ddee0d54-fedd-4fb7-9365-0116ec61e549-kube-api-access-z57bh\") pod \"collect-profiles-29416290-4z5z4\" (UID: \"ddee0d54-fedd-4fb7-9365-0116ec61e549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.271626 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddee0d54-fedd-4fb7-9365-0116ec61e549-config-volume\") pod \"collect-profiles-29416290-4z5z4\" (UID: \"ddee0d54-fedd-4fb7-9365-0116ec61e549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.271700 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ddee0d54-fedd-4fb7-9365-0116ec61e549-secret-volume\") pod \"collect-profiles-29416290-4z5z4\" (UID: \"ddee0d54-fedd-4fb7-9365-0116ec61e549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.374119 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z57bh\" (UniqueName: \"kubernetes.io/projected/ddee0d54-fedd-4fb7-9365-0116ec61e549-kube-api-access-z57bh\") pod \"collect-profiles-29416290-4z5z4\" (UID: \"ddee0d54-fedd-4fb7-9365-0116ec61e549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.374462 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddee0d54-fedd-4fb7-9365-0116ec61e549-config-volume\") pod \"collect-profiles-29416290-4z5z4\" (UID: \"ddee0d54-fedd-4fb7-9365-0116ec61e549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.374577 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ddee0d54-fedd-4fb7-9365-0116ec61e549-secret-volume\") pod \"collect-profiles-29416290-4z5z4\" (UID: \"ddee0d54-fedd-4fb7-9365-0116ec61e549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.375349 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddee0d54-fedd-4fb7-9365-0116ec61e549-config-volume\") pod \"collect-profiles-29416290-4z5z4\" (UID: \"ddee0d54-fedd-4fb7-9365-0116ec61e549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.380071 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ddee0d54-fedd-4fb7-9365-0116ec61e549-secret-volume\") pod \"collect-profiles-29416290-4z5z4\" (UID: \"ddee0d54-fedd-4fb7-9365-0116ec61e549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"
Dec 05 23:30:00 crc kubenswrapper[4747]: I1205 23:30:00.839431 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"
Dec 05 23:30:00 crc kubenswrapper[4747]: E1205 23:30:00.840395 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:30:01 crc kubenswrapper[4747]: I1205 23:30:01.182427 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z57bh\" (UniqueName: \"kubernetes.io/projected/ddee0d54-fedd-4fb7-9365-0116ec61e549-kube-api-access-z57bh\") pod \"collect-profiles-29416290-4z5z4\" (UID: \"ddee0d54-fedd-4fb7-9365-0116ec61e549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"
Dec 05 23:30:01 crc kubenswrapper[4747]: I1205 23:30:01.439458 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"
Dec 05 23:30:01 crc kubenswrapper[4747]: I1205 23:30:01.895972 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"]
Dec 05 23:30:02 crc kubenswrapper[4747]: I1205 23:30:02.258134 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4" event={"ID":"ddee0d54-fedd-4fb7-9365-0116ec61e549","Type":"ContainerStarted","Data":"3bccae2160e7d5ae299a0c58cbca4425c715cdf2acc15b3553bbb0aa76e57316"}
Dec 05 23:30:02 crc kubenswrapper[4747]: I1205 23:30:02.258429 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4" event={"ID":"ddee0d54-fedd-4fb7-9365-0116ec61e549","Type":"ContainerStarted","Data":"90fdaafaf1129b7ca6f6c2c0f5ff5f6cd96c6920b6e6f58c21511609c1d90452"}
Dec 05 23:30:02 crc kubenswrapper[4747]: I1205 23:30:02.278671 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4" podStartSLOduration=2.278572789 podStartE2EDuration="2.278572789s" podCreationTimestamp="2025-12-05 23:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:30:02.271490394 +0000 UTC m=+10072.738797892" watchObservedRunningTime="2025-12-05 23:30:02.278572789 +0000 UTC m=+10072.745880277"
Dec 05 23:30:03 crc kubenswrapper[4747]: I1205 23:30:03.290724 4747 generic.go:334] "Generic (PLEG): container finished" podID="b7e91ce7-cd06-4b55-8544-096e74925dca" containerID="c54f925012dec54cc1d3123accfc64c0f8edc1343eb74cf1f0f2dcde19a4071d" exitCode=137
Dec 05 23:30:03 crc kubenswrapper[4747]: I1205 23:30:03.291038 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"b7e91ce7-cd06-4b55-8544-096e74925dca","Type":"ContainerDied","Data":"c54f925012dec54cc1d3123accfc64c0f8edc1343eb74cf1f0f2dcde19a4071d"}
Dec 05 23:30:03 crc kubenswrapper[4747]: I1205 23:30:03.294664 4747 generic.go:334] "Generic (PLEG): container finished" podID="ddee0d54-fedd-4fb7-9365-0116ec61e549" containerID="3bccae2160e7d5ae299a0c58cbca4425c715cdf2acc15b3553bbb0aa76e57316" exitCode=0
Dec 05 23:30:03 crc kubenswrapper[4747]: I1205 23:30:03.294717 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4" event={"ID":"ddee0d54-fedd-4fb7-9365-0116ec61e549","Type":"ContainerDied","Data":"3bccae2160e7d5ae299a0c58cbca4425c715cdf2acc15b3553bbb0aa76e57316"}
Dec 05 23:30:03 crc kubenswrapper[4747]: I1205 23:30:03.420148 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Dec 05 23:30:03 crc kubenswrapper[4747]: I1205 23:30:03.550703 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmzng\" (UniqueName: \"kubernetes.io/projected/b7e91ce7-cd06-4b55-8544-096e74925dca-kube-api-access-mmzng\") pod \"b7e91ce7-cd06-4b55-8544-096e74925dca\" (UID: \"b7e91ce7-cd06-4b55-8544-096e74925dca\") "
Dec 05 23:30:03 crc kubenswrapper[4747]: I1205 23:30:03.551859 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290\") pod \"b7e91ce7-cd06-4b55-8544-096e74925dca\" (UID: \"b7e91ce7-cd06-4b55-8544-096e74925dca\") "
Dec 05 23:30:03 crc kubenswrapper[4747]: I1205 23:30:03.557451 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7e91ce7-cd06-4b55-8544-096e74925dca-kube-api-access-mmzng" (OuterVolumeSpecName: "kube-api-access-mmzng") pod "b7e91ce7-cd06-4b55-8544-096e74925dca" (UID: "b7e91ce7-cd06-4b55-8544-096e74925dca"). InnerVolumeSpecName "kube-api-access-mmzng". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 23:30:03 crc kubenswrapper[4747]: I1205 23:30:03.575309 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290" (OuterVolumeSpecName: "mariadb-data") pod "b7e91ce7-cd06-4b55-8544-096e74925dca" (UID: "b7e91ce7-cd06-4b55-8544-096e74925dca"). InnerVolumeSpecName "pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 05 23:30:03 crc kubenswrapper[4747]: I1205 23:30:03.654290 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmzng\" (UniqueName: \"kubernetes.io/projected/b7e91ce7-cd06-4b55-8544-096e74925dca-kube-api-access-mmzng\") on node \"crc\" DevicePath \"\""
Dec 05 23:30:03 crc kubenswrapper[4747]: I1205 23:30:03.654345 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290\") on node \"crc\" "
Dec 05 23:30:03 crc kubenswrapper[4747]: I1205 23:30:03.679084 4747 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Dec 05 23:30:03 crc kubenswrapper[4747]: I1205 23:30:03.679296 4747 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290") on node "crc"
Dec 05 23:30:03 crc kubenswrapper[4747]: I1205 23:30:03.755740 4747 reconciler_common.go:293] "Volume detached for volume \"pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3eedff9-ec6e-40c7-adae-fcc04c831290\") on node \"crc\" DevicePath \"\""
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.310560 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.311883 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"b7e91ce7-cd06-4b55-8544-096e74925dca","Type":"ContainerDied","Data":"a3d07c686820eec840fa521463955d6781c73cfe9952da1c8e7045058955f956"}
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.311930 4747 scope.go:117] "RemoveContainer" containerID="c54f925012dec54cc1d3123accfc64c0f8edc1343eb74cf1f0f2dcde19a4071d"
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.340977 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"]
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.354142 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"]
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.808103 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.882825 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z57bh\" (UniqueName: \"kubernetes.io/projected/ddee0d54-fedd-4fb7-9365-0116ec61e549-kube-api-access-z57bh\") pod \"ddee0d54-fedd-4fb7-9365-0116ec61e549\" (UID: \"ddee0d54-fedd-4fb7-9365-0116ec61e549\") "
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.882895 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ddee0d54-fedd-4fb7-9365-0116ec61e549-secret-volume\") pod \"ddee0d54-fedd-4fb7-9365-0116ec61e549\" (UID: \"ddee0d54-fedd-4fb7-9365-0116ec61e549\") "
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.882975 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddee0d54-fedd-4fb7-9365-0116ec61e549-config-volume\") pod \"ddee0d54-fedd-4fb7-9365-0116ec61e549\" (UID: \"ddee0d54-fedd-4fb7-9365-0116ec61e549\") "
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.883857 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddee0d54-fedd-4fb7-9365-0116ec61e549-config-volume" (OuterVolumeSpecName: "config-volume") pod "ddee0d54-fedd-4fb7-9365-0116ec61e549" (UID: "ddee0d54-fedd-4fb7-9365-0116ec61e549"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.889779 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddee0d54-fedd-4fb7-9365-0116ec61e549-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ddee0d54-fedd-4fb7-9365-0116ec61e549" (UID: "ddee0d54-fedd-4fb7-9365-0116ec61e549"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.889851 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddee0d54-fedd-4fb7-9365-0116ec61e549-kube-api-access-z57bh" (OuterVolumeSpecName: "kube-api-access-z57bh") pod "ddee0d54-fedd-4fb7-9365-0116ec61e549" (UID: "ddee0d54-fedd-4fb7-9365-0116ec61e549"). InnerVolumeSpecName "kube-api-access-z57bh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.959609 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd"]
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.972557 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416245-c92qd"]
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.994018 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z57bh\" (UniqueName: \"kubernetes.io/projected/ddee0d54-fedd-4fb7-9365-0116ec61e549-kube-api-access-z57bh\") on node \"crc\" DevicePath \"\""
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.994084 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ddee0d54-fedd-4fb7-9365-0116ec61e549-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 05 23:30:04 crc kubenswrapper[4747]: I1205 23:30:04.994098 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddee0d54-fedd-4fb7-9365-0116ec61e549-config-volume\") on node \"crc\" DevicePath \"\""
Dec 05 23:30:05 crc kubenswrapper[4747]: I1205 23:30:05.018566 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"]
Dec 05 23:30:05 crc kubenswrapper[4747]: I1205 23:30:05.320025 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4"
Dec 05 23:30:05 crc kubenswrapper[4747]: I1205 23:30:05.320024 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-4z5z4" event={"ID":"ddee0d54-fedd-4fb7-9365-0116ec61e549","Type":"ContainerDied","Data":"90fdaafaf1129b7ca6f6c2c0f5ff5f6cd96c6920b6e6f58c21511609c1d90452"}
Dec 05 23:30:05 crc kubenswrapper[4747]: I1205 23:30:05.321568 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90fdaafaf1129b7ca6f6c2c0f5ff5f6cd96c6920b6e6f58c21511609c1d90452"
Dec 05 23:30:05 crc kubenswrapper[4747]: I1205 23:30:05.327936 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="d8133d84-9dd2-4780-9414-ae4e7b78884d" containerName="adoption" containerID="cri-o://69fedae6faa4293d62267877273722542dfaffcdf5d29143032a646e359ee267" gracePeriod=30
Dec 05 23:30:05 crc kubenswrapper[4747]: I1205 23:30:05.863917 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87085a8d-5db8-44df-9b7f-8dfa467fca76" path="/var/lib/kubelet/pods/87085a8d-5db8-44df-9b7f-8dfa467fca76/volumes"
Dec 05 23:30:05 crc kubenswrapper[4747]: I1205 23:30:05.864693 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7e91ce7-cd06-4b55-8544-096e74925dca" path="/var/lib/kubelet/pods/b7e91ce7-cd06-4b55-8544-096e74925dca/volumes"
Dec 05 23:30:13 crc kubenswrapper[4747]: I1205 23:30:13.840397 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"
Dec 05 23:30:13 crc kubenswrapper[4747]: E1205 23:30:13.841206 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:30:25 crc kubenswrapper[4747]: I1205 23:30:25.840652 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"
Dec 05 23:30:25 crc kubenswrapper[4747]: E1205 23:30:25.841550 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:30:35 crc kubenswrapper[4747]: I1205 23:30:35.650500 4747 generic.go:334] "Generic (PLEG): container finished" podID="d8133d84-9dd2-4780-9414-ae4e7b78884d" containerID="69fedae6faa4293d62267877273722542dfaffcdf5d29143032a646e359ee267" exitCode=137
Dec 05 23:30:35 crc kubenswrapper[4747]: I1205 23:30:35.650577 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"d8133d84-9dd2-4780-9414-ae4e7b78884d","Type":"ContainerDied","Data":"69fedae6faa4293d62267877273722542dfaffcdf5d29143032a646e359ee267"}
Dec 05 23:30:35 crc kubenswrapper[4747]: I1205 23:30:35.892698 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data"
Dec 05 23:30:35 crc kubenswrapper[4747]: I1205 23:30:35.980092 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf\") pod \"d8133d84-9dd2-4780-9414-ae4e7b78884d\" (UID: \"d8133d84-9dd2-4780-9414-ae4e7b78884d\") "
Dec 05 23:30:35 crc kubenswrapper[4747]: I1205 23:30:35.981331 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdfxk\" (UniqueName: \"kubernetes.io/projected/d8133d84-9dd2-4780-9414-ae4e7b78884d-kube-api-access-kdfxk\") pod \"d8133d84-9dd2-4780-9414-ae4e7b78884d\" (UID: \"d8133d84-9dd2-4780-9414-ae4e7b78884d\") "
Dec 05 23:30:35 crc kubenswrapper[4747]: I1205 23:30:35.982164 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/d8133d84-9dd2-4780-9414-ae4e7b78884d-ovn-data-cert\") pod \"d8133d84-9dd2-4780-9414-ae4e7b78884d\" (UID: \"d8133d84-9dd2-4780-9414-ae4e7b78884d\") "
Dec 05 23:30:35 crc kubenswrapper[4747]: I1205 23:30:35.991235 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8133d84-9dd2-4780-9414-ae4e7b78884d-kube-api-access-kdfxk" (OuterVolumeSpecName: "kube-api-access-kdfxk") pod "d8133d84-9dd2-4780-9414-ae4e7b78884d" (UID: "d8133d84-9dd2-4780-9414-ae4e7b78884d"). InnerVolumeSpecName "kube-api-access-kdfxk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 23:30:35 crc kubenswrapper[4747]: I1205 23:30:35.991536 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8133d84-9dd2-4780-9414-ae4e7b78884d-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "d8133d84-9dd2-4780-9414-ae4e7b78884d" (UID: "d8133d84-9dd2-4780-9414-ae4e7b78884d"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 23:30:36 crc kubenswrapper[4747]: I1205 23:30:36.013702 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf" (OuterVolumeSpecName: "ovn-data") pod "d8133d84-9dd2-4780-9414-ae4e7b78884d" (UID: "d8133d84-9dd2-4780-9414-ae4e7b78884d"). InnerVolumeSpecName "pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 05 23:30:36 crc kubenswrapper[4747]: I1205 23:30:36.084487 4747 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/d8133d84-9dd2-4780-9414-ae4e7b78884d-ovn-data-cert\") on node \"crc\" DevicePath \"\""
Dec 05 23:30:36 crc kubenswrapper[4747]: I1205 23:30:36.084539 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf\") on node \"crc\" "
Dec 05 23:30:36 crc kubenswrapper[4747]: I1205 23:30:36.084551 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdfxk\" (UniqueName: \"kubernetes.io/projected/d8133d84-9dd2-4780-9414-ae4e7b78884d-kube-api-access-kdfxk\") on node \"crc\" DevicePath \"\""
Dec 05 23:30:36 crc kubenswrapper[4747]: I1205 23:30:36.110764 4747 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Dec 05 23:30:36 crc kubenswrapper[4747]: I1205 23:30:36.110894 4747 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf") on node "crc"
Dec 05 23:30:36 crc kubenswrapper[4747]: I1205 23:30:36.186647 4747 reconciler_common.go:293] "Volume detached for volume \"pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9727ffb3-34ff-44c9-8085-02634c2fadaf\") on node \"crc\" DevicePath \"\""
Dec 05 23:30:36 crc kubenswrapper[4747]: I1205 23:30:36.664703 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"d8133d84-9dd2-4780-9414-ae4e7b78884d","Type":"ContainerDied","Data":"dba6ecf5702fd22446b3056fd9564f54c4443e9830a6f0aea4b2a2df23c606c4"}
Dec 05 23:30:36 crc kubenswrapper[4747]: I1205 23:30:36.664800 4747 scope.go:117] "RemoveContainer" containerID="69fedae6faa4293d62267877273722542dfaffcdf5d29143032a646e359ee267"
Dec 05 23:30:36 crc kubenswrapper[4747]: I1205 23:30:36.664749 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data"
Dec 05 23:30:36 crc kubenswrapper[4747]: I1205 23:30:36.720135 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"]
Dec 05 23:30:36 crc kubenswrapper[4747]: I1205 23:30:36.734469 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"]
Dec 05 23:30:36 crc kubenswrapper[4747]: I1205 23:30:36.840485 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"
Dec 05 23:30:36 crc kubenswrapper[4747]: E1205 23:30:36.841534 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:30:37 crc kubenswrapper[4747]: I1205 23:30:37.856994 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8133d84-9dd2-4780-9414-ae4e7b78884d" path="/var/lib/kubelet/pods/d8133d84-9dd2-4780-9414-ae4e7b78884d/volumes"
Dec 05 23:30:39 crc kubenswrapper[4747]: I1205 23:30:39.975186 4747 scope.go:117] "RemoveContainer" containerID="cfdda23a8c197df926091239b32938e66582a660d02fcc6d06f5723ff024d112"
Dec 05 23:30:50 crc kubenswrapper[4747]: I1205 23:30:50.840881 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"
Dec 05 23:30:50 crc kubenswrapper[4747]: E1205 23:30:50.842183 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:31:01 crc kubenswrapper[4747]: I1205 23:31:01.839887 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"
Dec 05 23:31:01 crc kubenswrapper[4747]: E1205 23:31:01.840900 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:31:15 crc kubenswrapper[4747]: I1205 23:31:15.840154 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"
Dec 05 23:31:15 crc kubenswrapper[4747]: E1205 23:31:15.841280 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:31:30 crc kubenswrapper[4747]: I1205 23:31:30.841135 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"
Dec 05 23:31:30 crc kubenswrapper[4747]: E1205 23:31:30.842468 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:31:42 crc kubenswrapper[4747]: I1205 23:31:42.839819 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.257594 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kjlqd/must-gather-q8dkt"]
Dec 05 23:31:43 crc kubenswrapper[4747]: E1205 23:31:43.258473 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e91ce7-cd06-4b55-8544-096e74925dca" containerName="adoption"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.258542 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e91ce7-cd06-4b55-8544-096e74925dca" containerName="adoption"
Dec 05 23:31:43 crc kubenswrapper[4747]: E1205 23:31:43.258626 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddee0d54-fedd-4fb7-9365-0116ec61e549" containerName="collect-profiles"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.258692 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddee0d54-fedd-4fb7-9365-0116ec61e549" containerName="collect-profiles"
Dec 05 23:31:43 crc kubenswrapper[4747]: E1205 23:31:43.258754 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8133d84-9dd2-4780-9414-ae4e7b78884d" containerName="adoption"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.258807 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8133d84-9dd2-4780-9414-ae4e7b78884d" containerName="adoption"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.259061 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddee0d54-fedd-4fb7-9365-0116ec61e549" containerName="collect-profiles"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.259133 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e91ce7-cd06-4b55-8544-096e74925dca" containerName="adoption"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.259198 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8133d84-9dd2-4780-9414-ae4e7b78884d" containerName="adoption"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.260432 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjlqd/must-gather-q8dkt"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.261905 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kjlqd"/"default-dockercfg-l45zg"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.262183 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kjlqd"/"openshift-service-ca.crt"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.262458 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kjlqd"/"kube-root-ca.crt"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.273102 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kjlqd/must-gather-q8dkt"]
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.352637 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plwh8\" (UniqueName: \"kubernetes.io/projected/63ca8613-9c9c-49b1-ade8-e5928fceef4d-kube-api-access-plwh8\") pod \"must-gather-q8dkt\" (UID: \"63ca8613-9c9c-49b1-ade8-e5928fceef4d\") " pod="openshift-must-gather-kjlqd/must-gather-q8dkt"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.352772 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/63ca8613-9c9c-49b1-ade8-e5928fceef4d-must-gather-output\") pod \"must-gather-q8dkt\" (UID: \"63ca8613-9c9c-49b1-ade8-e5928fceef4d\") " pod="openshift-must-gather-kjlqd/must-gather-q8dkt"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.453036 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"a40402e37b5afe2cc52d1df1c1a7f1070d0aacf27bed182b1f1e084040a254bf"}
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.454597 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plwh8\" (UniqueName: \"kubernetes.io/projected/63ca8613-9c9c-49b1-ade8-e5928fceef4d-kube-api-access-plwh8\") pod \"must-gather-q8dkt\" (UID: \"63ca8613-9c9c-49b1-ade8-e5928fceef4d\") " pod="openshift-must-gather-kjlqd/must-gather-q8dkt"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.454685 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/63ca8613-9c9c-49b1-ade8-e5928fceef4d-must-gather-output\") pod \"must-gather-q8dkt\" (UID: \"63ca8613-9c9c-49b1-ade8-e5928fceef4d\") " pod="openshift-must-gather-kjlqd/must-gather-q8dkt"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.455286 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/63ca8613-9c9c-49b1-ade8-e5928fceef4d-must-gather-output\") pod \"must-gather-q8dkt\" (UID: \"63ca8613-9c9c-49b1-ade8-e5928fceef4d\") " pod="openshift-must-gather-kjlqd/must-gather-q8dkt"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.489281 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plwh8\" (UniqueName: \"kubernetes.io/projected/63ca8613-9c9c-49b1-ade8-e5928fceef4d-kube-api-access-plwh8\") pod \"must-gather-q8dkt\" (UID: \"63ca8613-9c9c-49b1-ade8-e5928fceef4d\") " pod="openshift-must-gather-kjlqd/must-gather-q8dkt"
Dec 05 23:31:43 crc kubenswrapper[4747]: I1205 23:31:43.577516 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjlqd/must-gather-q8dkt"
Dec 05 23:31:44 crc kubenswrapper[4747]: I1205 23:31:44.068703 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kjlqd/must-gather-q8dkt"]
Dec 05 23:31:44 crc kubenswrapper[4747]: W1205 23:31:44.074185 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63ca8613_9c9c_49b1_ade8_e5928fceef4d.slice/crio-b03b398fda61ac104d9fed65f20be90fcf2f0c568a2411949141083d1ede2ce9 WatchSource:0}: Error finding container b03b398fda61ac104d9fed65f20be90fcf2f0c568a2411949141083d1ede2ce9: Status 404 returned error can't find the container with id b03b398fda61ac104d9fed65f20be90fcf2f0c568a2411949141083d1ede2ce9
Dec 05 23:31:44 crc kubenswrapper[4747]: I1205 23:31:44.461827 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjlqd/must-gather-q8dkt" event={"ID":"63ca8613-9c9c-49b1-ade8-e5928fceef4d","Type":"ContainerStarted","Data":"b03b398fda61ac104d9fed65f20be90fcf2f0c568a2411949141083d1ede2ce9"}
Dec 05 23:31:49 crc kubenswrapper[4747]: I1205 23:31:49.506601 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjlqd/must-gather-q8dkt" event={"ID":"63ca8613-9c9c-49b1-ade8-e5928fceef4d","Type":"ContainerStarted","Data":"a97b37e79a1cc2fe97da53dbe1e9592a81b65c2b84e8921ef8ce82432c18bb2d"}
Dec 05 23:31:49 crc kubenswrapper[4747]: I1205 23:31:49.507231 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjlqd/must-gather-q8dkt" event={"ID":"63ca8613-9c9c-49b1-ade8-e5928fceef4d","Type":"ContainerStarted","Data":"4654f9fc1eb9e4ef79abfac2b5f73ea5d1b1bc01b7a63cf09c016dc899570d90"}
Dec 05 23:31:49 crc kubenswrapper[4747]: I1205 23:31:49.525031 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kjlqd/must-gather-q8dkt" podStartSLOduration=2.380473945 podStartE2EDuration="6.52501177s" podCreationTimestamp="2025-12-05 23:31:43 +0000 UTC" firstStartedPulling="2025-12-05 23:31:44.075949626 +0000 UTC m=+10174.543257114" lastFinishedPulling="2025-12-05 23:31:48.220487451 +0000 UTC m=+10178.687794939" observedRunningTime="2025-12-05 23:31:49.522703383 +0000 UTC m=+10179.990010891" watchObservedRunningTime="2025-12-05 23:31:49.52501177 +0000 UTC m=+10179.992319258"
Dec 05 23:31:52 crc kubenswrapper[4747]: I1205 23:31:52.692433 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kjlqd/crc-debug-rzq6v"]
Dec 05 23:31:52 crc kubenswrapper[4747]: I1205 23:31:52.703259 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjlqd/crc-debug-rzq6v"
Dec 05 23:31:52 crc kubenswrapper[4747]: I1205 23:31:52.832317 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1039fc26-1c19-45ff-b2a1-92bb39769778-host\") pod \"crc-debug-rzq6v\" (UID: \"1039fc26-1c19-45ff-b2a1-92bb39769778\") " pod="openshift-must-gather-kjlqd/crc-debug-rzq6v"
Dec 05 23:31:52 crc kubenswrapper[4747]: I1205 23:31:52.832366 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks95g\" (UniqueName: \"kubernetes.io/projected/1039fc26-1c19-45ff-b2a1-92bb39769778-kube-api-access-ks95g\") pod \"crc-debug-rzq6v\" (UID: \"1039fc26-1c19-45ff-b2a1-92bb39769778\") " pod="openshift-must-gather-kjlqd/crc-debug-rzq6v"
Dec 05 23:31:52 crc kubenswrapper[4747]: I1205 23:31:52.936511 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1039fc26-1c19-45ff-b2a1-92bb39769778-host\") pod \"crc-debug-rzq6v\" (UID: \"1039fc26-1c19-45ff-b2a1-92bb39769778\") " pod="openshift-must-gather-kjlqd/crc-debug-rzq6v"
Dec 05 23:31:52 crc kubenswrapper[4747]: I1205 23:31:52.936563 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks95g\" (UniqueName: \"kubernetes.io/projected/1039fc26-1c19-45ff-b2a1-92bb39769778-kube-api-access-ks95g\") pod \"crc-debug-rzq6v\" (UID: \"1039fc26-1c19-45ff-b2a1-92bb39769778\") " pod="openshift-must-gather-kjlqd/crc-debug-rzq6v"
Dec 05 23:31:52 crc kubenswrapper[4747]: I1205 23:31:52.936764 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1039fc26-1c19-45ff-b2a1-92bb39769778-host\") pod \"crc-debug-rzq6v\" (UID: \"1039fc26-1c19-45ff-b2a1-92bb39769778\") " pod="openshift-must-gather-kjlqd/crc-debug-rzq6v"
Dec 05 23:31:52 crc kubenswrapper[4747]: I1205 23:31:52.968620 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks95g\" (UniqueName: \"kubernetes.io/projected/1039fc26-1c19-45ff-b2a1-92bb39769778-kube-api-access-ks95g\") pod \"crc-debug-rzq6v\" (UID: \"1039fc26-1c19-45ff-b2a1-92bb39769778\") " pod="openshift-must-gather-kjlqd/crc-debug-rzq6v"
Dec 05 23:31:53 crc kubenswrapper[4747]: I1205 23:31:53.044072 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjlqd/crc-debug-rzq6v"
Dec 05 23:31:53 crc kubenswrapper[4747]: I1205 23:31:53.544893 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjlqd/crc-debug-rzq6v" event={"ID":"1039fc26-1c19-45ff-b2a1-92bb39769778","Type":"ContainerStarted","Data":"8a5ad7aa98ae2c814119a83ed05c2256f9396b58c399cd91aac555da5f9b46a7"}
Dec 05 23:31:54 crc kubenswrapper[4747]: I1205 23:31:54.976547 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7zd76"]
Dec 05 23:31:54 crc kubenswrapper[4747]: I1205 23:31:54.986467 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:31:55 crc kubenswrapper[4747]: I1205 23:31:55.021481 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7zd76"]
Dec 05 23:31:55 crc kubenswrapper[4747]: I1205 23:31:55.085798 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01606703-d9f6-4680-8516-9f68cdfa2145-catalog-content\") pod \"community-operators-7zd76\" (UID: \"01606703-d9f6-4680-8516-9f68cdfa2145\") " pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:31:55 crc kubenswrapper[4747]: I1205 23:31:55.085877 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01606703-d9f6-4680-8516-9f68cdfa2145-utilities\") pod \"community-operators-7zd76\" (UID: \"01606703-d9f6-4680-8516-9f68cdfa2145\") " pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:31:55 crc kubenswrapper[4747]: I1205 23:31:55.085982 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwkwm\" (UniqueName: \"kubernetes.io/projected/01606703-d9f6-4680-8516-9f68cdfa2145-kube-api-access-xwkwm\") pod \"community-operators-7zd76\" (UID: \"01606703-d9f6-4680-8516-9f68cdfa2145\") " pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:31:55 crc kubenswrapper[4747]: I1205 23:31:55.188614 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01606703-d9f6-4680-8516-9f68cdfa2145-catalog-content\") pod \"community-operators-7zd76\" (UID: \"01606703-d9f6-4680-8516-9f68cdfa2145\") " pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:31:55 crc kubenswrapper[4747]: I1205 23:31:55.188670 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01606703-d9f6-4680-8516-9f68cdfa2145-utilities\") pod \"community-operators-7zd76\" (UID: \"01606703-d9f6-4680-8516-9f68cdfa2145\") " pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:31:55 crc kubenswrapper[4747]: I1205 23:31:55.188711 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwkwm\" (UniqueName: \"kubernetes.io/projected/01606703-d9f6-4680-8516-9f68cdfa2145-kube-api-access-xwkwm\") pod \"community-operators-7zd76\" (UID: \"01606703-d9f6-4680-8516-9f68cdfa2145\") " pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:31:55 crc kubenswrapper[4747]: I1205 23:31:55.190225 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01606703-d9f6-4680-8516-9f68cdfa2145-catalog-content\") pod \"community-operators-7zd76\" (UID: \"01606703-d9f6-4680-8516-9f68cdfa2145\") " pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:31:55 crc kubenswrapper[4747]: I1205 23:31:55.190443 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01606703-d9f6-4680-8516-9f68cdfa2145-utilities\") pod \"community-operators-7zd76\" (UID: \"01606703-d9f6-4680-8516-9f68cdfa2145\") " pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:31:55 crc kubenswrapper[4747]: I1205 23:31:55.224483 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwkwm\" (UniqueName: \"kubernetes.io/projected/01606703-d9f6-4680-8516-9f68cdfa2145-kube-api-access-xwkwm\") pod \"community-operators-7zd76\" (UID: \"01606703-d9f6-4680-8516-9f68cdfa2145\") " pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:31:55 crc kubenswrapper[4747]: I1205 23:31:55.345448 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:31:56 crc kubenswrapper[4747]: I1205 23:31:56.028764 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7zd76"]
Dec 05 23:31:56 crc kubenswrapper[4747]: I1205 23:31:56.576526 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zd76" event={"ID":"01606703-d9f6-4680-8516-9f68cdfa2145","Type":"ContainerStarted","Data":"81f360b7cca1fabe3e659cd4fccee3d92150eb75c11a65703f968f1c7e7d346c"}
Dec 05 23:31:58 crc kubenswrapper[4747]: I1205 23:31:58.599532 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zd76" event={"ID":"01606703-d9f6-4680-8516-9f68cdfa2145","Type":"ContainerStarted","Data":"c3b7d5a11d9359920ca10f18b7c34a71492abf3860fdd21e04f2fc55760b1d1e"}
Dec 05 23:31:59 crc kubenswrapper[4747]: I1205 23:31:59.610420 4747 generic.go:334] "Generic (PLEG): container finished" podID="01606703-d9f6-4680-8516-9f68cdfa2145" containerID="c3b7d5a11d9359920ca10f18b7c34a71492abf3860fdd21e04f2fc55760b1d1e" exitCode=0
Dec 05 23:31:59 crc kubenswrapper[4747]: I1205 23:31:59.610742 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zd76" event={"ID":"01606703-d9f6-4680-8516-9f68cdfa2145","Type":"ContainerDied","Data":"c3b7d5a11d9359920ca10f18b7c34a71492abf3860fdd21e04f2fc55760b1d1e"}
Dec 05 23:32:06 crc kubenswrapper[4747]: I1205 23:32:06.696815 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zd76" event={"ID":"01606703-d9f6-4680-8516-9f68cdfa2145","Type":"ContainerStarted","Data":"353d7bb97cea3d8b807bbcc2c999a4fe292f67e6a7802cbae11060e8d46edc72"}
Dec 05 23:32:06 crc kubenswrapper[4747]: I1205 23:32:06.698248 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjlqd/crc-debug-rzq6v" event={"ID":"1039fc26-1c19-45ff-b2a1-92bb39769778","Type":"ContainerStarted","Data":"bb453767569d6252587eb0d27c9f725ba8f4cb817c8226ecfaecbf1a4a7a76b4"}
Dec 05 23:32:06 crc kubenswrapper[4747]: I1205 23:32:06.759059 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kjlqd/crc-debug-rzq6v" podStartSLOduration=2.21909763 podStartE2EDuration="14.759038542s" podCreationTimestamp="2025-12-05 23:31:52 +0000 UTC" firstStartedPulling="2025-12-05 23:31:53.107017799 +0000 UTC m=+10183.574325297" lastFinishedPulling="2025-12-05 23:32:05.646958721 +0000 UTC m=+10196.114266209" observedRunningTime="2025-12-05 23:32:06.748357488 +0000 UTC m=+10197.215664986" watchObservedRunningTime="2025-12-05 23:32:06.759038542 +0000 UTC m=+10197.226346030"
Dec 05 23:32:08 crc kubenswrapper[4747]: I1205 23:32:08.724494 4747 generic.go:334] "Generic (PLEG): container finished" podID="01606703-d9f6-4680-8516-9f68cdfa2145" containerID="353d7bb97cea3d8b807bbcc2c999a4fe292f67e6a7802cbae11060e8d46edc72" exitCode=0
Dec 05 23:32:08 crc kubenswrapper[4747]: I1205 23:32:08.724857 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zd76" event={"ID":"01606703-d9f6-4680-8516-9f68cdfa2145","Type":"ContainerDied","Data":"353d7bb97cea3d8b807bbcc2c999a4fe292f67e6a7802cbae11060e8d46edc72"}
Dec 05 23:32:10 crc kubenswrapper[4747]: I1205 23:32:10.773056 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zd76" event={"ID":"01606703-d9f6-4680-8516-9f68cdfa2145","Type":"ContainerStarted","Data":"be63a28cfe98acb556657428f769fc6e3783cfaee25d3c4839ca7102c34ca686"}
Dec 05 23:32:10 crc kubenswrapper[4747]: I1205 23:32:10.803403 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7zd76" podStartSLOduration=12.83136086 podStartE2EDuration="16.8033787s" podCreationTimestamp="2025-12-05 23:31:54 +0000 UTC" firstStartedPulling="2025-12-05 23:32:05.563545169 +0000 UTC m=+10196.030852657" lastFinishedPulling="2025-12-05 23:32:09.535563019 +0000 UTC m=+10200.002870497" observedRunningTime="2025-12-05 23:32:10.796422068 +0000 UTC m=+10201.263729566" watchObservedRunningTime="2025-12-05 23:32:10.8033787 +0000 UTC m=+10201.270686198"
Dec 05 23:32:15 crc kubenswrapper[4747]: I1205 23:32:15.346432 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:32:15 crc kubenswrapper[4747]: I1205 23:32:15.349264 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:32:15 crc kubenswrapper[4747]: I1205 23:32:15.447126 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:32:15 crc kubenswrapper[4747]: I1205 23:32:15.879411 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:32:16 crc kubenswrapper[4747]: I1205 23:32:16.557276 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7zd76"]
Dec 05 23:32:17 crc kubenswrapper[4747]: I1205 23:32:17.837711 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7zd76" podUID="01606703-d9f6-4680-8516-9f68cdfa2145" containerName="registry-server" containerID="cri-o://be63a28cfe98acb556657428f769fc6e3783cfaee25d3c4839ca7102c34ca686" gracePeriod=2
Dec 05 23:32:18 crc kubenswrapper[4747]: I1205 23:32:18.850243 4747 generic.go:334] "Generic (PLEG): container finished" podID="01606703-d9f6-4680-8516-9f68cdfa2145" containerID="be63a28cfe98acb556657428f769fc6e3783cfaee25d3c4839ca7102c34ca686" exitCode=0
Dec 05 23:32:18 crc kubenswrapper[4747]: I1205 23:32:18.850285 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zd76" event={"ID":"01606703-d9f6-4680-8516-9f68cdfa2145","Type":"ContainerDied","Data":"be63a28cfe98acb556657428f769fc6e3783cfaee25d3c4839ca7102c34ca686"}
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.328228 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.488005 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01606703-d9f6-4680-8516-9f68cdfa2145-catalog-content\") pod \"01606703-d9f6-4680-8516-9f68cdfa2145\" (UID: \"01606703-d9f6-4680-8516-9f68cdfa2145\") "
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.488067 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwkwm\" (UniqueName: \"kubernetes.io/projected/01606703-d9f6-4680-8516-9f68cdfa2145-kube-api-access-xwkwm\") pod \"01606703-d9f6-4680-8516-9f68cdfa2145\" (UID: \"01606703-d9f6-4680-8516-9f68cdfa2145\") "
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.488112 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01606703-d9f6-4680-8516-9f68cdfa2145-utilities\") pod \"01606703-d9f6-4680-8516-9f68cdfa2145\" (UID: \"01606703-d9f6-4680-8516-9f68cdfa2145\") "
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.489013 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01606703-d9f6-4680-8516-9f68cdfa2145-utilities" (OuterVolumeSpecName: "utilities") pod "01606703-d9f6-4680-8516-9f68cdfa2145" (UID: "01606703-d9f6-4680-8516-9f68cdfa2145"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.509905 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01606703-d9f6-4680-8516-9f68cdfa2145-kube-api-access-xwkwm" (OuterVolumeSpecName: "kube-api-access-xwkwm") pod "01606703-d9f6-4680-8516-9f68cdfa2145" (UID: "01606703-d9f6-4680-8516-9f68cdfa2145"). InnerVolumeSpecName "kube-api-access-xwkwm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.553264 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01606703-d9f6-4680-8516-9f68cdfa2145-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01606703-d9f6-4680-8516-9f68cdfa2145" (UID: "01606703-d9f6-4680-8516-9f68cdfa2145"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.590139 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01606703-d9f6-4680-8516-9f68cdfa2145-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.590175 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwkwm\" (UniqueName: \"kubernetes.io/projected/01606703-d9f6-4680-8516-9f68cdfa2145-kube-api-access-xwkwm\") on node \"crc\" DevicePath \"\""
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.590185 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01606703-d9f6-4680-8516-9f68cdfa2145-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.876856 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zd76" event={"ID":"01606703-d9f6-4680-8516-9f68cdfa2145","Type":"ContainerDied","Data":"81f360b7cca1fabe3e659cd4fccee3d92150eb75c11a65703f968f1c7e7d346c"}
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.876903 4747 scope.go:117] "RemoveContainer" containerID="be63a28cfe98acb556657428f769fc6e3783cfaee25d3c4839ca7102c34ca686"
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.876966 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7zd76"
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.897361 4747 scope.go:117] "RemoveContainer" containerID="353d7bb97cea3d8b807bbcc2c999a4fe292f67e6a7802cbae11060e8d46edc72"
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.914708 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7zd76"]
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.937387 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7zd76"]
Dec 05 23:32:20 crc kubenswrapper[4747]: I1205 23:32:20.944727 4747 scope.go:117] "RemoveContainer" containerID="c3b7d5a11d9359920ca10f18b7c34a71492abf3860fdd21e04f2fc55760b1d1e"
Dec 05 23:32:21 crc kubenswrapper[4747]: I1205 23:32:21.851102 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01606703-d9f6-4680-8516-9f68cdfa2145" path="/var/lib/kubelet/pods/01606703-d9f6-4680-8516-9f68cdfa2145/volumes"
Dec 05 23:32:54 crc kubenswrapper[4747]: I1205 23:32:54.218487 4747 generic.go:334] "Generic (PLEG): container finished" podID="1039fc26-1c19-45ff-b2a1-92bb39769778" containerID="bb453767569d6252587eb0d27c9f725ba8f4cb817c8226ecfaecbf1a4a7a76b4" exitCode=0
Dec 05 23:32:54 crc kubenswrapper[4747]: I1205 23:32:54.218561 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjlqd/crc-debug-rzq6v" event={"ID":"1039fc26-1c19-45ff-b2a1-92bb39769778","Type":"ContainerDied","Data":"bb453767569d6252587eb0d27c9f725ba8f4cb817c8226ecfaecbf1a4a7a76b4"}
Dec 05 23:32:55 crc kubenswrapper[4747]: I1205 23:32:55.637117 4747 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-kjlqd/crc-debug-rzq6v" Dec 05 23:32:55 crc kubenswrapper[4747]: I1205 23:32:55.682656 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kjlqd/crc-debug-rzq6v"] Dec 05 23:32:55 crc kubenswrapper[4747]: I1205 23:32:55.692431 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kjlqd/crc-debug-rzq6v"] Dec 05 23:32:55 crc kubenswrapper[4747]: I1205 23:32:55.792820 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks95g\" (UniqueName: \"kubernetes.io/projected/1039fc26-1c19-45ff-b2a1-92bb39769778-kube-api-access-ks95g\") pod \"1039fc26-1c19-45ff-b2a1-92bb39769778\" (UID: \"1039fc26-1c19-45ff-b2a1-92bb39769778\") " Dec 05 23:32:55 crc kubenswrapper[4747]: I1205 23:32:55.793022 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1039fc26-1c19-45ff-b2a1-92bb39769778-host\") pod \"1039fc26-1c19-45ff-b2a1-92bb39769778\" (UID: \"1039fc26-1c19-45ff-b2a1-92bb39769778\") " Dec 05 23:32:55 crc kubenswrapper[4747]: I1205 23:32:55.793150 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1039fc26-1c19-45ff-b2a1-92bb39769778-host" (OuterVolumeSpecName: "host") pod "1039fc26-1c19-45ff-b2a1-92bb39769778" (UID: "1039fc26-1c19-45ff-b2a1-92bb39769778"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:32:55 crc kubenswrapper[4747]: I1205 23:32:55.793862 4747 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1039fc26-1c19-45ff-b2a1-92bb39769778-host\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:55 crc kubenswrapper[4747]: I1205 23:32:55.817882 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1039fc26-1c19-45ff-b2a1-92bb39769778-kube-api-access-ks95g" (OuterVolumeSpecName: "kube-api-access-ks95g") pod "1039fc26-1c19-45ff-b2a1-92bb39769778" (UID: "1039fc26-1c19-45ff-b2a1-92bb39769778"). InnerVolumeSpecName "kube-api-access-ks95g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:32:55 crc kubenswrapper[4747]: I1205 23:32:55.854921 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1039fc26-1c19-45ff-b2a1-92bb39769778" path="/var/lib/kubelet/pods/1039fc26-1c19-45ff-b2a1-92bb39769778/volumes" Dec 05 23:32:55 crc kubenswrapper[4747]: I1205 23:32:55.896191 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks95g\" (UniqueName: \"kubernetes.io/projected/1039fc26-1c19-45ff-b2a1-92bb39769778-kube-api-access-ks95g\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:56 crc kubenswrapper[4747]: I1205 23:32:56.275215 4747 scope.go:117] "RemoveContainer" containerID="bb453767569d6252587eb0d27c9f725ba8f4cb817c8226ecfaecbf1a4a7a76b4" Dec 05 23:32:56 crc kubenswrapper[4747]: I1205 23:32:56.275301 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kjlqd/crc-debug-rzq6v" Dec 05 23:32:56 crc kubenswrapper[4747]: I1205 23:32:56.887702 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kjlqd/crc-debug-cv4m5"] Dec 05 23:32:56 crc kubenswrapper[4747]: E1205 23:32:56.888501 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01606703-d9f6-4680-8516-9f68cdfa2145" containerName="extract-utilities" Dec 05 23:32:56 crc kubenswrapper[4747]: I1205 23:32:56.888519 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="01606703-d9f6-4680-8516-9f68cdfa2145" containerName="extract-utilities" Dec 05 23:32:56 crc kubenswrapper[4747]: E1205 23:32:56.888546 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01606703-d9f6-4680-8516-9f68cdfa2145" containerName="registry-server" Dec 05 23:32:56 crc kubenswrapper[4747]: I1205 23:32:56.888554 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="01606703-d9f6-4680-8516-9f68cdfa2145" containerName="registry-server" Dec 05 23:32:56 crc kubenswrapper[4747]: E1205 23:32:56.888605 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1039fc26-1c19-45ff-b2a1-92bb39769778" containerName="container-00" Dec 05 23:32:56 crc kubenswrapper[4747]: I1205 23:32:56.888615 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1039fc26-1c19-45ff-b2a1-92bb39769778" containerName="container-00" Dec 05 23:32:56 crc kubenswrapper[4747]: E1205 23:32:56.888635 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01606703-d9f6-4680-8516-9f68cdfa2145" containerName="extract-content" Dec 05 23:32:56 crc kubenswrapper[4747]: I1205 23:32:56.888644 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="01606703-d9f6-4680-8516-9f68cdfa2145" containerName="extract-content" Dec 05 23:32:56 crc kubenswrapper[4747]: I1205 23:32:56.888890 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="01606703-d9f6-4680-8516-9f68cdfa2145" containerName="registry-server" Dec 05 23:32:56 crc kubenswrapper[4747]: I1205 23:32:56.888923 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1039fc26-1c19-45ff-b2a1-92bb39769778" containerName="container-00" Dec 05 23:32:56 crc kubenswrapper[4747]: I1205 23:32:56.889866 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kjlqd/crc-debug-cv4m5" Dec 05 23:32:57 crc kubenswrapper[4747]: I1205 23:32:57.021294 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vls7\" (UniqueName: \"kubernetes.io/projected/571e8023-1f20-4465-81b6-c4d62882ac49-kube-api-access-8vls7\") pod \"crc-debug-cv4m5\" (UID: \"571e8023-1f20-4465-81b6-c4d62882ac49\") " pod="openshift-must-gather-kjlqd/crc-debug-cv4m5" Dec 05 23:32:57 crc kubenswrapper[4747]: I1205 23:32:57.021382 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/571e8023-1f20-4465-81b6-c4d62882ac49-host\") pod \"crc-debug-cv4m5\" (UID: \"571e8023-1f20-4465-81b6-c4d62882ac49\") " pod="openshift-must-gather-kjlqd/crc-debug-cv4m5" Dec 05 23:32:57 crc kubenswrapper[4747]: I1205 23:32:57.123232 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/571e8023-1f20-4465-81b6-c4d62882ac49-host\") pod \"crc-debug-cv4m5\" (UID: \"571e8023-1f20-4465-81b6-c4d62882ac49\") " pod="openshift-must-gather-kjlqd/crc-debug-cv4m5" Dec 05 23:32:57 crc kubenswrapper[4747]: I1205 23:32:57.123382 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/571e8023-1f20-4465-81b6-c4d62882ac49-host\") pod \"crc-debug-cv4m5\" (UID: \"571e8023-1f20-4465-81b6-c4d62882ac49\") " pod="openshift-must-gather-kjlqd/crc-debug-cv4m5" Dec 05 23:32:57 crc kubenswrapper[4747]: I1205 23:32:57.123910 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vls7\" (UniqueName: \"kubernetes.io/projected/571e8023-1f20-4465-81b6-c4d62882ac49-kube-api-access-8vls7\") pod \"crc-debug-cv4m5\" (UID: \"571e8023-1f20-4465-81b6-c4d62882ac49\") " pod="openshift-must-gather-kjlqd/crc-debug-cv4m5" Dec 05 23:32:57 crc kubenswrapper[4747]: I1205 23:32:57.142854 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vls7\" (UniqueName: \"kubernetes.io/projected/571e8023-1f20-4465-81b6-c4d62882ac49-kube-api-access-8vls7\") pod \"crc-debug-cv4m5\" (UID: \"571e8023-1f20-4465-81b6-c4d62882ac49\") " pod="openshift-must-gather-kjlqd/crc-debug-cv4m5" Dec 05 23:32:57 crc kubenswrapper[4747]: I1205 23:32:57.206751 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kjlqd/crc-debug-cv4m5" Dec 05 23:32:57 crc kubenswrapper[4747]: I1205 23:32:57.307379 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjlqd/crc-debug-cv4m5" event={"ID":"571e8023-1f20-4465-81b6-c4d62882ac49","Type":"ContainerStarted","Data":"63fb804e46734c5acf5c6ed0547e79499942974a4727fcf57b69db0558b18780"} Dec 05 23:32:58 crc kubenswrapper[4747]: I1205 23:32:58.316697 4747 generic.go:334] "Generic (PLEG): container finished" podID="571e8023-1f20-4465-81b6-c4d62882ac49" containerID="de77907fea1b5b7b4c990bb7edba1078103e861404e871bd7490ce0a13b3d73c" exitCode=0 Dec 05 23:32:58 crc kubenswrapper[4747]: I1205 23:32:58.316795 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjlqd/crc-debug-cv4m5" event={"ID":"571e8023-1f20-4465-81b6-c4d62882ac49","Type":"ContainerDied","Data":"de77907fea1b5b7b4c990bb7edba1078103e861404e871bd7490ce0a13b3d73c"} Dec 05 23:32:58 crc kubenswrapper[4747]: I1205 23:32:58.758693 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kjlqd/crc-debug-cv4m5"] Dec 05 23:32:58 crc kubenswrapper[4747]: I1205 23:32:58.770476 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kjlqd/crc-debug-cv4m5"] Dec 05 23:32:59 crc kubenswrapper[4747]: I1205 23:32:59.482620 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjlqd/crc-debug-cv4m5" Dec 05 23:32:59 crc kubenswrapper[4747]: I1205 23:32:59.573000 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/571e8023-1f20-4465-81b6-c4d62882ac49-host\") pod \"571e8023-1f20-4465-81b6-c4d62882ac49\" (UID: \"571e8023-1f20-4465-81b6-c4d62882ac49\") " Dec 05 23:32:59 crc kubenswrapper[4747]: I1205 23:32:59.573454 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vls7\" (UniqueName: \"kubernetes.io/projected/571e8023-1f20-4465-81b6-c4d62882ac49-kube-api-access-8vls7\") pod \"571e8023-1f20-4465-81b6-c4d62882ac49\" (UID: \"571e8023-1f20-4465-81b6-c4d62882ac49\") " Dec 05 23:32:59 crc kubenswrapper[4747]: I1205 23:32:59.573127 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571e8023-1f20-4465-81b6-c4d62882ac49-host" (OuterVolumeSpecName: "host") pod "571e8023-1f20-4465-81b6-c4d62882ac49" (UID: "571e8023-1f20-4465-81b6-c4d62882ac49"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:32:59 crc kubenswrapper[4747]: I1205 23:32:59.574057 4747 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/571e8023-1f20-4465-81b6-c4d62882ac49-host\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:59 crc kubenswrapper[4747]: I1205 23:32:59.578974 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/571e8023-1f20-4465-81b6-c4d62882ac49-kube-api-access-8vls7" (OuterVolumeSpecName: "kube-api-access-8vls7") pod "571e8023-1f20-4465-81b6-c4d62882ac49" (UID: "571e8023-1f20-4465-81b6-c4d62882ac49"). InnerVolumeSpecName "kube-api-access-8vls7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:32:59 crc kubenswrapper[4747]: I1205 23:32:59.675756 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vls7\" (UniqueName: \"kubernetes.io/projected/571e8023-1f20-4465-81b6-c4d62882ac49-kube-api-access-8vls7\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:59 crc kubenswrapper[4747]: I1205 23:32:59.852654 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="571e8023-1f20-4465-81b6-c4d62882ac49" path="/var/lib/kubelet/pods/571e8023-1f20-4465-81b6-c4d62882ac49/volumes" Dec 05 23:32:59 crc kubenswrapper[4747]: I1205 23:32:59.960113 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kjlqd/crc-debug-l99bb"] Dec 05 23:32:59 crc kubenswrapper[4747]: E1205 23:32:59.960604 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571e8023-1f20-4465-81b6-c4d62882ac49" containerName="container-00" Dec 05 23:32:59 crc kubenswrapper[4747]: I1205 23:32:59.960624 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="571e8023-1f20-4465-81b6-c4d62882ac49" containerName="container-00" Dec 05 23:32:59 crc kubenswrapper[4747]: I1205 23:32:59.960905 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="571e8023-1f20-4465-81b6-c4d62882ac49" containerName="container-00" Dec 05 23:32:59 crc kubenswrapper[4747]: I1205 23:32:59.961620 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjlqd/crc-debug-l99bb" Dec 05 23:33:00 crc kubenswrapper[4747]: I1205 23:33:00.084389 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whgx2\" (UniqueName: \"kubernetes.io/projected/4393da1e-cc48-4993-8d07-b8dd15dbcec6-kube-api-access-whgx2\") pod \"crc-debug-l99bb\" (UID: \"4393da1e-cc48-4993-8d07-b8dd15dbcec6\") " pod="openshift-must-gather-kjlqd/crc-debug-l99bb" Dec 05 23:33:00 crc kubenswrapper[4747]: I1205 23:33:00.084485 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4393da1e-cc48-4993-8d07-b8dd15dbcec6-host\") pod \"crc-debug-l99bb\" (UID: \"4393da1e-cc48-4993-8d07-b8dd15dbcec6\") " pod="openshift-must-gather-kjlqd/crc-debug-l99bb" Dec 05 23:33:00 crc kubenswrapper[4747]: I1205 23:33:00.189106 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whgx2\" (UniqueName: \"kubernetes.io/projected/4393da1e-cc48-4993-8d07-b8dd15dbcec6-kube-api-access-whgx2\") pod \"crc-debug-l99bb\" (UID: \"4393da1e-cc48-4993-8d07-b8dd15dbcec6\") " pod="openshift-must-gather-kjlqd/crc-debug-l99bb" Dec 05 23:33:00 crc kubenswrapper[4747]: I1205 23:33:00.189316 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4393da1e-cc48-4993-8d07-b8dd15dbcec6-host\") pod \"crc-debug-l99bb\" (UID: \"4393da1e-cc48-4993-8d07-b8dd15dbcec6\") " pod="openshift-must-gather-kjlqd/crc-debug-l99bb" Dec 05 23:33:00 crc kubenswrapper[4747]: I1205 23:33:00.189976 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4393da1e-cc48-4993-8d07-b8dd15dbcec6-host\") pod \"crc-debug-l99bb\" (UID: \"4393da1e-cc48-4993-8d07-b8dd15dbcec6\") " pod="openshift-must-gather-kjlqd/crc-debug-l99bb" Dec 05 23:33:00 crc kubenswrapper[4747]: I1205 23:33:00.215254 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-whgx2\" (UniqueName: \"kubernetes.io/projected/4393da1e-cc48-4993-8d07-b8dd15dbcec6-kube-api-access-whgx2\") pod \"crc-debug-l99bb\" (UID: \"4393da1e-cc48-4993-8d07-b8dd15dbcec6\") " pod="openshift-must-gather-kjlqd/crc-debug-l99bb" Dec 05 23:33:00 crc kubenswrapper[4747]: I1205 23:33:00.280348 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjlqd/crc-debug-l99bb" Dec 05 23:33:00 crc kubenswrapper[4747]: W1205 23:33:00.310267 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4393da1e_cc48_4993_8d07_b8dd15dbcec6.slice/crio-36f00e2e90659196ae847cdb05984e5e75cb0ec5f87389bde4ef05b427709859 WatchSource:0}: Error finding container 36f00e2e90659196ae847cdb05984e5e75cb0ec5f87389bde4ef05b427709859: Status 404 returned error can't find the container with id 36f00e2e90659196ae847cdb05984e5e75cb0ec5f87389bde4ef05b427709859 Dec 05 23:33:00 crc kubenswrapper[4747]: I1205 23:33:00.340130 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjlqd/crc-debug-cv4m5" Dec 05 23:33:00 crc kubenswrapper[4747]: I1205 23:33:00.340138 4747 scope.go:117] "RemoveContainer" containerID="de77907fea1b5b7b4c990bb7edba1078103e861404e871bd7490ce0a13b3d73c" Dec 05 23:33:00 crc kubenswrapper[4747]: I1205 23:33:00.343398 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjlqd/crc-debug-l99bb" event={"ID":"4393da1e-cc48-4993-8d07-b8dd15dbcec6","Type":"ContainerStarted","Data":"36f00e2e90659196ae847cdb05984e5e75cb0ec5f87389bde4ef05b427709859"} Dec 05 23:33:01 crc kubenswrapper[4747]: I1205 23:33:01.354474 4747 generic.go:334] "Generic (PLEG): container finished" podID="4393da1e-cc48-4993-8d07-b8dd15dbcec6" containerID="080f261e3afaacaebd24a7aa83ace6f2997b0cd6d0d8d25c6b7e008f3569d480" exitCode=0 Dec 05 23:33:01 crc kubenswrapper[4747]: I1205 23:33:01.354637 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjlqd/crc-debug-l99bb" event={"ID":"4393da1e-cc48-4993-8d07-b8dd15dbcec6","Type":"ContainerDied","Data":"080f261e3afaacaebd24a7aa83ace6f2997b0cd6d0d8d25c6b7e008f3569d480"} Dec 05 23:33:01 crc kubenswrapper[4747]: I1205 23:33:01.400084 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kjlqd/crc-debug-l99bb"] Dec 05 23:33:01 crc kubenswrapper[4747]: I1205 23:33:01.410547 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kjlqd/crc-debug-l99bb"] Dec 05 23:33:02 crc kubenswrapper[4747]: I1205 23:33:02.498481 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjlqd/crc-debug-l99bb" Dec 05 23:33:02 crc kubenswrapper[4747]: I1205 23:33:02.640480 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4393da1e-cc48-4993-8d07-b8dd15dbcec6-host\") pod \"4393da1e-cc48-4993-8d07-b8dd15dbcec6\" (UID: \"4393da1e-cc48-4993-8d07-b8dd15dbcec6\") " Dec 05 23:33:02 crc kubenswrapper[4747]: I1205 23:33:02.640558 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4393da1e-cc48-4993-8d07-b8dd15dbcec6-host" (OuterVolumeSpecName: "host") pod "4393da1e-cc48-4993-8d07-b8dd15dbcec6" (UID: "4393da1e-cc48-4993-8d07-b8dd15dbcec6"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:33:02 crc kubenswrapper[4747]: I1205 23:33:02.640701 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whgx2\" (UniqueName: \"kubernetes.io/projected/4393da1e-cc48-4993-8d07-b8dd15dbcec6-kube-api-access-whgx2\") pod \"4393da1e-cc48-4993-8d07-b8dd15dbcec6\" (UID: \"4393da1e-cc48-4993-8d07-b8dd15dbcec6\") " Dec 05 23:33:02 crc kubenswrapper[4747]: I1205 23:33:02.641294 4747 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4393da1e-cc48-4993-8d07-b8dd15dbcec6-host\") on node \"crc\" DevicePath \"\"" Dec 05 23:33:02 crc kubenswrapper[4747]: I1205 23:33:02.645981 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4393da1e-cc48-4993-8d07-b8dd15dbcec6-kube-api-access-whgx2" (OuterVolumeSpecName: "kube-api-access-whgx2") pod "4393da1e-cc48-4993-8d07-b8dd15dbcec6" (UID: "4393da1e-cc48-4993-8d07-b8dd15dbcec6"). InnerVolumeSpecName "kube-api-access-whgx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:33:02 crc kubenswrapper[4747]: I1205 23:33:02.743642 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whgx2\" (UniqueName: \"kubernetes.io/projected/4393da1e-cc48-4993-8d07-b8dd15dbcec6-kube-api-access-whgx2\") on node \"crc\" DevicePath \"\"" Dec 05 23:33:03 crc kubenswrapper[4747]: I1205 23:33:03.379438 4747 scope.go:117] "RemoveContainer" containerID="080f261e3afaacaebd24a7aa83ace6f2997b0cd6d0d8d25c6b7e008f3569d480" Dec 05 23:33:03 crc kubenswrapper[4747]: I1205 23:33:03.379499 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjlqd/crc-debug-l99bb" Dec 05 23:33:03 crc kubenswrapper[4747]: I1205 23:33:03.863306 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4393da1e-cc48-4993-8d07-b8dd15dbcec6" path="/var/lib/kubelet/pods/4393da1e-cc48-4993-8d07-b8dd15dbcec6/volumes" Dec 05 23:33:06 crc kubenswrapper[4747]: I1205 23:33:06.191425 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6xqgx"] Dec 05 23:33:06 crc kubenswrapper[4747]: E1205 23:33:06.194085 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4393da1e-cc48-4993-8d07-b8dd15dbcec6" containerName="container-00" Dec 05 23:33:06 crc kubenswrapper[4747]: I1205 23:33:06.194110 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4393da1e-cc48-4993-8d07-b8dd15dbcec6" containerName="container-00" Dec 05 23:33:06 crc kubenswrapper[4747]: I1205 23:33:06.194353 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4393da1e-cc48-4993-8d07-b8dd15dbcec6" containerName="container-00" Dec 05 23:33:06 crc kubenswrapper[4747]: I1205 23:33:06.195861 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:06 crc kubenswrapper[4747]: I1205 23:33:06.208432 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6xqgx"] Dec 05 23:33:06 crc kubenswrapper[4747]: I1205 23:33:06.328318 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-utilities\") pod \"certified-operators-6xqgx\" (UID: \"ebf66e23-3c10-4de6-84ae-8fc099eaf76e\") " pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:06 crc kubenswrapper[4747]: I1205 23:33:06.328376 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-catalog-content\") pod \"certified-operators-6xqgx\" (UID: \"ebf66e23-3c10-4de6-84ae-8fc099eaf76e\") " pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:06 crc kubenswrapper[4747]: I1205 23:33:06.328428 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwcc8\" (UniqueName: \"kubernetes.io/projected/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-kube-api-access-hwcc8\") pod \"certified-operators-6xqgx\" (UID: \"ebf66e23-3c10-4de6-84ae-8fc099eaf76e\") " pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:06 crc kubenswrapper[4747]: I1205 23:33:06.430242 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-utilities\") pod \"certified-operators-6xqgx\" (UID: \"ebf66e23-3c10-4de6-84ae-8fc099eaf76e\") " pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:06 crc kubenswrapper[4747]: I1205 23:33:06.430291 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-catalog-content\") pod \"certified-operators-6xqgx\" (UID: \"ebf66e23-3c10-4de6-84ae-8fc099eaf76e\") " pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:06 crc kubenswrapper[4747]: I1205 23:33:06.430342 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwcc8\" (UniqueName: \"kubernetes.io/projected/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-kube-api-access-hwcc8\") pod \"certified-operators-6xqgx\" (UID: \"ebf66e23-3c10-4de6-84ae-8fc099eaf76e\") " pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:06 crc kubenswrapper[4747]: I1205 23:33:06.430868 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-utilities\") pod \"certified-operators-6xqgx\" (UID: \"ebf66e23-3c10-4de6-84ae-8fc099eaf76e\") " pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:06 crc kubenswrapper[4747]: I1205 23:33:06.430895 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-catalog-content\") pod \"certified-operators-6xqgx\" (UID: \"ebf66e23-3c10-4de6-84ae-8fc099eaf76e\") " pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:06 crc kubenswrapper[4747]: I1205 23:33:06.453375 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hwcc8\" (UniqueName: \"kubernetes.io/projected/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-kube-api-access-hwcc8\") pod \"certified-operators-6xqgx\" (UID: \"ebf66e23-3c10-4de6-84ae-8fc099eaf76e\") " pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:06 crc kubenswrapper[4747]: I1205 23:33:06.516755 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:07 crc kubenswrapper[4747]: I1205 23:33:07.038241 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6xqgx"] Dec 05 23:33:07 crc kubenswrapper[4747]: W1205 23:33:07.491892 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebf66e23_3c10_4de6_84ae_8fc099eaf76e.slice/crio-5aede523099240b257211c75e17e21a4760fbdf094ef89122b997d8e40ac94bd WatchSource:0}: Error finding container 5aede523099240b257211c75e17e21a4760fbdf094ef89122b997d8e40ac94bd: Status 404 returned error can't find the container with id 5aede523099240b257211c75e17e21a4760fbdf094ef89122b997d8e40ac94bd Dec 05 23:33:08 crc kubenswrapper[4747]: I1205 23:33:08.440085 4747 generic.go:334] "Generic (PLEG): container finished" podID="ebf66e23-3c10-4de6-84ae-8fc099eaf76e" containerID="0a08e43e670482fc1245b9e55fd914dbbdc954b56b1bca64f4fc2f7b38c5b1b6" exitCode=0 Dec 05 23:33:08 crc kubenswrapper[4747]: I1205 23:33:08.440197 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6xqgx" event={"ID":"ebf66e23-3c10-4de6-84ae-8fc099eaf76e","Type":"ContainerDied","Data":"0a08e43e670482fc1245b9e55fd914dbbdc954b56b1bca64f4fc2f7b38c5b1b6"} Dec 05 23:33:08 crc kubenswrapper[4747]: I1205 23:33:08.440660 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6xqgx" event={"ID":"ebf66e23-3c10-4de6-84ae-8fc099eaf76e","Type":"ContainerStarted","Data":"5aede523099240b257211c75e17e21a4760fbdf094ef89122b997d8e40ac94bd"} Dec 05 23:33:09 crc kubenswrapper[4747]: I1205 23:33:09.450910 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6xqgx" event={"ID":"ebf66e23-3c10-4de6-84ae-8fc099eaf76e","Type":"ContainerStarted","Data":"06460801975f18568dfb0d19fe48b31e6ef052f658753cd13f7687e16d36ddbf"} Dec 05 23:33:10 crc kubenswrapper[4747]: I1205 23:33:10.460517 4747 generic.go:334] "Generic (PLEG): container finished" podID="ebf66e23-3c10-4de6-84ae-8fc099eaf76e" containerID="06460801975f18568dfb0d19fe48b31e6ef052f658753cd13f7687e16d36ddbf" exitCode=0 Dec 05 23:33:10 crc kubenswrapper[4747]: I1205 23:33:10.460625 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6xqgx" event={"ID":"ebf66e23-3c10-4de6-84ae-8fc099eaf76e","Type":"ContainerDied","Data":"06460801975f18568dfb0d19fe48b31e6ef052f658753cd13f7687e16d36ddbf"} Dec 05 23:33:12 crc kubenswrapper[4747]: I1205 23:33:12.492456 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6xqgx" event={"ID":"ebf66e23-3c10-4de6-84ae-8fc099eaf76e","Type":"ContainerStarted","Data":"d121486150e333b7b9c1a2f9a111b248fb8d15c1b8bbce6877b33f20540ddcac"} Dec 05 23:33:12 crc kubenswrapper[4747]: I1205 23:33:12.515067 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6xqgx" 
podStartSLOduration=3.2368300899999998 podStartE2EDuration="6.515038178s" podCreationTimestamp="2025-12-05 23:33:06 +0000 UTC" firstStartedPulling="2025-12-05 23:33:08.442696279 +0000 UTC m=+10258.910003777" lastFinishedPulling="2025-12-05 23:33:11.720904367 +0000 UTC m=+10262.188211865" observedRunningTime="2025-12-05 23:33:12.511318636 +0000 UTC m=+10262.978626144" watchObservedRunningTime="2025-12-05 23:33:12.515038178 +0000 UTC m=+10262.982345696" Dec 05 23:33:16 crc kubenswrapper[4747]: I1205 23:33:16.517266 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:16 crc kubenswrapper[4747]: I1205 23:33:16.517751 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:16 crc kubenswrapper[4747]: I1205 23:33:16.567096 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:17 crc kubenswrapper[4747]: I1205 23:33:17.612017 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:17 crc kubenswrapper[4747]: I1205 23:33:17.677782 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6xqgx"] Dec 05 23:33:19 crc kubenswrapper[4747]: I1205 23:33:19.556613 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6xqgx" podUID="ebf66e23-3c10-4de6-84ae-8fc099eaf76e" containerName="registry-server" containerID="cri-o://d121486150e333b7b9c1a2f9a111b248fb8d15c1b8bbce6877b33f20540ddcac" gracePeriod=2 Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.033111 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.126563 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-utilities\") pod \"ebf66e23-3c10-4de6-84ae-8fc099eaf76e\" (UID: \"ebf66e23-3c10-4de6-84ae-8fc099eaf76e\") " Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.127173 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwcc8\" (UniqueName: \"kubernetes.io/projected/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-kube-api-access-hwcc8\") pod \"ebf66e23-3c10-4de6-84ae-8fc099eaf76e\" (UID: \"ebf66e23-3c10-4de6-84ae-8fc099eaf76e\") " Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.127339 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-catalog-content\") pod \"ebf66e23-3c10-4de6-84ae-8fc099eaf76e\" (UID: \"ebf66e23-3c10-4de6-84ae-8fc099eaf76e\") " Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.127896 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-utilities" (OuterVolumeSpecName: "utilities") pod "ebf66e23-3c10-4de6-84ae-8fc099eaf76e" (UID: "ebf66e23-3c10-4de6-84ae-8fc099eaf76e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.128395 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.133261 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-kube-api-access-hwcc8" (OuterVolumeSpecName: "kube-api-access-hwcc8") pod "ebf66e23-3c10-4de6-84ae-8fc099eaf76e" (UID: "ebf66e23-3c10-4de6-84ae-8fc099eaf76e"). InnerVolumeSpecName "kube-api-access-hwcc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.174055 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebf66e23-3c10-4de6-84ae-8fc099eaf76e" (UID: "ebf66e23-3c10-4de6-84ae-8fc099eaf76e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.231091 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwcc8\" (UniqueName: \"kubernetes.io/projected/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-kube-api-access-hwcc8\") on node \"crc\" DevicePath \"\"" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.231151 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf66e23-3c10-4de6-84ae-8fc099eaf76e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.568659 4747 generic.go:334] "Generic (PLEG): container finished" podID="ebf66e23-3c10-4de6-84ae-8fc099eaf76e" containerID="d121486150e333b7b9c1a2f9a111b248fb8d15c1b8bbce6877b33f20540ddcac" exitCode=0 Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.568699 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6xqgx" event={"ID":"ebf66e23-3c10-4de6-84ae-8fc099eaf76e","Type":"ContainerDied","Data":"d121486150e333b7b9c1a2f9a111b248fb8d15c1b8bbce6877b33f20540ddcac"} Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.568746 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6xqgx" event={"ID":"ebf66e23-3c10-4de6-84ae-8fc099eaf76e","Type":"ContainerDied","Data":"5aede523099240b257211c75e17e21a4760fbdf094ef89122b997d8e40ac94bd"} Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.568771 4747 scope.go:117] "RemoveContainer" containerID="d121486150e333b7b9c1a2f9a111b248fb8d15c1b8bbce6877b33f20540ddcac" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.569724 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6xqgx" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.602784 4747 scope.go:117] "RemoveContainer" containerID="06460801975f18568dfb0d19fe48b31e6ef052f658753cd13f7687e16d36ddbf" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.607981 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6xqgx"] Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.622632 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6xqgx"] Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.626068 4747 scope.go:117] "RemoveContainer" containerID="0a08e43e670482fc1245b9e55fd914dbbdc954b56b1bca64f4fc2f7b38c5b1b6" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.689008 4747 scope.go:117] "RemoveContainer" containerID="d121486150e333b7b9c1a2f9a111b248fb8d15c1b8bbce6877b33f20540ddcac" Dec 05 23:33:20 crc kubenswrapper[4747]: E1205 23:33:20.696987 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d121486150e333b7b9c1a2f9a111b248fb8d15c1b8bbce6877b33f20540ddcac\": container with ID starting with d121486150e333b7b9c1a2f9a111b248fb8d15c1b8bbce6877b33f20540ddcac not found: ID does not exist" containerID="d121486150e333b7b9c1a2f9a111b248fb8d15c1b8bbce6877b33f20540ddcac" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.697035 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d121486150e333b7b9c1a2f9a111b248fb8d15c1b8bbce6877b33f20540ddcac"} err="failed to get container status \"d121486150e333b7b9c1a2f9a111b248fb8d15c1b8bbce6877b33f20540ddcac\": rpc error: code = NotFound desc = could not find container \"d121486150e333b7b9c1a2f9a111b248fb8d15c1b8bbce6877b33f20540ddcac\": container with ID starting with d121486150e333b7b9c1a2f9a111b248fb8d15c1b8bbce6877b33f20540ddcac not found: ID does not exist" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.697064 4747 scope.go:117] "RemoveContainer" containerID="06460801975f18568dfb0d19fe48b31e6ef052f658753cd13f7687e16d36ddbf" Dec 05 23:33:20 crc kubenswrapper[4747]: E1205 23:33:20.697554 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06460801975f18568dfb0d19fe48b31e6ef052f658753cd13f7687e16d36ddbf\": container with ID starting with 06460801975f18568dfb0d19fe48b31e6ef052f658753cd13f7687e16d36ddbf not found: ID does not exist" containerID="06460801975f18568dfb0d19fe48b31e6ef052f658753cd13f7687e16d36ddbf" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.697644 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06460801975f18568dfb0d19fe48b31e6ef052f658753cd13f7687e16d36ddbf"} err="failed to get container status \"06460801975f18568dfb0d19fe48b31e6ef052f658753cd13f7687e16d36ddbf\": rpc error: code = NotFound desc = could not find container \"06460801975f18568dfb0d19fe48b31e6ef052f658753cd13f7687e16d36ddbf\": container with ID starting with 06460801975f18568dfb0d19fe48b31e6ef052f658753cd13f7687e16d36ddbf not found: ID does not exist" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.697673 4747 scope.go:117] "RemoveContainer" containerID="0a08e43e670482fc1245b9e55fd914dbbdc954b56b1bca64f4fc2f7b38c5b1b6" Dec 05 23:33:20 crc kubenswrapper[4747]: E1205 23:33:20.697973 4747 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0a08e43e670482fc1245b9e55fd914dbbdc954b56b1bca64f4fc2f7b38c5b1b6\": container with ID starting with 0a08e43e670482fc1245b9e55fd914dbbdc954b56b1bca64f4fc2f7b38c5b1b6 not found: ID does not exist" containerID="0a08e43e670482fc1245b9e55fd914dbbdc954b56b1bca64f4fc2f7b38c5b1b6" Dec 05 23:33:20 crc kubenswrapper[4747]: I1205 23:33:20.698014 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a08e43e670482fc1245b9e55fd914dbbdc954b56b1bca64f4fc2f7b38c5b1b6"} err="failed to get container status \"0a08e43e670482fc1245b9e55fd914dbbdc954b56b1bca64f4fc2f7b38c5b1b6\": rpc error: code = NotFound desc = could not find container \"0a08e43e670482fc1245b9e55fd914dbbdc954b56b1bca64f4fc2f7b38c5b1b6\": container with ID starting with 0a08e43e670482fc1245b9e55fd914dbbdc954b56b1bca64f4fc2f7b38c5b1b6 not found: ID does not exist" Dec 05 23:33:21 crc kubenswrapper[4747]: I1205 23:33:21.854083 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebf66e23-3c10-4de6-84ae-8fc099eaf76e" path="/var/lib/kubelet/pods/ebf66e23-3c10-4de6-84ae-8fc099eaf76e/volumes" Dec 05 23:34:06 crc kubenswrapper[4747]: I1205 23:34:06.222434 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:34:06 crc kubenswrapper[4747]: I1205 23:34:06.223189 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:34:07 crc kubenswrapper[4747]: I1205 23:34:07.904256 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qnk7d"] Dec 05 23:34:07 crc kubenswrapper[4747]: E1205 23:34:07.905178 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf66e23-3c10-4de6-84ae-8fc099eaf76e" containerName="registry-server" Dec 05 23:34:07 crc kubenswrapper[4747]: I1205 23:34:07.905196 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf66e23-3c10-4de6-84ae-8fc099eaf76e" containerName="registry-server" Dec 05 23:34:07 crc kubenswrapper[4747]: E1205 23:34:07.905249 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf66e23-3c10-4de6-84ae-8fc099eaf76e" containerName="extract-content" Dec 05 23:34:07 crc kubenswrapper[4747]: I1205 23:34:07.905257 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf66e23-3c10-4de6-84ae-8fc099eaf76e" containerName="extract-content" Dec 05 23:34:07 crc kubenswrapper[4747]: E1205 23:34:07.905279 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf66e23-3c10-4de6-84ae-8fc099eaf76e" containerName="extract-utilities" Dec 05 23:34:07 crc kubenswrapper[4747]: I1205 23:34:07.905287 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf66e23-3c10-4de6-84ae-8fc099eaf76e" containerName="extract-utilities" Dec 05 23:34:07 crc kubenswrapper[4747]: I1205 23:34:07.905568 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf66e23-3c10-4de6-84ae-8fc099eaf76e" containerName="registry-server" Dec 05 23:34:07 crc kubenswrapper[4747]: I1205 
23:34:07.907528 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:07 crc kubenswrapper[4747]: I1205 23:34:07.913891 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qnk7d"] Dec 05 23:34:08 crc kubenswrapper[4747]: I1205 23:34:08.081879 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d24ff337-3ee6-4008-af98-a95231f6dfa9-utilities\") pod \"redhat-operators-qnk7d\" (UID: \"d24ff337-3ee6-4008-af98-a95231f6dfa9\") " pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:08 crc kubenswrapper[4747]: I1205 23:34:08.082002 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmq79\" (UniqueName: \"kubernetes.io/projected/d24ff337-3ee6-4008-af98-a95231f6dfa9-kube-api-access-vmq79\") pod \"redhat-operators-qnk7d\" (UID: \"d24ff337-3ee6-4008-af98-a95231f6dfa9\") " pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:08 crc kubenswrapper[4747]: I1205 23:34:08.082148 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d24ff337-3ee6-4008-af98-a95231f6dfa9-catalog-content\") pod \"redhat-operators-qnk7d\" (UID: \"d24ff337-3ee6-4008-af98-a95231f6dfa9\") " pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:08 crc kubenswrapper[4747]: I1205 23:34:08.184162 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d24ff337-3ee6-4008-af98-a95231f6dfa9-utilities\") pod \"redhat-operators-qnk7d\" (UID: \"d24ff337-3ee6-4008-af98-a95231f6dfa9\") " pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:08 crc kubenswrapper[4747]: I1205 23:34:08.184271 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmq79\" (UniqueName: \"kubernetes.io/projected/d24ff337-3ee6-4008-af98-a95231f6dfa9-kube-api-access-vmq79\") pod \"redhat-operators-qnk7d\" (UID: \"d24ff337-3ee6-4008-af98-a95231f6dfa9\") " pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:08 crc kubenswrapper[4747]: I1205 23:34:08.184403 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d24ff337-3ee6-4008-af98-a95231f6dfa9-catalog-content\") pod \"redhat-operators-qnk7d\" (UID: \"d24ff337-3ee6-4008-af98-a95231f6dfa9\") " pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:08 crc kubenswrapper[4747]: I1205 23:34:08.184880 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d24ff337-3ee6-4008-af98-a95231f6dfa9-catalog-content\") pod \"redhat-operators-qnk7d\" (UID: \"d24ff337-3ee6-4008-af98-a95231f6dfa9\") " pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:08 crc kubenswrapper[4747]: I1205 23:34:08.184878 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d24ff337-3ee6-4008-af98-a95231f6dfa9-utilities\") pod \"redhat-operators-qnk7d\" (UID: \"d24ff337-3ee6-4008-af98-a95231f6dfa9\") " pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:08 crc kubenswrapper[4747]: I1205 23:34:08.207904 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmq79\" (UniqueName: \"kubernetes.io/projected/d24ff337-3ee6-4008-af98-a95231f6dfa9-kube-api-access-vmq79\") pod \"redhat-operators-qnk7d\" (UID: \"d24ff337-3ee6-4008-af98-a95231f6dfa9\") " pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:08 crc kubenswrapper[4747]: I1205 23:34:08.239607 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:08 crc kubenswrapper[4747]: I1205 23:34:08.771302 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qnk7d"] Dec 05 23:34:09 crc kubenswrapper[4747]: I1205 23:34:09.116660 4747 generic.go:334] "Generic (PLEG): container finished" podID="d24ff337-3ee6-4008-af98-a95231f6dfa9" containerID="74aaebde45b9860b9429cd93d6c156e63a34272ea35ed2b0b0035ba3c8296db9" exitCode=0 Dec 05 23:34:09 crc kubenswrapper[4747]: I1205 23:34:09.116716 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnk7d" event={"ID":"d24ff337-3ee6-4008-af98-a95231f6dfa9","Type":"ContainerDied","Data":"74aaebde45b9860b9429cd93d6c156e63a34272ea35ed2b0b0035ba3c8296db9"} Dec 05 23:34:09 crc kubenswrapper[4747]: I1205 23:34:09.116764 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnk7d" event={"ID":"d24ff337-3ee6-4008-af98-a95231f6dfa9","Type":"ContainerStarted","Data":"e40d64a2f47012becb55179a0a805860c28ce5e8113200706a36d75755524c8c"} Dec 05 23:34:09 crc kubenswrapper[4747]: I1205 23:34:09.118899 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 23:34:10 crc kubenswrapper[4747]: I1205 23:34:10.133047 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnk7d" event={"ID":"d24ff337-3ee6-4008-af98-a95231f6dfa9","Type":"ContainerStarted","Data":"a9398736e1292c7cda43516a7dbe5a214f16b9f499b7d7be640dc406a7756b43"} Dec 05 23:34:15 crc kubenswrapper[4747]: I1205 23:34:15.185551 4747 generic.go:334] "Generic (PLEG): container finished" podID="d24ff337-3ee6-4008-af98-a95231f6dfa9" containerID="a9398736e1292c7cda43516a7dbe5a214f16b9f499b7d7be640dc406a7756b43" exitCode=0 Dec 05 23:34:15 crc kubenswrapper[4747]: I1205 23:34:15.186436 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnk7d" event={"ID":"d24ff337-3ee6-4008-af98-a95231f6dfa9","Type":"ContainerDied","Data":"a9398736e1292c7cda43516a7dbe5a214f16b9f499b7d7be640dc406a7756b43"} Dec 05 23:34:17 crc kubenswrapper[4747]: I1205 23:34:17.204313 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnk7d" event={"ID":"d24ff337-3ee6-4008-af98-a95231f6dfa9","Type":"ContainerStarted","Data":"b9ad412b1393c00cab40550691b334327f9e2e6a2ed16d93ddb0103b8b89fa02"} Dec 05 23:34:17 crc kubenswrapper[4747]: I1205 23:34:17.223714 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qnk7d" podStartSLOduration=3.158593206 podStartE2EDuration="10.223697158s" podCreationTimestamp="2025-12-05 23:34:07 +0000 UTC" firstStartedPulling="2025-12-05 23:34:09.118612898 +0000 UTC m=+10319.585920386" lastFinishedPulling="2025-12-05 23:34:16.18371683 +0000 UTC m=+10326.651024338" observedRunningTime="2025-12-05 23:34:17.221349219 +0000 UTC m=+10327.688656717" watchObservedRunningTime="2025-12-05 
23:34:17.223697158 +0000 UTC m=+10327.691004646" Dec 05 23:34:18 crc kubenswrapper[4747]: I1205 23:34:18.240758 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:18 crc kubenswrapper[4747]: I1205 23:34:18.241119 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:19 crc kubenswrapper[4747]: I1205 23:34:19.320152 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qnk7d" podUID="d24ff337-3ee6-4008-af98-a95231f6dfa9" containerName="registry-server" probeResult="failure" output=< Dec 05 23:34:19 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Dec 05 23:34:19 crc kubenswrapper[4747]: > Dec 05 23:34:28 crc kubenswrapper[4747]: I1205 23:34:28.286821 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:28 crc kubenswrapper[4747]: I1205 23:34:28.336113 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:28 crc kubenswrapper[4747]: I1205 23:34:28.526702 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qnk7d"] Dec 05 23:34:29 crc kubenswrapper[4747]: I1205 23:34:29.318297 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qnk7d" podUID="d24ff337-3ee6-4008-af98-a95231f6dfa9" containerName="registry-server" containerID="cri-o://b9ad412b1393c00cab40550691b334327f9e2e6a2ed16d93ddb0103b8b89fa02" gracePeriod=2 Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.312265 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.338394 4747 generic.go:334] "Generic (PLEG): container finished" podID="d24ff337-3ee6-4008-af98-a95231f6dfa9" containerID="b9ad412b1393c00cab40550691b334327f9e2e6a2ed16d93ddb0103b8b89fa02" exitCode=0 Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.338443 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnk7d" event={"ID":"d24ff337-3ee6-4008-af98-a95231f6dfa9","Type":"ContainerDied","Data":"b9ad412b1393c00cab40550691b334327f9e2e6a2ed16d93ddb0103b8b89fa02"} Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.338509 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnk7d" event={"ID":"d24ff337-3ee6-4008-af98-a95231f6dfa9","Type":"ContainerDied","Data":"e40d64a2f47012becb55179a0a805860c28ce5e8113200706a36d75755524c8c"} Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.338533 4747 scope.go:117] "RemoveContainer" containerID="b9ad412b1393c00cab40550691b334327f9e2e6a2ed16d93ddb0103b8b89fa02" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.341097 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qnk7d" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.373461 4747 scope.go:117] "RemoveContainer" containerID="a9398736e1292c7cda43516a7dbe5a214f16b9f499b7d7be640dc406a7756b43" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.414275 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmq79\" (UniqueName: \"kubernetes.io/projected/d24ff337-3ee6-4008-af98-a95231f6dfa9-kube-api-access-vmq79\") pod \"d24ff337-3ee6-4008-af98-a95231f6dfa9\" (UID: \"d24ff337-3ee6-4008-af98-a95231f6dfa9\") " Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.414422 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d24ff337-3ee6-4008-af98-a95231f6dfa9-utilities\") pod \"d24ff337-3ee6-4008-af98-a95231f6dfa9\" (UID: \"d24ff337-3ee6-4008-af98-a95231f6dfa9\") " Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.414727 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d24ff337-3ee6-4008-af98-a95231f6dfa9-catalog-content\") pod \"d24ff337-3ee6-4008-af98-a95231f6dfa9\" (UID: \"d24ff337-3ee6-4008-af98-a95231f6dfa9\") " Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.416152 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d24ff337-3ee6-4008-af98-a95231f6dfa9-utilities" (OuterVolumeSpecName: "utilities") pod "d24ff337-3ee6-4008-af98-a95231f6dfa9" (UID: "d24ff337-3ee6-4008-af98-a95231f6dfa9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.422222 4747 scope.go:117] "RemoveContainer" containerID="74aaebde45b9860b9429cd93d6c156e63a34272ea35ed2b0b0035ba3c8296db9" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.425859 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d24ff337-3ee6-4008-af98-a95231f6dfa9-kube-api-access-vmq79" (OuterVolumeSpecName: "kube-api-access-vmq79") pod "d24ff337-3ee6-4008-af98-a95231f6dfa9" (UID: "d24ff337-3ee6-4008-af98-a95231f6dfa9"). InnerVolumeSpecName "kube-api-access-vmq79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.492935 4747 scope.go:117] "RemoveContainer" containerID="b9ad412b1393c00cab40550691b334327f9e2e6a2ed16d93ddb0103b8b89fa02" Dec 05 23:34:30 crc kubenswrapper[4747]: E1205 23:34:30.493363 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9ad412b1393c00cab40550691b334327f9e2e6a2ed16d93ddb0103b8b89fa02\": container with ID starting with b9ad412b1393c00cab40550691b334327f9e2e6a2ed16d93ddb0103b8b89fa02 not found: ID does not exist" containerID="b9ad412b1393c00cab40550691b334327f9e2e6a2ed16d93ddb0103b8b89fa02" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.493416 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ad412b1393c00cab40550691b334327f9e2e6a2ed16d93ddb0103b8b89fa02"} err="failed to get container status \"b9ad412b1393c00cab40550691b334327f9e2e6a2ed16d93ddb0103b8b89fa02\": rpc error: code = NotFound desc = could not find container \"b9ad412b1393c00cab40550691b334327f9e2e6a2ed16d93ddb0103b8b89fa02\": container with ID starting with b9ad412b1393c00cab40550691b334327f9e2e6a2ed16d93ddb0103b8b89fa02 not found: ID does not exist" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.493450 4747 scope.go:117] "RemoveContainer" containerID="a9398736e1292c7cda43516a7dbe5a214f16b9f499b7d7be640dc406a7756b43" Dec 05 23:34:30 crc kubenswrapper[4747]: E1205 23:34:30.494683 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9398736e1292c7cda43516a7dbe5a214f16b9f499b7d7be640dc406a7756b43\": container with ID starting with a9398736e1292c7cda43516a7dbe5a214f16b9f499b7d7be640dc406a7756b43 not found: ID does not exist" containerID="a9398736e1292c7cda43516a7dbe5a214f16b9f499b7d7be640dc406a7756b43" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.494736 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9398736e1292c7cda43516a7dbe5a214f16b9f499b7d7be640dc406a7756b43"} err="failed to get container status \"a9398736e1292c7cda43516a7dbe5a214f16b9f499b7d7be640dc406a7756b43\": rpc error: code = NotFound desc = could not find container \"a9398736e1292c7cda43516a7dbe5a214f16b9f499b7d7be640dc406a7756b43\": container with ID starting with a9398736e1292c7cda43516a7dbe5a214f16b9f499b7d7be640dc406a7756b43 not found: ID does not exist" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.494769 4747 scope.go:117] "RemoveContainer" containerID="74aaebde45b9860b9429cd93d6c156e63a34272ea35ed2b0b0035ba3c8296db9" Dec 05 23:34:30 crc kubenswrapper[4747]: E1205 23:34:30.495192 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74aaebde45b9860b9429cd93d6c156e63a34272ea35ed2b0b0035ba3c8296db9\": container with ID starting with 74aaebde45b9860b9429cd93d6c156e63a34272ea35ed2b0b0035ba3c8296db9 not found: ID does not exist" containerID="74aaebde45b9860b9429cd93d6c156e63a34272ea35ed2b0b0035ba3c8296db9" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.495221 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74aaebde45b9860b9429cd93d6c156e63a34272ea35ed2b0b0035ba3c8296db9"} err="failed to get container status \"74aaebde45b9860b9429cd93d6c156e63a34272ea35ed2b0b0035ba3c8296db9\": rpc error: code = NotFound desc = could not 
find container \"74aaebde45b9860b9429cd93d6c156e63a34272ea35ed2b0b0035ba3c8296db9\": container with ID starting with 74aaebde45b9860b9429cd93d6c156e63a34272ea35ed2b0b0035ba3c8296db9 not found: ID does not exist" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.516954 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmq79\" (UniqueName: \"kubernetes.io/projected/d24ff337-3ee6-4008-af98-a95231f6dfa9-kube-api-access-vmq79\") on node \"crc\" DevicePath \"\"" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.516991 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d24ff337-3ee6-4008-af98-a95231f6dfa9-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.555704 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d24ff337-3ee6-4008-af98-a95231f6dfa9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d24ff337-3ee6-4008-af98-a95231f6dfa9" (UID: "d24ff337-3ee6-4008-af98-a95231f6dfa9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.617978 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d24ff337-3ee6-4008-af98-a95231f6dfa9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.682174 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qnk7d"] Dec 05 23:34:30 crc kubenswrapper[4747]: I1205 23:34:30.696885 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qnk7d"] Dec 05 23:34:31 crc kubenswrapper[4747]: I1205 23:34:31.852477 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d24ff337-3ee6-4008-af98-a95231f6dfa9" path="/var/lib/kubelet/pods/d24ff337-3ee6-4008-af98-a95231f6dfa9/volumes" Dec 05 23:34:36 crc kubenswrapper[4747]: I1205 23:34:36.222107 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:34:36 crc kubenswrapper[4747]: I1205 23:34:36.222647 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:35:06 crc kubenswrapper[4747]: I1205 23:35:06.221672 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:35:06 crc kubenswrapper[4747]: I1205 23:35:06.222368 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 
05 23:35:06 crc kubenswrapper[4747]: I1205 23:35:06.222663 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 23:35:06 crc kubenswrapper[4747]: I1205 23:35:06.223394 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a40402e37b5afe2cc52d1df1c1a7f1070d0aacf27bed182b1f1e084040a254bf"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:35:06 crc kubenswrapper[4747]: I1205 23:35:06.223485 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://a40402e37b5afe2cc52d1df1c1a7f1070d0aacf27bed182b1f1e084040a254bf" gracePeriod=600 Dec 05 23:35:06 crc kubenswrapper[4747]: I1205 23:35:06.714237 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="a40402e37b5afe2cc52d1df1c1a7f1070d0aacf27bed182b1f1e084040a254bf" exitCode=0 Dec 05 23:35:06 crc kubenswrapper[4747]: I1205 23:35:06.714278 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"a40402e37b5afe2cc52d1df1c1a7f1070d0aacf27bed182b1f1e084040a254bf"} Dec 05 23:35:06 crc kubenswrapper[4747]: I1205 23:35:06.714598 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d"} Dec 05 23:35:06 crc kubenswrapper[4747]: I1205 23:35:06.714618 4747 scope.go:117] "RemoveContainer" containerID="5bc6c3a43ff5a16cee009c6196d755df41949636f94d8ae88a072ed5a2b3b78e" Dec 05 23:35:13 crc kubenswrapper[4747]: I1205 23:35:13.865695 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_adbdbcfa-768c-4314-ac05-643065b2c85b/init-config-reloader/0.log" Dec 05 23:35:14 crc kubenswrapper[4747]: I1205 23:35:14.097979 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_adbdbcfa-768c-4314-ac05-643065b2c85b/alertmanager/0.log" Dec 05 23:35:14 crc kubenswrapper[4747]: I1205 23:35:14.105424 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_adbdbcfa-768c-4314-ac05-643065b2c85b/init-config-reloader/0.log" Dec 05 23:35:14 crc kubenswrapper[4747]: I1205 23:35:14.162128 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_adbdbcfa-768c-4314-ac05-643065b2c85b/config-reloader/0.log" Dec 05 23:35:14 crc kubenswrapper[4747]: I1205 23:35:14.313250 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6/aodh-api/0.log" Dec 05 23:35:14 crc kubenswrapper[4747]: I1205 23:35:14.480786 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6/aodh-evaluator/0.log" Dec 05 23:35:14 crc kubenswrapper[4747]: I1205 23:35:14.655831 4747 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_aodh-0_dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6/aodh-listener/0.log" Dec 05 23:35:14 crc kubenswrapper[4747]: I1205 23:35:14.745751 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_dcd4b799-52f7-41ea-a1aa-aefcc25bb7b6/aodh-notifier/0.log" Dec 05 23:35:14 crc kubenswrapper[4747]: I1205 23:35:14.796479 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b8cc96f6-lbl9z_db7fceb8-9ede-4165-88c0-beaa3f74c0b2/barbican-api/0.log" Dec 05 23:35:14 crc kubenswrapper[4747]: I1205 23:35:14.862131 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b8cc96f6-lbl9z_db7fceb8-9ede-4165-88c0-beaa3f74c0b2/barbican-api-log/0.log" Dec 05 23:35:14 crc kubenswrapper[4747]: I1205 23:35:14.982936 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6b6cffb4d8-fhwxv_93fb10f2-a4d6-44c7-b14a-7fd3275f9f87/barbican-keystone-listener/0.log" Dec 05 23:35:15 crc kubenswrapper[4747]: I1205 23:35:15.104812 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6b6cffb4d8-fhwxv_93fb10f2-a4d6-44c7-b14a-7fd3275f9f87/barbican-keystone-listener-log/0.log" Dec 05 23:35:15 crc kubenswrapper[4747]: I1205 23:35:15.200745 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-849c4659df-skj5z_a905f2c0-0f3d-41ed-b796-caff00a1a314/barbican-worker/0.log" Dec 05 23:35:15 crc kubenswrapper[4747]: I1205 23:35:15.247718 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-849c4659df-skj5z_a905f2c0-0f3d-41ed-b796-caff00a1a314/barbican-worker-log/0.log" Dec 05 23:35:15 crc kubenswrapper[4747]: I1205 23:35:15.426488 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-6mdgd_b9a85f4f-ccf9-488d-bcfc-0adda4ca2995/bootstrap-openstack-openstack-cell1/0.log" Dec 05 23:35:15 crc kubenswrapper[4747]: I1205 23:35:15.504579 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_412f1967-e1ed-44a1-84b4-2cdd04e9cb10/ceilometer-central-agent/0.log" Dec 05 23:35:15 crc kubenswrapper[4747]: I1205 23:35:15.616701 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_412f1967-e1ed-44a1-84b4-2cdd04e9cb10/ceilometer-notification-agent/0.log" Dec 05 23:35:15 crc kubenswrapper[4747]: I1205 23:35:15.666256 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_412f1967-e1ed-44a1-84b4-2cdd04e9cb10/proxy-httpd/0.log" Dec 05 23:35:15 crc kubenswrapper[4747]: I1205 23:35:15.679423 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_412f1967-e1ed-44a1-84b4-2cdd04e9cb10/sg-core/0.log" Dec 05 23:35:15 crc kubenswrapper[4747]: I1205 23:35:15.894252 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_925daa00-d5f8-40fb-8ac6-dcab357bdd5b/cinder-api/0.log" Dec 05 23:35:15 crc kubenswrapper[4747]: I1205 23:35:15.913867 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_925daa00-d5f8-40fb-8ac6-dcab357bdd5b/cinder-api-log/0.log" Dec 05 23:35:16 crc kubenswrapper[4747]: I1205 23:35:16.059776 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9116d646-20e0-4541-8233-7c6a101b5e40/cinder-scheduler/0.log" Dec 05 23:35:16 crc kubenswrapper[4747]: I1205 23:35:16.164921 4747 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9116d646-20e0-4541-8233-7c6a101b5e40/probe/0.log" Dec 05 23:35:16 crc kubenswrapper[4747]: I1205 23:35:16.292264 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-szjfh_d2dbd3ad-9f97-4cc0-838f-a76f2a8f9dda/configure-network-openstack-openstack-cell1/0.log" Dec 05 23:35:16 crc kubenswrapper[4747]: I1205 23:35:16.411824 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-67hcg_deb44e95-83e2-4cf3-8157-97ad96b784d0/configure-os-openstack-openstack-cell1/0.log" Dec 05 23:35:16 crc kubenswrapper[4747]: I1205 23:35:16.587066 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d69b66569-m2hjf_8af80a5a-f044-4431-8c68-aba1507de5d1/init/0.log" Dec 05 23:35:16 crc kubenswrapper[4747]: I1205 23:35:16.701908 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d69b66569-m2hjf_8af80a5a-f044-4431-8c68-aba1507de5d1/init/0.log" Dec 05 23:35:16 crc kubenswrapper[4747]: I1205 23:35:16.799868 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d69b66569-m2hjf_8af80a5a-f044-4431-8c68-aba1507de5d1/dnsmasq-dns/0.log" Dec 05 23:35:16 crc kubenswrapper[4747]: I1205 23:35:16.816548 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-8jkzz_9f5aeb3e-a494-400c-b82b-6ff8c1c4112b/download-cache-openstack-openstack-cell1/0.log" Dec 05 23:35:17 crc kubenswrapper[4747]: I1205 23:35:17.022831 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fc8984e2-423b-4534-b56c-92b94bcb4155/glance-log/0.log" Dec 05 23:35:17 crc kubenswrapper[4747]: I1205 23:35:17.054957 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fc8984e2-423b-4534-b56c-92b94bcb4155/glance-httpd/0.log" Dec 05 23:35:17 crc kubenswrapper[4747]: I1205 23:35:17.225262 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_64efaafd-f823-4a41-ae6c-fd150138ebda/glance-httpd/0.log" Dec 05 23:35:17 crc kubenswrapper[4747]: I1205 23:35:17.268965 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_64efaafd-f823-4a41-ae6c-fd150138ebda/glance-log/0.log" Dec 05 23:35:17 crc kubenswrapper[4747]: I1205 23:35:17.751737 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-7579685f87-4bh5x_0388fe0f-6132-450a-a8e0-038f04d0c73c/heat-engine/0.log" Dec 05 23:35:17 crc kubenswrapper[4747]: I1205 23:35:17.999304 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-798d94bcc6-bvzqv_557c3581-e2a3-4fc1-a0e7-df3d9fd08ec9/heat-api/0.log" Dec 05 23:35:18 crc kubenswrapper[4747]: I1205 23:35:18.134538 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-595bbd95bb-ds75b_ac99fd7b-2307-4539-849a-774e9b2bc774/horizon/0.log" Dec 05 23:35:18 crc kubenswrapper[4747]: I1205 23:35:18.272533 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-cb88b49d8-v5mjj_2aa007bc-2c4d-406b-a962-5c82ec256692/heat-cfnapi/0.log" Dec 05 23:35:18 crc kubenswrapper[4747]: I1205 23:35:18.276569 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-5psgx_312cd54a-2efe-4a57-9e2d-5f295ebfbbb3/install-certs-openstack-openstack-cell1/0.log" Dec 05 23:35:18 crc kubenswrapper[4747]: I1205 23:35:18.585195 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-xxmz9_0a44c68b-4954-4588-82a6-0a4a6b2a01c4/install-os-openstack-openstack-cell1/0.log" Dec 05 23:35:18 crc kubenswrapper[4747]: I1205 23:35:18.704305 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-595bbd95bb-ds75b_ac99fd7b-2307-4539-849a-774e9b2bc774/horizon-log/0.log" Dec 05 23:35:18 crc kubenswrapper[4747]: I1205 23:35:18.773085 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-55748bc777-pzh2t_13d471a9-2ca8-46d2-8be1-3b2354fe8fbb/keystone-api/0.log" Dec 05 23:35:19 crc kubenswrapper[4747]: I1205 23:35:19.460744 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29416261-kkkjb_600ea9b0-d6cc-48a4-84ce-ba78b2013bb3/keystone-cron/0.log" Dec 05 23:35:19 crc kubenswrapper[4747]: I1205 23:35:19.517637 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1bb68bb7-3d25-45fd-bed9-8abf52f38a1f/kube-state-metrics/0.log" Dec 05 23:35:19 crc kubenswrapper[4747]: I1205 23:35:19.615078 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-5z757_1a42be6c-26a1-4081-8df5-9b5ee5d45262/libvirt-openstack-openstack-cell1/0.log" Dec 05 23:35:19 crc kubenswrapper[4747]: I1205 23:35:19.941459 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7db666fd57-25ch7_5f76856c-0d34-4880-9baa-0fd3b0dc3a36/neutron-httpd/0.log" Dec 05 23:35:20 crc kubenswrapper[4747]: I1205 23:35:20.030928 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7db666fd57-25ch7_5f76856c-0d34-4880-9baa-0fd3b0dc3a36/neutron-api/0.log" Dec 05 23:35:20 crc kubenswrapper[4747]: I1205 23:35:20.272867 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-9txj2_150d0c65-26e4-483f-aaf6-72d7efe808a3/neutron-dhcp-openstack-openstack-cell1/0.log" Dec 05 23:35:20 crc kubenswrapper[4747]: I1205 23:35:20.355327 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-ks6lb_3b5bf5c5-a366-40d6-a7fc-1c4517b83f70/neutron-metadata-openstack-openstack-cell1/0.log" Dec 05 23:35:20 crc kubenswrapper[4747]: I1205 23:35:20.623324 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-fbvl9_24ace32a-9918-4ef5-92e7-a5ef6419e908/neutron-sriov-openstack-openstack-cell1/0.log" Dec 05 23:35:20 crc kubenswrapper[4747]: I1205 23:35:20.775137 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_924d0e7d-e7b6-4a55-8a58-87ea4a01ec38/nova-api-log/0.log" Dec 05 23:35:20 crc kubenswrapper[4747]: I1205 23:35:20.842271 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_924d0e7d-e7b6-4a55-8a58-87ea4a01ec38/nova-api-api/0.log" Dec 05 23:35:21 crc kubenswrapper[4747]: I1205 23:35:21.662257 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_4b9536e5-5379-4d1e-acd6-92259c28a784/nova-cell0-conductor-conductor/0.log" Dec 05 23:35:21 crc kubenswrapper[4747]: I1205 23:35:21.679839 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_151ab70a-f3ce-4993-9c10-d0674b8208eb/nova-cell1-conductor-conductor/0.log" Dec 05 23:35:21 crc kubenswrapper[4747]: I1205 23:35:21.985304 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_4f738e79-5b6d-4d71-b4c0-44f166659c2d/nova-cell1-novncproxy-novncproxy/0.log" Dec 05 23:35:22 crc kubenswrapper[4747]: I1205 23:35:22.135423 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell99n9m_29780f9c-1925-4bc3-9a1d-24f9723dbcb9/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Dec 05 23:35:22 crc kubenswrapper[4747]: I1205 23:35:22.425995 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-cx4rm_0805aa11-5ab8-4ab4-b0c2-0435e7cc68e5/nova-cell1-openstack-openstack-cell1/0.log" Dec 05 23:35:22 crc kubenswrapper[4747]: I1205 23:35:22.544263 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_abd9d55c-b859-4c21-8a3f-cdfd09e33bd4/nova-metadata-log/0.log" Dec 05 23:35:22 crc kubenswrapper[4747]: I1205 23:35:22.858894 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ab9ee7da-b527-4208-a7c5-51a6b1a2db01/nova-scheduler-scheduler/0.log" Dec 05 23:35:22 crc kubenswrapper[4747]: I1205 23:35:22.923086 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_abd9d55c-b859-4c21-8a3f-cdfd09e33bd4/nova-metadata-metadata/0.log" Dec 05 23:35:22 crc kubenswrapper[4747]: I1205 23:35:22.944334 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-67b9f9b8db-jj6ds_116ee497-2944-4847-89ae-c4c14828dfa2/init/0.log" Dec 05 23:35:23 crc kubenswrapper[4747]: I1205 23:35:23.097916 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-67b9f9b8db-jj6ds_116ee497-2944-4847-89ae-c4c14828dfa2/init/0.log" Dec 05 23:35:23 crc kubenswrapper[4747]: I1205 23:35:23.205844 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-67b9f9b8db-jj6ds_116ee497-2944-4847-89ae-c4c14828dfa2/octavia-api-provider-agent/0.log" Dec 05 23:35:23 crc kubenswrapper[4747]: I1205 23:35:23.347045 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-j547l_7aa65be4-2b09-4c31-97fc-a220541d4009/init/0.log" Dec 05 23:35:23 crc kubenswrapper[4747]: I1205 23:35:23.487362 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-67b9f9b8db-jj6ds_116ee497-2944-4847-89ae-c4c14828dfa2/octavia-api/0.log" Dec 05 23:35:23 crc kubenswrapper[4747]: I1205 23:35:23.618430 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-j547l_7aa65be4-2b09-4c31-97fc-a220541d4009/init/0.log" Dec 05 23:35:23 crc kubenswrapper[4747]: I1205 23:35:23.638866 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-j547l_7aa65be4-2b09-4c31-97fc-a220541d4009/octavia-healthmanager/0.log" Dec 05 23:35:23 crc kubenswrapper[4747]: I1205 23:35:23.759293 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-4wx4f_20f32969-4fa1-4d35-84b3-becaa9c774ea/init/0.log" Dec 05 23:35:24 crc kubenswrapper[4747]: I1205 23:35:24.054904 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-housekeeping-4wx4f_20f32969-4fa1-4d35-84b3-becaa9c774ea/octavia-housekeeping/0.log" Dec 05 23:35:24 crc kubenswrapper[4747]: I1205 23:35:24.068569 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-4wx4f_20f32969-4fa1-4d35-84b3-becaa9c774ea/init/0.log" Dec 05 23:35:25 crc kubenswrapper[4747]: I1205 23:35:24.997381 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-56c9f55b99-4rzpg_d5ed0900-293f-40e2-b0ba-54ba92a0d04c/init/0.log" Dec 05 23:35:25 crc kubenswrapper[4747]: I1205 23:35:25.205700 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-56c9f55b99-4rzpg_d5ed0900-293f-40e2-b0ba-54ba92a0d04c/octavia-amphora-httpd/0.log" Dec 05 23:35:25 crc kubenswrapper[4747]: I1205 23:35:25.277858 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-56c9f55b99-4rzpg_d5ed0900-293f-40e2-b0ba-54ba92a0d04c/init/0.log" Dec 05 23:35:25 crc kubenswrapper[4747]: I1205 23:35:25.286348 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-pmwf5_61ad6ca6-026d-4ef6-9a8d-c414dd904933/init/0.log" Dec 05 23:35:25 crc kubenswrapper[4747]: I1205 23:35:25.502145 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-pmwf5_61ad6ca6-026d-4ef6-9a8d-c414dd904933/init/0.log" Dec 05 23:35:25 crc kubenswrapper[4747]: I1205 23:35:25.548164 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-d7nzb_fb28b4bc-7d39-4d5f-af3e-d855503b94f3/init/0.log" Dec 05 23:35:25 crc kubenswrapper[4747]: I1205 23:35:25.592972 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-pmwf5_61ad6ca6-026d-4ef6-9a8d-c414dd904933/octavia-rsyslog/0.log" Dec 05 23:35:25 crc kubenswrapper[4747]: I1205 23:35:25.849653 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-d7nzb_fb28b4bc-7d39-4d5f-af3e-d855503b94f3/init/0.log" Dec 05 23:35:25 crc kubenswrapper[4747]: I1205 23:35:25.913825 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f7e2599a-2280-4bf7-a62c-60fcf447e74d/mysql-bootstrap/0.log" Dec 05 23:35:25 crc kubenswrapper[4747]: I1205 23:35:25.997207 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-d7nzb_fb28b4bc-7d39-4d5f-af3e-d855503b94f3/octavia-worker/0.log" Dec 05 23:35:26 crc kubenswrapper[4747]: I1205 23:35:26.232470 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f7e2599a-2280-4bf7-a62c-60fcf447e74d/mysql-bootstrap/0.log" Dec 05 23:35:26 crc kubenswrapper[4747]: I1205 23:35:26.261159 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f7e2599a-2280-4bf7-a62c-60fcf447e74d/galera/0.log" Dec 05 23:35:26 crc kubenswrapper[4747]: I1205 23:35:26.266477 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8b0a2255-16ff-4be0-a16e-161076f32fc4/mysql-bootstrap/0.log" Dec 05 23:35:26 crc kubenswrapper[4747]: I1205 23:35:26.741414 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_fcf2fe6e-3aa1-4a4b-877b-49db11eee9a5/openstackclient/0.log" Dec 05 23:35:26 crc kubenswrapper[4747]: I1205 23:35:26.745569 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_8b0a2255-16ff-4be0-a16e-161076f32fc4/mysql-bootstrap/0.log" Dec 05 23:35:26 crc kubenswrapper[4747]: I1205 23:35:26.802475 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8b0a2255-16ff-4be0-a16e-161076f32fc4/galera/0.log" Dec 05 23:35:26 crc kubenswrapper[4747]: I1205 23:35:26.958462 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8wcg5_9a42efdc-ea18-4d70-b8f5-19336636b4f0/openstack-network-exporter/0.log" Dec 05 23:35:27 crc kubenswrapper[4747]: I1205 23:35:27.221101 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pbwjn_be15b94a-18ea-49ed-a209-4e1d6dbd6c62/ovsdb-server-init/0.log" Dec 05 23:35:27 crc kubenswrapper[4747]: I1205 23:35:27.416493 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pbwjn_be15b94a-18ea-49ed-a209-4e1d6dbd6c62/ovsdb-server-init/0.log" Dec 05 23:35:27 crc kubenswrapper[4747]: I1205 23:35:27.472379 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pbwjn_be15b94a-18ea-49ed-a209-4e1d6dbd6c62/ovsdb-server/0.log" Dec 05 23:35:27 crc kubenswrapper[4747]: I1205 23:35:27.477707 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pbwjn_be15b94a-18ea-49ed-a209-4e1d6dbd6c62/ovs-vswitchd/0.log" Dec 05 23:35:27 crc kubenswrapper[4747]: I1205 23:35:27.699237 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-wl72w_3c5bd501-1b1a-48ba-b163-a96bd80e07fa/ovn-controller/0.log" Dec 05 23:35:27 crc kubenswrapper[4747]: I1205 23:35:27.735155 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1d2c2179-ea52-4d02-b1de-9972369db4c2/openstack-network-exporter/0.log" Dec 05 23:35:27 crc kubenswrapper[4747]: I1205 23:35:27.881568 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1d2c2179-ea52-4d02-b1de-9972369db4c2/ovn-northd/0.log" Dec 05 23:35:28 crc kubenswrapper[4747]: I1205 23:35:28.003436 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-n2p5x_0937b5af-022f-433a-8c04-71d2b729f11d/ovn-openstack-openstack-cell1/0.log" Dec 05 23:35:28 crc kubenswrapper[4747]: I1205 23:35:28.152887 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e95df327-27e4-4201-b4b9-c8f55bdff2a4/openstack-network-exporter/0.log" Dec 05 23:35:28 crc kubenswrapper[4747]: I1205 23:35:28.198977 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e95df327-27e4-4201-b4b9-c8f55bdff2a4/ovsdbserver-nb/0.log" Dec 05 23:35:28 crc kubenswrapper[4747]: I1205 23:35:28.373489 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_bcb13537-396f-464d-8bf2-22a7cf04c8bc/ovsdbserver-nb/0.log" Dec 05 23:35:28 crc kubenswrapper[4747]: I1205 23:35:28.386870 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_bcb13537-396f-464d-8bf2-22a7cf04c8bc/openstack-network-exporter/0.log" Dec 05 23:35:28 crc kubenswrapper[4747]: I1205 23:35:28.647391 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_cebaf6d4-544c-4d13-beda-2c5231a92d0c/openstack-network-exporter/0.log" Dec 05 23:35:28 crc kubenswrapper[4747]: I1205 23:35:28.690048 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-2_cebaf6d4-544c-4d13-beda-2c5231a92d0c/ovsdbserver-nb/0.log" Dec 05 23:35:28 crc kubenswrapper[4747]: I1205 23:35:28.742304 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f081a03e-60c9-41c7-8b24-255114a2998e/openstack-network-exporter/0.log" Dec 05 23:35:28 crc kubenswrapper[4747]: I1205 23:35:28.871453 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f081a03e-60c9-41c7-8b24-255114a2998e/ovsdbserver-sb/0.log" Dec 05 23:35:28 crc kubenswrapper[4747]: I1205 23:35:28.927940 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_f91e4355-8711-4c53-9c55-273033340752/openstack-network-exporter/0.log" Dec 05 23:35:29 crc kubenswrapper[4747]: I1205 23:35:29.002340 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_f91e4355-8711-4c53-9c55-273033340752/ovsdbserver-sb/0.log" Dec 05 23:35:29 crc kubenswrapper[4747]: I1205 23:35:29.139457 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_5c17dd19-c351-4e05-b7be-ff471da69382/openstack-network-exporter/0.log" Dec 05 23:35:29 crc kubenswrapper[4747]: I1205 23:35:29.315006 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_5c17dd19-c351-4e05-b7be-ff471da69382/ovsdbserver-sb/0.log" Dec 05 23:35:29 crc kubenswrapper[4747]: I1205 23:35:29.477317 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-b7bf9cb-q8n7g_29277589-fa87-483b-9c76-03ca12b29a20/placement-api/0.log" Dec 05 23:35:29 crc kubenswrapper[4747]: I1205 23:35:29.550294 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-b7bf9cb-q8n7g_29277589-fa87-483b-9c76-03ca12b29a20/placement-log/0.log" Dec 05 23:35:29 crc kubenswrapper[4747]: I1205 23:35:29.579106 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-c5c6h7_5958f892-c6a2-4203-b7ef-f20a17c75771/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Dec 05 23:35:29 crc kubenswrapper[4747]: I1205 23:35:29.800375 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_47e6cc39-27de-45da-92cf-2a13effd0974/init-config-reloader/0.log" Dec 05 23:35:30 crc kubenswrapper[4747]: I1205 23:35:30.037198 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_47e6cc39-27de-45da-92cf-2a13effd0974/config-reloader/0.log" Dec 05 23:35:30 crc kubenswrapper[4747]: I1205 23:35:30.037807 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_47e6cc39-27de-45da-92cf-2a13effd0974/init-config-reloader/0.log" Dec 05 23:35:30 crc kubenswrapper[4747]: I1205 23:35:30.055719 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_47e6cc39-27de-45da-92cf-2a13effd0974/prometheus/0.log" Dec 05 23:35:30 crc kubenswrapper[4747]: I1205 23:35:30.061040 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_47e6cc39-27de-45da-92cf-2a13effd0974/thanos-sidecar/0.log" Dec 05 23:35:30 crc kubenswrapper[4747]: I1205 23:35:30.261273 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b49058e8-36ba-424d-a6bf-732f2b0545ff/setup-container/0.log" Dec 05 23:35:30 crc 
kubenswrapper[4747]: I1205 23:35:30.564909 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b49058e8-36ba-424d-a6bf-732f2b0545ff/setup-container/0.log" Dec 05 23:35:30 crc kubenswrapper[4747]: I1205 23:35:30.598324 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b49058e8-36ba-424d-a6bf-732f2b0545ff/rabbitmq/0.log" Dec 05 23:35:30 crc kubenswrapper[4747]: I1205 23:35:30.694738 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_97a2f30d-7404-4264-830e-ef43a223735e/setup-container/0.log" Dec 05 23:35:30 crc kubenswrapper[4747]: I1205 23:35:30.868347 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_97a2f30d-7404-4264-830e-ef43a223735e/setup-container/0.log" Dec 05 23:35:30 crc kubenswrapper[4747]: I1205 23:35:30.967872 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-8crf9_f6a9b5e3-e724-4bf3-8d1d-f99a1b8fdedb/reboot-os-openstack-openstack-cell1/0.log" Dec 05 23:35:31 crc kubenswrapper[4747]: I1205 23:35:31.180368 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_97a2f30d-7404-4264-830e-ef43a223735e/rabbitmq/0.log" Dec 05 23:35:31 crc kubenswrapper[4747]: I1205 23:35:31.195658 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-bbmn4_91fc9e2a-45ad-4216-a340-84c55cba490e/run-os-openstack-openstack-cell1/0.log" Dec 05 23:35:31 crc kubenswrapper[4747]: I1205 23:35:31.496356 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-m97hr_fa8817f9-6605-4396-94a2-4a2106fa6724/ssh-known-hosts-openstack/0.log" Dec 05 23:35:31 crc kubenswrapper[4747]: I1205 23:35:31.589279 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6f55995b6-9n2p4_ca9730ab-ce9c-4f56-a81e-14c78ac858ab/proxy-server/0.log" Dec 05 23:35:31 crc kubenswrapper[4747]: I1205 23:35:31.777057 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-ctndg_d5c20dd3-0235-4b3b-8ff5-1a32e8f4d18b/swift-ring-rebalance/0.log" Dec 05 23:35:31 crc kubenswrapper[4747]: I1205 23:35:31.794186 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6f55995b6-9n2p4_ca9730ab-ce9c-4f56-a81e-14c78ac858ab/proxy-httpd/0.log" Dec 05 23:35:32 crc kubenswrapper[4747]: I1205 23:35:32.039368 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-znq8d_b61e75e7-0701-45f0-a70e-8e660be43224/telemetry-openstack-openstack-cell1/0.log" Dec 05 23:35:32 crc kubenswrapper[4747]: I1205 23:35:32.110507 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-2wq77_a4a0930b-16ca-4e49-a32c-d28b4a9fac9f/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Dec 05 23:35:32 crc kubenswrapper[4747]: I1205 23:35:32.320745 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-7wxtb_060c3c6c-211e-4b44-9f26-1e1d24be0f83/validate-network-openstack-openstack-cell1/0.log" Dec 05 23:35:36 crc kubenswrapper[4747]: I1205 23:35:36.106520 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_72f13c5c-74d7-4d4c-8518-8bc6c539170b/memcached/0.log" Dec 05 23:35:40 crc kubenswrapper[4747]: I1205 
23:35:40.311688 4747 scope.go:117] "RemoveContainer" containerID="d5d82123940ecfef26236c088bf55333c6026dd64c1975c0111496c9ed76af48" Dec 05 23:35:40 crc kubenswrapper[4747]: I1205 23:35:40.337404 4747 scope.go:117] "RemoveContainer" containerID="c9afd47bcbae299578e0e13ab3ea7f2f4b5eae310d36dcf61aea7de7589f3e38" Dec 05 23:35:40 crc kubenswrapper[4747]: I1205 23:35:40.406210 4747 scope.go:117] "RemoveContainer" containerID="d586e773e2249a234cbe54991bc80debb9f31fbe43a0b10ba508b5e47d568077" Dec 05 23:36:01 crc kubenswrapper[4747]: I1205 23:36:01.779297 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp_0bea60ba-9571-4b9e-8484-5da33bd67047/util/0.log" Dec 05 23:36:01 crc kubenswrapper[4747]: I1205 23:36:01.963907 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp_0bea60ba-9571-4b9e-8484-5da33bd67047/pull/0.log" Dec 05 23:36:01 crc kubenswrapper[4747]: I1205 23:36:01.974210 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp_0bea60ba-9571-4b9e-8484-5da33bd67047/util/0.log" Dec 05 23:36:01 crc kubenswrapper[4747]: I1205 23:36:01.991126 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp_0bea60ba-9571-4b9e-8484-5da33bd67047/pull/0.log" Dec 05 23:36:02 crc kubenswrapper[4747]: I1205 23:36:02.193605 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp_0bea60ba-9571-4b9e-8484-5da33bd67047/extract/0.log" Dec 05 23:36:02 crc kubenswrapper[4747]: I1205 23:36:02.264620 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp_0bea60ba-9571-4b9e-8484-5da33bd67047/util/0.log" Dec 05 23:36:02 crc kubenswrapper[4747]: I1205 23:36:02.286007 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafq98gp_0bea60ba-9571-4b9e-8484-5da33bd67047/pull/0.log" Dec 05 23:36:02 crc kubenswrapper[4747]: I1205 23:36:02.447025 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-p9kpn_376f6574-b930-4baa-98c4-d4b47f3b7e76/kube-rbac-proxy/0.log" Dec 05 23:36:02 crc kubenswrapper[4747]: I1205 23:36:02.512399 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-lgxh8_4fa9efa6-f9bd-47d1-b27c-701f742537f8/kube-rbac-proxy/0.log" Dec 05 23:36:02 crc kubenswrapper[4747]: I1205 23:36:02.589558 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-p9kpn_376f6574-b930-4baa-98c4-d4b47f3b7e76/manager/0.log" Dec 05 23:36:02 crc kubenswrapper[4747]: I1205 23:36:02.763468 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-lcwxh_6b18b68e-5851-4b54-a953-aa0f9151f191/kube-rbac-proxy/0.log" Dec 05 23:36:02 crc kubenswrapper[4747]: I1205 23:36:02.782200 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-lgxh8_4fa9efa6-f9bd-47d1-b27c-701f742537f8/manager/0.log" Dec 05 23:36:02 crc kubenswrapper[4747]: I1205 23:36:02.790533 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-lcwxh_6b18b68e-5851-4b54-a953-aa0f9151f191/manager/0.log" Dec 05 23:36:02 crc kubenswrapper[4747]: I1205 23:36:02.992081 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-6qc7c_ff9bc6be-0b60-4c4b-a692-5de429d22c0d/kube-rbac-proxy/0.log" Dec 05 23:36:03 crc kubenswrapper[4747]: I1205 23:36:03.148024 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-6qc7c_ff9bc6be-0b60-4c4b-a692-5de429d22c0d/manager/0.log" Dec 05 23:36:03 crc kubenswrapper[4747]: I1205 23:36:03.165777 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-8c9z4_60cd02a8-1ab2-4037-b5d8-e479875ce1db/kube-rbac-proxy/0.log" Dec 05 23:36:03 crc kubenswrapper[4747]: I1205 23:36:03.267705 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-8c9z4_60cd02a8-1ab2-4037-b5d8-e479875ce1db/manager/0.log" Dec 05 23:36:03 crc kubenswrapper[4747]: I1205 23:36:03.361829 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-tw5n8_af277afe-81a7-4ca7-8e88-dee226ed11a3/kube-rbac-proxy/0.log" Dec 05 23:36:03 crc kubenswrapper[4747]: I1205 23:36:03.430480 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-tw5n8_af277afe-81a7-4ca7-8e88-dee226ed11a3/manager/0.log" Dec 05 23:36:03 crc kubenswrapper[4747]: I1205 23:36:03.522365 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-n9bsq_2cb7e739-bfdd-4785-98bc-88c011e1f703/kube-rbac-proxy/0.log" Dec 05 23:36:03 crc kubenswrapper[4747]: I1205 23:36:03.714143 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-llksb_0656c174-0497-42ca-9abe-b7c50e82bdec/kube-rbac-proxy/0.log" Dec 05 23:36:03 crc kubenswrapper[4747]: I1205 23:36:03.716703 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-llksb_0656c174-0497-42ca-9abe-b7c50e82bdec/manager/0.log" Dec 05 23:36:03 crc kubenswrapper[4747]: I1205 23:36:03.898257 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-n9bsq_2cb7e739-bfdd-4785-98bc-88c011e1f703/manager/0.log" Dec 05 23:36:03 crc kubenswrapper[4747]: I1205 23:36:03.984743 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-s2xls_4157fb94-c6ec-4eba-9020-50efd318640f/kube-rbac-proxy/0.log" Dec 05 23:36:04 crc kubenswrapper[4747]: I1205 23:36:04.076418 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-s2xls_4157fb94-c6ec-4eba-9020-50efd318640f/manager/0.log" Dec 05 23:36:04 crc kubenswrapper[4747]: I1205 23:36:04.177808 4747 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-7lrth_0d3e833b-6977-47a1-8ba3-b09d093f303c/kube-rbac-proxy/0.log" Dec 05 23:36:04 crc kubenswrapper[4747]: I1205 23:36:04.186494 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-7lrth_0d3e833b-6977-47a1-8ba3-b09d093f303c/manager/0.log" Dec 05 23:36:04 crc kubenswrapper[4747]: I1205 23:36:04.421441 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-lc7bg_391b5a9d-822c-4980-9eb0-376dd5d44126/kube-rbac-proxy/0.log" Dec 05 23:36:04 crc kubenswrapper[4747]: I1205 23:36:04.491370 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-lc7bg_391b5a9d-822c-4980-9eb0-376dd5d44126/manager/0.log" Dec 05 23:36:04 crc kubenswrapper[4747]: I1205 23:36:04.791410 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-f7fp2_54d11a3d-e575-4a91-8856-97ab3f4adb1c/kube-rbac-proxy/0.log" Dec 05 23:36:04 crc kubenswrapper[4747]: I1205 23:36:04.923106 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-f7fp2_54d11a3d-e575-4a91-8856-97ab3f4adb1c/manager/0.log" Dec 05 23:36:04 crc kubenswrapper[4747]: I1205 23:36:04.994939 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-42pxn_3659825b-dd9f-40b9-a17a-3c931805fe9c/kube-rbac-proxy/0.log" Dec 05 23:36:05 crc kubenswrapper[4747]: I1205 23:36:05.242120 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-7k88v_cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd/kube-rbac-proxy/0.log" Dec 05 23:36:05 crc kubenswrapper[4747]: I1205 23:36:05.269080 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-7k88v_cbc1eb34-f65f-4c57-9d9f-549f5eb7d9bd/manager/0.log" Dec 05 23:36:05 crc kubenswrapper[4747]: I1205 23:36:05.285549 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-42pxn_3659825b-dd9f-40b9-a17a-3c931805fe9c/manager/0.log" Dec 05 23:36:05 crc kubenswrapper[4747]: I1205 23:36:05.439554 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f5dbwbl_9438f8d5-2c04-4bd1-9e4b-cc29d4565085/kube-rbac-proxy/0.log" Dec 05 23:36:05 crc kubenswrapper[4747]: I1205 23:36:05.478396 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f5dbwbl_9438f8d5-2c04-4bd1-9e4b-cc29d4565085/manager/0.log" Dec 05 23:36:05 crc kubenswrapper[4747]: I1205 23:36:05.880225 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-55b6fb9447-mjqc9_9a778748-0d28-41ea-9d92-ba2e95f46a80/operator/0.log" Dec 05 23:36:06 crc kubenswrapper[4747]: I1205 23:36:06.049312 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-kz2bv_cb5a89d9-1303-4cbc-9641-8be514157ed0/kube-rbac-proxy/0.log" Dec 05 23:36:06 crc kubenswrapper[4747]: I1205 
23:36:06.115942 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-kk22k_466f7493-21f0-40e0-addb-35e631da4792/registry-server/0.log" Dec 05 23:36:06 crc kubenswrapper[4747]: I1205 23:36:06.287281 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-kz2bv_cb5a89d9-1303-4cbc-9641-8be514157ed0/manager/0.log" Dec 05 23:36:06 crc kubenswrapper[4747]: I1205 23:36:06.332273 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-8ltc2_fe32c200-1897-4d84-9eeb-ec5c5e7b2cd4/kube-rbac-proxy/0.log" Dec 05 23:36:06 crc kubenswrapper[4747]: I1205 23:36:06.368793 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-8ltc2_fe32c200-1897-4d84-9eeb-ec5c5e7b2cd4/manager/0.log" Dec 05 23:36:06 crc kubenswrapper[4747]: I1205 23:36:06.574331 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-7mb56_0fbf81b5-1f99-402f-b486-0801f63077e4/operator/0.log" Dec 05 23:36:06 crc kubenswrapper[4747]: I1205 23:36:06.644871 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-ml4vm_44d3f632-8fb0-4763-adaa-0f54fdb8386f/kube-rbac-proxy/0.log" Dec 05 23:36:06 crc kubenswrapper[4747]: I1205 23:36:06.797629 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-ml4vm_44d3f632-8fb0-4763-adaa-0f54fdb8386f/manager/0.log" Dec 05 23:36:06 crc kubenswrapper[4747]: I1205 23:36:06.840658 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-hjzcg_f46e2df3-4e49-4818-91d9-1181d78e46f9/kube-rbac-proxy/0.log" Dec 05 23:36:07 crc kubenswrapper[4747]: I1205 23:36:07.071530 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-dvpxs_be975bb9-2086-4fde-8c71-9c7f56eaf8ea/kube-rbac-proxy/0.log" Dec 05 23:36:07 crc kubenswrapper[4747]: I1205 23:36:07.171863 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-dvpxs_be975bb9-2086-4fde-8c71-9c7f56eaf8ea/manager/0.log" Dec 05 23:36:07 crc kubenswrapper[4747]: I1205 23:36:07.202371 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-hjzcg_f46e2df3-4e49-4818-91d9-1181d78e46f9/manager/0.log" Dec 05 23:36:07 crc kubenswrapper[4747]: I1205 23:36:07.358924 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-8mfrt_4ac7a709-1eb7-4f8f-ac46-3a404b2171a7/kube-rbac-proxy/0.log" Dec 05 23:36:07 crc kubenswrapper[4747]: I1205 23:36:07.415630 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-8mfrt_4ac7a709-1eb7-4f8f-ac46-3a404b2171a7/manager/0.log" Dec 05 23:36:08 crc kubenswrapper[4747]: I1205 23:36:08.294796 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54bdf956c4-rsqv9_fbdd4eb9-cffd-44b7-9b05-da1328de2fe6/manager/0.log" Dec 05 23:36:29 crc kubenswrapper[4747]: I1205 
23:36:29.866916 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xgqvl_280e109e-854a-430c-9960-6af6bba9dfdc/control-plane-machine-set-operator/0.log" Dec 05 23:36:30 crc kubenswrapper[4747]: I1205 23:36:30.062172 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cdhgg_1840e48c-ee91-48d2-8ddb-34f24dde58ff/kube-rbac-proxy/0.log" Dec 05 23:36:30 crc kubenswrapper[4747]: I1205 23:36:30.132473 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cdhgg_1840e48c-ee91-48d2-8ddb-34f24dde58ff/machine-api-operator/0.log" Dec 05 23:36:45 crc kubenswrapper[4747]: I1205 23:36:45.026371 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-prvhw_604860ec-c0f5-4280-a534-798d3b88cf8e/cert-manager-controller/0.log" Dec 05 23:36:45 crc kubenswrapper[4747]: I1205 23:36:45.412299 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-cr86z_d2a92677-713b-4051-86dc-895902e83027/cert-manager-cainjector/0.log" Dec 05 23:36:45 crc kubenswrapper[4747]: I1205 23:36:45.451311 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-zltlr_03ba7ba4-8a3a-47b4-8996-cb3f8f9642af/cert-manager-webhook/0.log" Dec 05 23:36:57 crc kubenswrapper[4747]: I1205 23:36:57.630107 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-hn2jh_4548c537-2784-461e-b042-2a7efed9ae3a/nmstate-console-plugin/0.log" Dec 05 23:36:57 crc kubenswrapper[4747]: I1205 23:36:57.799932 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7tm7k_fed6a17f-26e6-4ad2-aa77-63e697e05f1f/nmstate-handler/0.log" Dec 05 23:36:57 crc kubenswrapper[4747]: I1205 23:36:57.842257 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8n9rp_a6577b22-4d4b-49a1-9e24-ff749998eee5/kube-rbac-proxy/0.log" Dec 05 23:36:57 crc kubenswrapper[4747]: I1205 23:36:57.895011 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8n9rp_a6577b22-4d4b-49a1-9e24-ff749998eee5/nmstate-metrics/0.log" Dec 05 23:36:58 crc kubenswrapper[4747]: I1205 23:36:58.044102 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-nt4x7_831590ed-85ad-424e-a4b7-83d97c56265c/nmstate-operator/0.log" Dec 05 23:36:58 crc kubenswrapper[4747]: I1205 23:36:58.119672 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-mvgbw_bfa081f3-bb53-463e-8dc4-af51189a16c8/nmstate-webhook/0.log" Dec 05 23:37:06 crc kubenswrapper[4747]: I1205 23:37:06.221541 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:37:06 crc kubenswrapper[4747]: I1205 23:37:06.222258 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:37:14 crc kubenswrapper[4747]: I1205 23:37:14.047115 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-qs6g9_8ff67434-1b15-4bcd-a222-175dcaf8dbbb/kube-rbac-proxy/0.log" Dec 05 23:37:14 crc kubenswrapper[4747]: I1205 23:37:14.243000 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/cp-frr-files/0.log" Dec 05 23:37:14 crc kubenswrapper[4747]: I1205 23:37:14.390318 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-qs6g9_8ff67434-1b15-4bcd-a222-175dcaf8dbbb/controller/0.log" Dec 05 23:37:14 crc kubenswrapper[4747]: I1205 23:37:14.521379 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/cp-reloader/0.log" Dec 05 23:37:14 crc kubenswrapper[4747]: I1205 23:37:14.524507 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/cp-metrics/0.log" Dec 05 23:37:14 crc kubenswrapper[4747]: I1205 23:37:14.541995 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/cp-frr-files/0.log" Dec 05 23:37:14 crc kubenswrapper[4747]: I1205 23:37:14.596176 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/cp-reloader/0.log" Dec 05 23:37:14 crc kubenswrapper[4747]: I1205 23:37:14.816018 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/cp-reloader/0.log" Dec 05 23:37:14 crc kubenswrapper[4747]: I1205 23:37:14.830426 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/cp-metrics/0.log" Dec 05 23:37:14 crc kubenswrapper[4747]: I1205 23:37:14.838449 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/cp-frr-files/0.log" Dec 05 23:37:14 crc kubenswrapper[4747]: I1205 23:37:14.849174 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/cp-metrics/0.log" Dec 05 23:37:15 crc kubenswrapper[4747]: I1205 23:37:15.054831 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/cp-frr-files/0.log" Dec 05 23:37:15 crc kubenswrapper[4747]: I1205 23:37:15.064352 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/cp-metrics/0.log" Dec 05 23:37:15 crc kubenswrapper[4747]: I1205 23:37:15.068078 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/cp-reloader/0.log" Dec 05 23:37:15 crc kubenswrapper[4747]: I1205 23:37:15.117786 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/controller/0.log" Dec 05 23:37:15 crc kubenswrapper[4747]: I1205 23:37:15.251492 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/frr-metrics/0.log" Dec 05 23:37:15 crc kubenswrapper[4747]: I1205 
23:37:15.339407 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/kube-rbac-proxy/0.log" Dec 05 23:37:15 crc kubenswrapper[4747]: I1205 23:37:15.396727 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/kube-rbac-proxy-frr/0.log" Dec 05 23:37:16 crc kubenswrapper[4747]: I1205 23:37:16.101196 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-5blxd_26a6d099-70ce-4ce9-b75e-b7696ffe6dea/frr-k8s-webhook-server/0.log" Dec 05 23:37:16 crc kubenswrapper[4747]: I1205 23:37:16.104721 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/reloader/0.log" Dec 05 23:37:16 crc kubenswrapper[4747]: I1205 23:37:16.322394 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5c77dc54f8-9hjvq_a6646e53-609f-4dc3-b299-aae36ce927fc/manager/0.log" Dec 05 23:37:16 crc kubenswrapper[4747]: I1205 23:37:16.581806 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7d5d4d9899-rfr7f_2a21b504-6988-45fc-88b9-4108efc34b06/webhook-server/0.log" Dec 05 23:37:16 crc kubenswrapper[4747]: I1205 23:37:16.758022 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8z2dk_1b95125e-a8ab-4566-bf3d-d901a5c4044b/kube-rbac-proxy/0.log" Dec 05 23:37:17 crc kubenswrapper[4747]: I1205 23:37:17.767547 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8z2dk_1b95125e-a8ab-4566-bf3d-d901a5c4044b/speaker/0.log" Dec 05 23:37:18 crc kubenswrapper[4747]: I1205 23:37:18.982255 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2t7n7_393b222d-ddf3-41e6-92c8-7379f58219aa/frr/0.log" Dec 05 23:37:31 crc kubenswrapper[4747]: I1205 23:37:31.868185 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt_5fe45ed8-9c96-4564-9e57-f4eaef897ef4/util/0.log" Dec 05 23:37:32 crc kubenswrapper[4747]: I1205 23:37:32.147432 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt_5fe45ed8-9c96-4564-9e57-f4eaef897ef4/pull/0.log" Dec 05 23:37:32 crc kubenswrapper[4747]: I1205 23:37:32.188743 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt_5fe45ed8-9c96-4564-9e57-f4eaef897ef4/pull/0.log" Dec 05 23:37:32 crc kubenswrapper[4747]: I1205 23:37:32.204075 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt_5fe45ed8-9c96-4564-9e57-f4eaef897ef4/util/0.log" Dec 05 23:37:32 crc kubenswrapper[4747]: I1205 23:37:32.342971 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt_5fe45ed8-9c96-4564-9e57-f4eaef897ef4/util/0.log" Dec 05 23:37:32 crc kubenswrapper[4747]: I1205 23:37:32.407474 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt_5fe45ed8-9c96-4564-9e57-f4eaef897ef4/pull/0.log" Dec 05 
23:37:32 crc kubenswrapper[4747]: I1205 23:37:32.481095 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adcdgt_5fe45ed8-9c96-4564-9e57-f4eaef897ef4/extract/0.log" Dec 05 23:37:32 crc kubenswrapper[4747]: I1205 23:37:32.598231 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6_4d8159d7-7e1b-4c53-bd04-d806c4165588/util/0.log" Dec 05 23:37:32 crc kubenswrapper[4747]: I1205 23:37:32.856058 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6_4d8159d7-7e1b-4c53-bd04-d806c4165588/pull/0.log" Dec 05 23:37:32 crc kubenswrapper[4747]: I1205 23:37:32.856282 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6_4d8159d7-7e1b-4c53-bd04-d806c4165588/pull/0.log" Dec 05 23:37:32 crc kubenswrapper[4747]: I1205 23:37:32.877003 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6_4d8159d7-7e1b-4c53-bd04-d806c4165588/util/0.log" Dec 05 23:37:33 crc kubenswrapper[4747]: I1205 23:37:33.036375 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6_4d8159d7-7e1b-4c53-bd04-d806c4165588/util/0.log" Dec 05 23:37:33 crc kubenswrapper[4747]: I1205 23:37:33.067524 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6_4d8159d7-7e1b-4c53-bd04-d806c4165588/pull/0.log" Dec 05 23:37:33 crc kubenswrapper[4747]: I1205 23:37:33.150040 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbjsc6_4d8159d7-7e1b-4c53-bd04-d806c4165588/extract/0.log" Dec 05 23:37:33 crc kubenswrapper[4747]: I1205 23:37:33.329737 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx_d96ced6d-96bd-4f59-af54-33aaba6b3b0a/util/0.log" Dec 05 23:37:33 crc kubenswrapper[4747]: I1205 23:37:33.495357 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx_d96ced6d-96bd-4f59-af54-33aaba6b3b0a/util/0.log" Dec 05 23:37:33 crc kubenswrapper[4747]: I1205 23:37:33.499895 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx_d96ced6d-96bd-4f59-af54-33aaba6b3b0a/pull/0.log" Dec 05 23:37:33 crc kubenswrapper[4747]: I1205 23:37:33.559238 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx_d96ced6d-96bd-4f59-af54-33aaba6b3b0a/pull/0.log" Dec 05 23:37:33 crc kubenswrapper[4747]: I1205 23:37:33.722366 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx_d96ced6d-96bd-4f59-af54-33aaba6b3b0a/util/0.log" Dec 05 23:37:33 crc kubenswrapper[4747]: I1205 23:37:33.775345 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx_d96ced6d-96bd-4f59-af54-33aaba6b3b0a/extract/0.log" Dec 05 23:37:33 crc kubenswrapper[4747]: I1205 23:37:33.776129 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92106rdrx_d96ced6d-96bd-4f59-af54-33aaba6b3b0a/pull/0.log" Dec 05 23:37:33 crc kubenswrapper[4747]: I1205 23:37:33.940870 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt_05f0028b-0044-41f5-84a5-0116ec549d2f/util/0.log" Dec 05 23:37:34 crc kubenswrapper[4747]: I1205 23:37:34.294266 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt_05f0028b-0044-41f5-84a5-0116ec549d2f/pull/0.log" Dec 05 23:37:34 crc kubenswrapper[4747]: I1205 23:37:34.315871 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt_05f0028b-0044-41f5-84a5-0116ec549d2f/pull/0.log" Dec 05 23:37:34 crc kubenswrapper[4747]: I1205 23:37:34.355655 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt_05f0028b-0044-41f5-84a5-0116ec549d2f/util/0.log" Dec 05 23:37:34 crc kubenswrapper[4747]: I1205 23:37:34.582353 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt_05f0028b-0044-41f5-84a5-0116ec549d2f/extract/0.log" Dec 05 23:37:34 crc kubenswrapper[4747]: I1205 23:37:34.590069 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt_05f0028b-0044-41f5-84a5-0116ec549d2f/pull/0.log" Dec 05 23:37:34 crc kubenswrapper[4747]: I1205 23:37:34.612769 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xf5jt_05f0028b-0044-41f5-84a5-0116ec549d2f/util/0.log" Dec 05 23:37:34 crc kubenswrapper[4747]: I1205 23:37:34.764536 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-djp6w_10893b71-c0c0-4955-8d6f-eecbd1e69d68/extract-utilities/0.log" Dec 05 23:37:34 crc kubenswrapper[4747]: I1205 23:37:34.980400 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-djp6w_10893b71-c0c0-4955-8d6f-eecbd1e69d68/extract-content/0.log" Dec 05 23:37:35 crc kubenswrapper[4747]: I1205 23:37:35.034533 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-djp6w_10893b71-c0c0-4955-8d6f-eecbd1e69d68/extract-content/0.log" Dec 05 23:37:35 crc kubenswrapper[4747]: I1205 23:37:35.037816 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-djp6w_10893b71-c0c0-4955-8d6f-eecbd1e69d68/extract-utilities/0.log" Dec 05 23:37:35 crc kubenswrapper[4747]: I1205 23:37:35.332480 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-djp6w_10893b71-c0c0-4955-8d6f-eecbd1e69d68/extract-utilities/0.log" Dec 05 23:37:35 crc kubenswrapper[4747]: I1205 23:37:35.417195 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-djp6w_10893b71-c0c0-4955-8d6f-eecbd1e69d68/extract-content/0.log" Dec 05 23:37:35 crc kubenswrapper[4747]: I1205 23:37:35.631752 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cnfnj_96a86291-ce6c-4ab0-afc5-a87fbbc0fa78/extract-utilities/0.log" Dec 05 23:37:36 crc kubenswrapper[4747]: I1205 23:37:36.221389 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:37:36 crc kubenswrapper[4747]: I1205 23:37:36.221745 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:37:36 crc kubenswrapper[4747]: I1205 23:37:36.254745 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cnfnj_96a86291-ce6c-4ab0-afc5-a87fbbc0fa78/extract-utilities/0.log" Dec 05 23:37:36 crc kubenswrapper[4747]: I1205 23:37:36.272448 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cnfnj_96a86291-ce6c-4ab0-afc5-a87fbbc0fa78/extract-content/0.log" Dec 05 23:37:36 crc kubenswrapper[4747]: I1205 23:37:36.289882 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cnfnj_96a86291-ce6c-4ab0-afc5-a87fbbc0fa78/extract-content/0.log" Dec 05 23:37:36 crc kubenswrapper[4747]: I1205 23:37:36.645095 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-djp6w_10893b71-c0c0-4955-8d6f-eecbd1e69d68/registry-server/0.log" Dec 05 23:37:36 crc kubenswrapper[4747]: I1205 23:37:36.688439 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cnfnj_96a86291-ce6c-4ab0-afc5-a87fbbc0fa78/extract-utilities/0.log" Dec 05 23:37:36 crc kubenswrapper[4747]: I1205 23:37:36.694836 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cnfnj_96a86291-ce6c-4ab0-afc5-a87fbbc0fa78/extract-content/0.log" Dec 05 23:37:36 crc kubenswrapper[4747]: I1205 23:37:36.951185 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-d74r5_5799e2a0-e208-4e0b-b757-d65fe1f2f859/extract-utilities/0.log" Dec 05 23:37:37 crc kubenswrapper[4747]: I1205 23:37:37.033242 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-m4nzv_6ce1891f-9919-4ec9-bf2f-9662d075b240/marketplace-operator/0.log" Dec 05 23:37:37 crc kubenswrapper[4747]: I1205 23:37:37.228091 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-d74r5_5799e2a0-e208-4e0b-b757-d65fe1f2f859/extract-utilities/0.log" Dec 05 23:37:37 crc kubenswrapper[4747]: I1205 23:37:37.246275 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-d74r5_5799e2a0-e208-4e0b-b757-d65fe1f2f859/extract-content/0.log" Dec 05 23:37:37 crc kubenswrapper[4747]: I1205 23:37:37.289444 4747 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-d74r5_5799e2a0-e208-4e0b-b757-d65fe1f2f859/extract-content/0.log" Dec 05 23:37:37 crc kubenswrapper[4747]: I1205 23:37:37.605877 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-d74r5_5799e2a0-e208-4e0b-b757-d65fe1f2f859/extract-content/0.log" Dec 05 23:37:37 crc kubenswrapper[4747]: I1205 23:37:37.606081 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-d74r5_5799e2a0-e208-4e0b-b757-d65fe1f2f859/extract-utilities/0.log" Dec 05 23:37:37 crc kubenswrapper[4747]: I1205 23:37:37.867259 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mxk7k_b295f21e-14da-4faa-97a2-6fa2d1f9702a/extract-utilities/0.log" Dec 05 23:37:38 crc kubenswrapper[4747]: I1205 23:37:38.244485 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-d74r5_5799e2a0-e208-4e0b-b757-d65fe1f2f859/registry-server/0.log" Dec 05 23:37:38 crc kubenswrapper[4747]: I1205 23:37:38.338263 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cnfnj_96a86291-ce6c-4ab0-afc5-a87fbbc0fa78/registry-server/0.log" Dec 05 23:37:38 crc kubenswrapper[4747]: I1205 23:37:38.647176 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mxk7k_b295f21e-14da-4faa-97a2-6fa2d1f9702a/extract-utilities/0.log" Dec 05 23:37:38 crc kubenswrapper[4747]: I1205 23:37:38.771089 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mxk7k_b295f21e-14da-4faa-97a2-6fa2d1f9702a/extract-content/0.log" Dec 05 23:37:38 crc kubenswrapper[4747]: I1205 23:37:38.844660 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mxk7k_b295f21e-14da-4faa-97a2-6fa2d1f9702a/extract-content/0.log" Dec 05 23:37:39 crc kubenswrapper[4747]: I1205 23:37:39.080431 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mxk7k_b295f21e-14da-4faa-97a2-6fa2d1f9702a/extract-utilities/0.log" Dec 05 23:37:39 crc kubenswrapper[4747]: I1205 23:37:39.084254 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mxk7k_b295f21e-14da-4faa-97a2-6fa2d1f9702a/extract-content/0.log" Dec 05 23:37:40 crc kubenswrapper[4747]: I1205 23:37:40.263699 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mxk7k_b295f21e-14da-4faa-97a2-6fa2d1f9702a/registry-server/0.log" Dec 05 23:37:53 crc kubenswrapper[4747]: I1205 23:37:53.482453 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-8s8c6_6a157993-4e7c-4c79-a126-ffd16308cb25/prometheus-operator/0.log" Dec 05 23:37:53 crc kubenswrapper[4747]: I1205 23:37:53.660903 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5f5c965d9c-hlh9f_15183649-4b35-40ef-a6b7-2b1c786246b9/prometheus-operator-admission-webhook/0.log" Dec 05 23:37:53 crc kubenswrapper[4747]: I1205 23:37:53.684030 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5f5c965d9c-fn4ww_d44ad8b8-f155-4a3b-a7cf-27aa9951dfd3/prometheus-operator-admission-webhook/0.log" Dec 05 23:37:53 crc kubenswrapper[4747]: I1205 23:37:53.869739 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-5vf5m_67f34afc-ce90-42ca-8599-60e931bb3868/operator/0.log" Dec 05 23:37:53 crc kubenswrapper[4747]: I1205 23:37:53.919825 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-sp22q_29cb108e-f1f3-4cbf-8a59-4b90ee5b1f01/perses-operator/0.log" Dec 05 23:38:06 crc kubenswrapper[4747]: I1205 23:38:06.221497 4747 patch_prober.go:28] interesting pod/machine-config-daemon-7lblw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:38:06 crc kubenswrapper[4747]: I1205 23:38:06.222284 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:38:06 crc kubenswrapper[4747]: I1205 23:38:06.222335 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" Dec 05 23:38:06 crc kubenswrapper[4747]: I1205 23:38:06.223807 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d"} pod="openshift-machine-config-operator/machine-config-daemon-7lblw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:38:06 crc kubenswrapper[4747]: I1205 23:38:06.223899 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerName="machine-config-daemon" containerID="cri-o://c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" gracePeriod=600 Dec 05 23:38:06 crc kubenswrapper[4747]: E1205 23:38:06.350342 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:38:06 crc kubenswrapper[4747]: I1205 23:38:06.638069 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ba28a1-00e9-438e-9b47-6537f75121bb" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" exitCode=0 Dec 05 23:38:06 crc kubenswrapper[4747]: I1205 23:38:06.638147 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerDied","Data":"c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d"} Dec 05 23:38:06 crc 
Dec 05 23:38:06 crc kubenswrapper[4747]: I1205 23:38:06.638404 4747 scope.go:117] "RemoveContainer" containerID="a40402e37b5afe2cc52d1df1c1a7f1070d0aacf27bed182b1f1e084040a254bf"
Dec 05 23:38:06 crc kubenswrapper[4747]: I1205 23:38:06.639094 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d"
Dec 05 23:38:06 crc kubenswrapper[4747]: E1205 23:38:06.639349 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:38:17 crc kubenswrapper[4747]: I1205 23:38:17.840577 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d"
Dec 05 23:38:17 crc kubenswrapper[4747]: E1205 23:38:17.841374 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:38:23 crc kubenswrapper[4747]: E1205 23:38:23.580454 4747 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.22:59764->38.102.83.22:39817: write tcp 38.102.83.22:59764->38.102.83.22:39817: write: broken pipe
Dec 05 23:38:30 crc kubenswrapper[4747]: I1205 23:38:30.839894 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d"
Dec 05 23:38:30 crc kubenswrapper[4747]: E1205 23:38:30.842288 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:38:43 crc kubenswrapper[4747]: I1205 23:38:43.840486 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d"
Dec 05 23:38:43 crc kubenswrapper[4747]: E1205 23:38:43.842514 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:38:58 crc kubenswrapper[4747]: I1205 23:38:58.839887 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d"
Dec 05 23:38:58 crc kubenswrapper[4747]: E1205 23:38:58.840807 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:39:13 crc kubenswrapper[4747]: I1205 23:39:13.840263 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d"
Dec 05 23:39:13 crc kubenswrapper[4747]: E1205 23:39:13.841073 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:39:27 crc kubenswrapper[4747]: I1205 23:39:27.841384 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d"
Dec 05 23:39:27 crc kubenswrapper[4747]: E1205 23:39:27.842481 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.574091 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j8zzg"]
Dec 05 23:39:32 crc kubenswrapper[4747]: E1205 23:39:32.575135 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24ff337-3ee6-4008-af98-a95231f6dfa9" containerName="registry-server"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.575154 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24ff337-3ee6-4008-af98-a95231f6dfa9" containerName="registry-server"
Dec 05 23:39:32 crc kubenswrapper[4747]: E1205 23:39:32.575198 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24ff337-3ee6-4008-af98-a95231f6dfa9" containerName="extract-content"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.575209 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24ff337-3ee6-4008-af98-a95231f6dfa9" containerName="extract-content"
Dec 05 23:39:32 crc kubenswrapper[4747]: E1205 23:39:32.575230 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24ff337-3ee6-4008-af98-a95231f6dfa9" containerName="extract-utilities"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.575241 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24ff337-3ee6-4008-af98-a95231f6dfa9" containerName="extract-utilities"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.575473 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24ff337-3ee6-4008-af98-a95231f6dfa9" containerName="registry-server"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.577289 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8zzg"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.595554 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8zzg"]
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.655176 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f79a6703-5bd6-4197-afd2-94161eee1ef1-utilities\") pod \"redhat-marketplace-j8zzg\" (UID: \"f79a6703-5bd6-4197-afd2-94161eee1ef1\") " pod="openshift-marketplace/redhat-marketplace-j8zzg"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.655458 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f79a6703-5bd6-4197-afd2-94161eee1ef1-catalog-content\") pod \"redhat-marketplace-j8zzg\" (UID: \"f79a6703-5bd6-4197-afd2-94161eee1ef1\") " pod="openshift-marketplace/redhat-marketplace-j8zzg"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.655654 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xk9c\" (UniqueName: \"kubernetes.io/projected/f79a6703-5bd6-4197-afd2-94161eee1ef1-kube-api-access-9xk9c\") pod \"redhat-marketplace-j8zzg\" (UID: \"f79a6703-5bd6-4197-afd2-94161eee1ef1\") " pod="openshift-marketplace/redhat-marketplace-j8zzg"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.758010 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f79a6703-5bd6-4197-afd2-94161eee1ef1-utilities\") pod \"redhat-marketplace-j8zzg\" (UID: \"f79a6703-5bd6-4197-afd2-94161eee1ef1\") " pod="openshift-marketplace/redhat-marketplace-j8zzg"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.758095 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f79a6703-5bd6-4197-afd2-94161eee1ef1-catalog-content\") pod \"redhat-marketplace-j8zzg\" (UID: \"f79a6703-5bd6-4197-afd2-94161eee1ef1\") " pod="openshift-marketplace/redhat-marketplace-j8zzg"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.758151 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xk9c\" (UniqueName: \"kubernetes.io/projected/f79a6703-5bd6-4197-afd2-94161eee1ef1-kube-api-access-9xk9c\") pod \"redhat-marketplace-j8zzg\" (UID: \"f79a6703-5bd6-4197-afd2-94161eee1ef1\") " pod="openshift-marketplace/redhat-marketplace-j8zzg"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.758493 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f79a6703-5bd6-4197-afd2-94161eee1ef1-utilities\") pod \"redhat-marketplace-j8zzg\" (UID: \"f79a6703-5bd6-4197-afd2-94161eee1ef1\") " pod="openshift-marketplace/redhat-marketplace-j8zzg"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.758631 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f79a6703-5bd6-4197-afd2-94161eee1ef1-catalog-content\") pod \"redhat-marketplace-j8zzg\" (UID: \"f79a6703-5bd6-4197-afd2-94161eee1ef1\") " pod="openshift-marketplace/redhat-marketplace-j8zzg"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.783809 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xk9c\" (UniqueName: \"kubernetes.io/projected/f79a6703-5bd6-4197-afd2-94161eee1ef1-kube-api-access-9xk9c\") pod \"redhat-marketplace-j8zzg\" (UID: \"f79a6703-5bd6-4197-afd2-94161eee1ef1\") " pod="openshift-marketplace/redhat-marketplace-j8zzg"
Dec 05 23:39:32 crc kubenswrapper[4747]: I1205 23:39:32.903635 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8zzg"
Dec 05 23:39:33 crc kubenswrapper[4747]: I1205 23:39:33.575435 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8zzg"]
Dec 05 23:39:34 crc kubenswrapper[4747]: I1205 23:39:34.560095 4747 generic.go:334] "Generic (PLEG): container finished" podID="f79a6703-5bd6-4197-afd2-94161eee1ef1" containerID="02d5db0c6b65105c89620e53df000ef4841a9cb74d7120b9b56a58af6952ccc7" exitCode=0
Dec 05 23:39:34 crc kubenswrapper[4747]: I1205 23:39:34.560174 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8zzg" event={"ID":"f79a6703-5bd6-4197-afd2-94161eee1ef1","Type":"ContainerDied","Data":"02d5db0c6b65105c89620e53df000ef4841a9cb74d7120b9b56a58af6952ccc7"}
Dec 05 23:39:34 crc kubenswrapper[4747]: I1205 23:39:34.560577 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8zzg" event={"ID":"f79a6703-5bd6-4197-afd2-94161eee1ef1","Type":"ContainerStarted","Data":"f4a9cf2e8eece353ccb143c2c035c9005058acd17945af9622189da46f52e34a"}
Dec 05 23:39:34 crc kubenswrapper[4747]: I1205 23:39:34.562708 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 23:39:36 crc kubenswrapper[4747]: I1205 23:39:36.583643 4747 generic.go:334] "Generic (PLEG): container finished" podID="f79a6703-5bd6-4197-afd2-94161eee1ef1" containerID="55f8bb147ad72adba61fa2cc3daeb1c463ccc3291235de2ffd7f0b1cb7f29207" exitCode=0
Dec 05 23:39:36 crc kubenswrapper[4747]: I1205 23:39:36.583683 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8zzg" event={"ID":"f79a6703-5bd6-4197-afd2-94161eee1ef1","Type":"ContainerDied","Data":"55f8bb147ad72adba61fa2cc3daeb1c463ccc3291235de2ffd7f0b1cb7f29207"}
Dec 05 23:39:38 crc kubenswrapper[4747]: I1205 23:39:38.603509 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8zzg" event={"ID":"f79a6703-5bd6-4197-afd2-94161eee1ef1","Type":"ContainerStarted","Data":"3d1f15c9d64303b5739412b87448a6d5f787772397c820cf1c53358db6c3f938"}
Dec 05 23:39:38 crc kubenswrapper[4747]: I1205 23:39:38.628970 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j8zzg" podStartSLOduration=4.197819696 podStartE2EDuration="6.628951344s" podCreationTimestamp="2025-12-05 23:39:32 +0000 UTC" firstStartedPulling="2025-12-05 23:39:34.562282744 +0000 UTC m=+10645.029590272" lastFinishedPulling="2025-12-05 23:39:36.993414422 +0000 UTC m=+10647.460721920" observedRunningTime="2025-12-05 23:39:38.6210898 +0000 UTC m=+10649.088397288" watchObservedRunningTime="2025-12-05 23:39:38.628951344 +0000 UTC m=+10649.096258832"
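The startup-duration entry above checks out arithmetically once the monotonic (m=+...) offsets are used: podStartSLOduration is the end-to-end duration minus the image-pull window. A worked check in Go, with the values copied from the entry and the field names taken from the log itself:

    package main

    import "fmt"

    func main() {
        // Monotonic offsets (m=+...) from the entry above, in seconds.
        pull := 10647.460721920 - 10645.029590272 // lastFinishedPulling - firstStartedPulling = 2.431131648s
        e2e := 6.628951344                        // podStartE2EDuration: watchObservedRunningTime - podCreationTimestamp (23:39:32)
        fmt.Printf("podStartSLOduration ~= %.9fs\n", e2e-pull) // ~= 4.197819696s, matching the log
    }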
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:39:42 crc kubenswrapper[4747]: I1205 23:39:42.905366 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j8zzg" Dec 05 23:39:42 crc kubenswrapper[4747]: I1205 23:39:42.906267 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j8zzg" Dec 05 23:39:42 crc kubenswrapper[4747]: I1205 23:39:42.963114 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j8zzg" Dec 05 23:39:43 crc kubenswrapper[4747]: I1205 23:39:43.718552 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j8zzg" Dec 05 23:39:43 crc kubenswrapper[4747]: I1205 23:39:43.786697 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8zzg"] Dec 05 23:39:45 crc kubenswrapper[4747]: I1205 23:39:45.682354 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j8zzg" podUID="f79a6703-5bd6-4197-afd2-94161eee1ef1" containerName="registry-server" containerID="cri-o://3d1f15c9d64303b5739412b87448a6d5f787772397c820cf1c53358db6c3f938" gracePeriod=2 Dec 05 23:39:46 crc kubenswrapper[4747]: I1205 23:39:46.721045 4747 generic.go:334] "Generic (PLEG): container finished" podID="f79a6703-5bd6-4197-afd2-94161eee1ef1" containerID="3d1f15c9d64303b5739412b87448a6d5f787772397c820cf1c53358db6c3f938" exitCode=0 Dec 05 23:39:46 crc kubenswrapper[4747]: I1205 23:39:46.721293 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8zzg" event={"ID":"f79a6703-5bd6-4197-afd2-94161eee1ef1","Type":"ContainerDied","Data":"3d1f15c9d64303b5739412b87448a6d5f787772397c820cf1c53358db6c3f938"} Dec 05 23:39:46 crc kubenswrapper[4747]: I1205 23:39:46.890087 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8zzg" Dec 05 23:39:46 crc kubenswrapper[4747]: I1205 23:39:46.984071 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xk9c\" (UniqueName: \"kubernetes.io/projected/f79a6703-5bd6-4197-afd2-94161eee1ef1-kube-api-access-9xk9c\") pod \"f79a6703-5bd6-4197-afd2-94161eee1ef1\" (UID: \"f79a6703-5bd6-4197-afd2-94161eee1ef1\") " Dec 05 23:39:46 crc kubenswrapper[4747]: I1205 23:39:46.984145 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f79a6703-5bd6-4197-afd2-94161eee1ef1-catalog-content\") pod \"f79a6703-5bd6-4197-afd2-94161eee1ef1\" (UID: \"f79a6703-5bd6-4197-afd2-94161eee1ef1\") " Dec 05 23:39:46 crc kubenswrapper[4747]: I1205 23:39:46.984348 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f79a6703-5bd6-4197-afd2-94161eee1ef1-utilities\") pod \"f79a6703-5bd6-4197-afd2-94161eee1ef1\" (UID: \"f79a6703-5bd6-4197-afd2-94161eee1ef1\") " Dec 05 23:39:46 crc kubenswrapper[4747]: I1205 23:39:46.985261 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f79a6703-5bd6-4197-afd2-94161eee1ef1-utilities" (OuterVolumeSpecName: "utilities") pod "f79a6703-5bd6-4197-afd2-94161eee1ef1" (UID: "f79a6703-5bd6-4197-afd2-94161eee1ef1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:39:46 crc kubenswrapper[4747]: I1205 23:39:46.990390 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79a6703-5bd6-4197-afd2-94161eee1ef1-kube-api-access-9xk9c" (OuterVolumeSpecName: "kube-api-access-9xk9c") pod "f79a6703-5bd6-4197-afd2-94161eee1ef1" (UID: "f79a6703-5bd6-4197-afd2-94161eee1ef1"). InnerVolumeSpecName "kube-api-access-9xk9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:47 crc kubenswrapper[4747]: I1205 23:39:47.018044 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f79a6703-5bd6-4197-afd2-94161eee1ef1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f79a6703-5bd6-4197-afd2-94161eee1ef1" (UID: "f79a6703-5bd6-4197-afd2-94161eee1ef1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:39:47 crc kubenswrapper[4747]: I1205 23:39:47.086958 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f79a6703-5bd6-4197-afd2-94161eee1ef1-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:47 crc kubenswrapper[4747]: I1205 23:39:47.086995 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xk9c\" (UniqueName: \"kubernetes.io/projected/f79a6703-5bd6-4197-afd2-94161eee1ef1-kube-api-access-9xk9c\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:47 crc kubenswrapper[4747]: I1205 23:39:47.087008 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f79a6703-5bd6-4197-afd2-94161eee1ef1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:47 crc kubenswrapper[4747]: I1205 23:39:47.736769 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8zzg" event={"ID":"f79a6703-5bd6-4197-afd2-94161eee1ef1","Type":"ContainerDied","Data":"f4a9cf2e8eece353ccb143c2c035c9005058acd17945af9622189da46f52e34a"} Dec 05 23:39:47 crc kubenswrapper[4747]: I1205 23:39:47.736807 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8zzg" Dec 05 23:39:47 crc kubenswrapper[4747]: I1205 23:39:47.736826 4747 scope.go:117] "RemoveContainer" containerID="3d1f15c9d64303b5739412b87448a6d5f787772397c820cf1c53358db6c3f938" Dec 05 23:39:47 crc kubenswrapper[4747]: I1205 23:39:47.776103 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8zzg"] Dec 05 23:39:47 crc kubenswrapper[4747]: I1205 23:39:47.786565 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8zzg"] Dec 05 23:39:47 crc kubenswrapper[4747]: I1205 23:39:47.794108 4747 scope.go:117] "RemoveContainer" containerID="55f8bb147ad72adba61fa2cc3daeb1c463ccc3291235de2ffd7f0b1cb7f29207" Dec 05 23:39:47 crc kubenswrapper[4747]: I1205 23:39:47.867349 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f79a6703-5bd6-4197-afd2-94161eee1ef1" path="/var/lib/kubelet/pods/f79a6703-5bd6-4197-afd2-94161eee1ef1/volumes" Dec 05 23:39:48 crc kubenswrapper[4747]: I1205 23:39:48.301715 4747 scope.go:117] "RemoveContainer" containerID="02d5db0c6b65105c89620e53df000ef4841a9cb74d7120b9b56a58af6952ccc7" Dec 05 23:39:51 crc kubenswrapper[4747]: I1205 23:39:51.840968 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:39:51 crc kubenswrapper[4747]: E1205 23:39:51.841892 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:40:03 crc kubenswrapper[4747]: I1205 23:40:03.840494 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:40:03 crc kubenswrapper[4747]: E1205 23:40:03.841683 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
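Between 23:39:32 and 23:39:48 the volume manager runs a full cycle for redhat-marketplace-j8zzg: VerifyControllerAttachedVolume and MountVolume.SetUp while the pod exists, then UnmountVolume.TearDown, "Volume detached", and the orphaned-volumes cleanup once it is deleted. A minimal sketch of the desired-versus-mounted reconcile pattern those paired messages suggest; the function and map names are illustrative, not the kubelet's actual types:

    package main

    import "fmt"

    // reconcile mounts volumes that are desired but not yet mounted and
    // unmounts volumes that are mounted but no longer desired, the loop
    // behind the Mount/Unmount pairs logged above.
    func reconcile(desired, mounted map[string]bool) {
        for v := range desired {
            if !mounted[v] {
                fmt.Println("MountVolume started for volume", v)
                mounted[v] = true
            }
        }
        for v := range mounted {
            if !desired[v] {
                fmt.Println("UnmountVolume started for volume", v)
                delete(mounted, v)
            }
        }
    }

    func main() {
        mounted := map[string]bool{}
        reconcile(map[string]bool{"utilities": true, "catalog-content": true}, mounted) // pod created
        reconcile(map[string]bool{}, mounted)                                           // pod deleted
    }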
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:40:07 crc kubenswrapper[4747]: I1205 23:40:07.971512 4747 generic.go:334] "Generic (PLEG): container finished" podID="63ca8613-9c9c-49b1-ade8-e5928fceef4d" containerID="4654f9fc1eb9e4ef79abfac2b5f73ea5d1b1bc01b7a63cf09c016dc899570d90" exitCode=0 Dec 05 23:40:07 crc kubenswrapper[4747]: I1205 23:40:07.971612 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjlqd/must-gather-q8dkt" event={"ID":"63ca8613-9c9c-49b1-ade8-e5928fceef4d","Type":"ContainerDied","Data":"4654f9fc1eb9e4ef79abfac2b5f73ea5d1b1bc01b7a63cf09c016dc899570d90"} Dec 05 23:40:07 crc kubenswrapper[4747]: I1205 23:40:07.972841 4747 scope.go:117] "RemoveContainer" containerID="4654f9fc1eb9e4ef79abfac2b5f73ea5d1b1bc01b7a63cf09c016dc899570d90" Dec 05 23:40:08 crc kubenswrapper[4747]: I1205 23:40:08.343094 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kjlqd_must-gather-q8dkt_63ca8613-9c9c-49b1-ade8-e5928fceef4d/gather/0.log" Dec 05 23:40:14 crc kubenswrapper[4747]: I1205 23:40:14.841386 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:40:14 crc kubenswrapper[4747]: E1205 23:40:14.842149 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:40:17 crc kubenswrapper[4747]: I1205 23:40:17.075257 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kjlqd/must-gather-q8dkt"] Dec 05 23:40:17 crc kubenswrapper[4747]: I1205 23:40:17.076931 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kjlqd/must-gather-q8dkt" podUID="63ca8613-9c9c-49b1-ade8-e5928fceef4d" containerName="copy" containerID="cri-o://a97b37e79a1cc2fe97da53dbe1e9592a81b65c2b84e8921ef8ce82432c18bb2d" gracePeriod=2 Dec 05 23:40:17 crc kubenswrapper[4747]: I1205 23:40:17.089348 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kjlqd/must-gather-q8dkt"] Dec 05 23:40:17 crc kubenswrapper[4747]: I1205 23:40:17.534827 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kjlqd_must-gather-q8dkt_63ca8613-9c9c-49b1-ade8-e5928fceef4d/copy/0.log" Dec 05 23:40:17 crc kubenswrapper[4747]: I1205 23:40:17.535469 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kjlqd/must-gather-q8dkt" Dec 05 23:40:17 crc kubenswrapper[4747]: I1205 23:40:17.587996 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plwh8\" (UniqueName: \"kubernetes.io/projected/63ca8613-9c9c-49b1-ade8-e5928fceef4d-kube-api-access-plwh8\") pod \"63ca8613-9c9c-49b1-ade8-e5928fceef4d\" (UID: \"63ca8613-9c9c-49b1-ade8-e5928fceef4d\") " Dec 05 23:40:17 crc kubenswrapper[4747]: I1205 23:40:17.588292 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/63ca8613-9c9c-49b1-ade8-e5928fceef4d-must-gather-output\") pod \"63ca8613-9c9c-49b1-ade8-e5928fceef4d\" (UID: \"63ca8613-9c9c-49b1-ade8-e5928fceef4d\") " Dec 05 23:40:17 crc kubenswrapper[4747]: I1205 23:40:17.594497 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63ca8613-9c9c-49b1-ade8-e5928fceef4d-kube-api-access-plwh8" (OuterVolumeSpecName: "kube-api-access-plwh8") pod "63ca8613-9c9c-49b1-ade8-e5928fceef4d" (UID: "63ca8613-9c9c-49b1-ade8-e5928fceef4d"). InnerVolumeSpecName "kube-api-access-plwh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:40:17 crc kubenswrapper[4747]: I1205 23:40:17.691640 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plwh8\" (UniqueName: \"kubernetes.io/projected/63ca8613-9c9c-49b1-ade8-e5928fceef4d-kube-api-access-plwh8\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:17 crc kubenswrapper[4747]: I1205 23:40:17.790435 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63ca8613-9c9c-49b1-ade8-e5928fceef4d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "63ca8613-9c9c-49b1-ade8-e5928fceef4d" (UID: "63ca8613-9c9c-49b1-ade8-e5928fceef4d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:40:17 crc kubenswrapper[4747]: I1205 23:40:17.793987 4747 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/63ca8613-9c9c-49b1-ade8-e5928fceef4d-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:17 crc kubenswrapper[4747]: I1205 23:40:17.852970 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63ca8613-9c9c-49b1-ade8-e5928fceef4d" path="/var/lib/kubelet/pods/63ca8613-9c9c-49b1-ade8-e5928fceef4d/volumes" Dec 05 23:40:18 crc kubenswrapper[4747]: I1205 23:40:18.075501 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kjlqd_must-gather-q8dkt_63ca8613-9c9c-49b1-ade8-e5928fceef4d/copy/0.log" Dec 05 23:40:18 crc kubenswrapper[4747]: I1205 23:40:18.076065 4747 generic.go:334] "Generic (PLEG): container finished" podID="63ca8613-9c9c-49b1-ade8-e5928fceef4d" containerID="a97b37e79a1cc2fe97da53dbe1e9592a81b65c2b84e8921ef8ce82432c18bb2d" exitCode=143 Dec 05 23:40:18 crc kubenswrapper[4747]: I1205 23:40:18.076123 4747 scope.go:117] "RemoveContainer" containerID="a97b37e79a1cc2fe97da53dbe1e9592a81b65c2b84e8921ef8ce82432c18bb2d" Dec 05 23:40:18 crc kubenswrapper[4747]: I1205 23:40:18.076175 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kjlqd/must-gather-q8dkt" Dec 05 23:40:18 crc kubenswrapper[4747]: I1205 23:40:18.095117 4747 scope.go:117] "RemoveContainer" containerID="4654f9fc1eb9e4ef79abfac2b5f73ea5d1b1bc01b7a63cf09c016dc899570d90" Dec 05 23:40:18 crc kubenswrapper[4747]: I1205 23:40:18.136706 4747 scope.go:117] "RemoveContainer" containerID="a97b37e79a1cc2fe97da53dbe1e9592a81b65c2b84e8921ef8ce82432c18bb2d" Dec 05 23:40:18 crc kubenswrapper[4747]: E1205 23:40:18.137221 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a97b37e79a1cc2fe97da53dbe1e9592a81b65c2b84e8921ef8ce82432c18bb2d\": container with ID starting with a97b37e79a1cc2fe97da53dbe1e9592a81b65c2b84e8921ef8ce82432c18bb2d not found: ID does not exist" containerID="a97b37e79a1cc2fe97da53dbe1e9592a81b65c2b84e8921ef8ce82432c18bb2d" Dec 05 23:40:18 crc kubenswrapper[4747]: I1205 23:40:18.137299 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a97b37e79a1cc2fe97da53dbe1e9592a81b65c2b84e8921ef8ce82432c18bb2d"} err="failed to get container status \"a97b37e79a1cc2fe97da53dbe1e9592a81b65c2b84e8921ef8ce82432c18bb2d\": rpc error: code = NotFound desc = could not find container \"a97b37e79a1cc2fe97da53dbe1e9592a81b65c2b84e8921ef8ce82432c18bb2d\": container with ID starting with a97b37e79a1cc2fe97da53dbe1e9592a81b65c2b84e8921ef8ce82432c18bb2d not found: ID does not exist" Dec 05 23:40:18 crc kubenswrapper[4747]: I1205 23:40:18.137334 4747 scope.go:117] "RemoveContainer" containerID="4654f9fc1eb9e4ef79abfac2b5f73ea5d1b1bc01b7a63cf09c016dc899570d90" Dec 05 23:40:18 crc kubenswrapper[4747]: E1205 23:40:18.137813 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4654f9fc1eb9e4ef79abfac2b5f73ea5d1b1bc01b7a63cf09c016dc899570d90\": container with ID starting with 4654f9fc1eb9e4ef79abfac2b5f73ea5d1b1bc01b7a63cf09c016dc899570d90 not found: ID does not exist" containerID="4654f9fc1eb9e4ef79abfac2b5f73ea5d1b1bc01b7a63cf09c016dc899570d90" Dec 05 23:40:18 crc kubenswrapper[4747]: I1205 23:40:18.137858 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4654f9fc1eb9e4ef79abfac2b5f73ea5d1b1bc01b7a63cf09c016dc899570d90"} err="failed to get container status \"4654f9fc1eb9e4ef79abfac2b5f73ea5d1b1bc01b7a63cf09c016dc899570d90\": rpc error: code = NotFound desc = could not find container \"4654f9fc1eb9e4ef79abfac2b5f73ea5d1b1bc01b7a63cf09c016dc899570d90\": container with ID starting with 4654f9fc1eb9e4ef79abfac2b5f73ea5d1b1bc01b7a63cf09c016dc899570d90 not found: ID does not exist" Dec 05 23:40:26 crc kubenswrapper[4747]: I1205 23:40:26.840504 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:40:26 crc kubenswrapper[4747]: E1205 23:40:26.841259 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:40:38 crc kubenswrapper[4747]: I1205 23:40:38.840344 4747 scope.go:117] "RemoveContainer" 
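The NotFound pairs at 23:40:18 are benign: the containers were already removed, so the follow-up ContainerStatus lookup finds nothing and DeleteContainer reports the mismatch without failing the cleanup. A common way to keep such deletes idempotent is to treat "not found" as success; a minimal sketch, with errNotFound and removeFromRuntime as illustrative stand-ins rather than CRI API names:

    package main

    import (
        "errors"
        "fmt"
    )

    var errNotFound = errors.New("container not found")

    // removeFromRuntime stands in for the runtime call; here it always
    // reports the container as already gone, like the entries above.
    func removeFromRuntime(id string) error { return errNotFound }

    // removeContainer treats "not found" as success: if the container is
    // already gone there is nothing left to do, so retries are harmless.
    func removeContainer(id string) error {
        if err := removeFromRuntime(id); err != nil && !errors.Is(err, errNotFound) {
            return err
        }
        return nil
    }

    func main() {
        fmt.Println(removeContainer("a97b37e79a1c")) // <nil>
    }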
containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:40:38 crc kubenswrapper[4747]: E1205 23:40:38.842216 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:40:52 crc kubenswrapper[4747]: I1205 23:40:52.839720 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:40:52 crc kubenswrapper[4747]: E1205 23:40:52.840472 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:41:07 crc kubenswrapper[4747]: I1205 23:41:07.840531 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:41:07 crc kubenswrapper[4747]: E1205 23:41:07.841399 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:41:19 crc kubenswrapper[4747]: I1205 23:41:19.857155 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:41:19 crc kubenswrapper[4747]: E1205 23:41:19.858748 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:41:30 crc kubenswrapper[4747]: I1205 23:41:30.839931 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:41:30 crc kubenswrapper[4747]: E1205 23:41:30.840734 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:41:44 crc kubenswrapper[4747]: I1205 23:41:44.840494 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:41:44 crc kubenswrapper[4747]: E1205 23:41:44.842710 4747 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:41:58 crc kubenswrapper[4747]: I1205 23:41:58.843171 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:41:58 crc kubenswrapper[4747]: E1205 23:41:58.844719 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:42:10 crc kubenswrapper[4747]: I1205 23:42:10.841182 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:42:10 crc kubenswrapper[4747]: E1205 23:42:10.842088 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:42:21 crc kubenswrapper[4747]: I1205 23:42:21.840472 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:42:21 crc kubenswrapper[4747]: E1205 23:42:21.841533 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:42:36 crc kubenswrapper[4747]: I1205 23:42:36.840322 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:42:36 crc kubenswrapper[4747]: E1205 23:42:36.841013 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:42:51 crc kubenswrapper[4747]: I1205 23:42:51.840669 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:42:51 crc kubenswrapper[4747]: E1205 23:42:51.841419 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:43:04 crc kubenswrapper[4747]: I1205 23:43:04.841239 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:43:04 crc kubenswrapper[4747]: E1205 23:43:04.842200 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7lblw_openshift-machine-config-operator(85ba28a1-00e9-438e-9b47-6537f75121bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" podUID="85ba28a1-00e9-438e-9b47-6537f75121bb" Dec 05 23:43:16 crc kubenswrapper[4747]: I1205 23:43:16.840192 4747 scope.go:117] "RemoveContainer" containerID="c0be5f7d542c61c3260dfab5cd66694405c9b04ec8d32a5bbe63eb0767b07e9d" Dec 05 23:43:17 crc kubenswrapper[4747]: I1205 23:43:17.985416 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7lblw" event={"ID":"85ba28a1-00e9-438e-9b47-6537f75121bb","Type":"ContainerStarted","Data":"c4d07247c2af7b1cf8f7f0ed53efbf9fcddff283cd6c7fc329dee5ef7cf4261f"} Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.560761 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mhpxb"] Dec 05 23:43:31 crc kubenswrapper[4747]: E1205 23:43:31.561890 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79a6703-5bd6-4197-afd2-94161eee1ef1" containerName="extract-content" Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.561909 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79a6703-5bd6-4197-afd2-94161eee1ef1" containerName="extract-content" Dec 05 23:43:31 crc kubenswrapper[4747]: E1205 23:43:31.561930 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ca8613-9c9c-49b1-ade8-e5928fceef4d" containerName="copy" Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.561937 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ca8613-9c9c-49b1-ade8-e5928fceef4d" containerName="copy" Dec 05 23:43:31 crc kubenswrapper[4747]: E1205 23:43:31.561968 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79a6703-5bd6-4197-afd2-94161eee1ef1" containerName="extract-utilities" Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.561977 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79a6703-5bd6-4197-afd2-94161eee1ef1" containerName="extract-utilities" Dec 05 23:43:31 crc kubenswrapper[4747]: E1205 23:43:31.561991 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ca8613-9c9c-49b1-ade8-e5928fceef4d" containerName="gather" Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.562000 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ca8613-9c9c-49b1-ade8-e5928fceef4d" containerName="gather" Dec 05 23:43:31 crc kubenswrapper[4747]: E1205 23:43:31.562022 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79a6703-5bd6-4197-afd2-94161eee1ef1" containerName="registry-server" Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.562030 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79a6703-5bd6-4197-afd2-94161eee1ef1" containerName="registry-server" Dec 05 23:43:31 crc 
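Every entry in the run above is the same refused retry: the sync loop wakes every 10-15 seconds, asks to restart machine-config-daemon, and is rejected while the CrashLoopBackOff window (capped at "back-off 5m0s") is open; the restart finally goes through at 23:43:17. A sketch of a doubling-with-cap delay calculation consistent with that behavior; the 10s initial value is an assumption, not something visible in this log:

package main

import (
	"fmt"
	"time"
)

// nextBackoff doubles the previous delay and clamps it to max,
// mirroring the "back-off 5m0s" cap seen in the entries above.
func nextBackoff(prev, initial, max time.Duration) time.Duration {
	if prev == 0 {
		return initial
	}
	next := prev * 2
	if next > max {
		return max
	}
	return next
}

func main() {
	const (
		initial = 10 * time.Second // assumed starting delay
		max     = 5 * time.Minute  // the cap reported in the log
	)
	var d time.Duration
	for i := 0; i < 7; i++ {
		d = nextBackoff(d, initial, max)
		fmt.Printf("restart %d: wait %v\n", i+1, d)
	}
	// Prints 10s, 20s, 40s, 1m20s, 2m40s, then stays at 5m0s.
}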
Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.562284 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="63ca8613-9c9c-49b1-ade8-e5928fceef4d" containerName="gather"
Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.562308 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f79a6703-5bd6-4197-afd2-94161eee1ef1" containerName="registry-server"
Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.562323 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="63ca8613-9c9c-49b1-ade8-e5928fceef4d" containerName="copy"
Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.564160 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhpxb"
Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.576914 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhpxb"]
Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.660107 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b576af95-8cd2-45ca-afa8-d0f125adc253-utilities\") pod \"certified-operators-mhpxb\" (UID: \"b576af95-8cd2-45ca-afa8-d0f125adc253\") " pod="openshift-marketplace/certified-operators-mhpxb"
Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.660269 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j75nw\" (UniqueName: \"kubernetes.io/projected/b576af95-8cd2-45ca-afa8-d0f125adc253-kube-api-access-j75nw\") pod \"certified-operators-mhpxb\" (UID: \"b576af95-8cd2-45ca-afa8-d0f125adc253\") " pod="openshift-marketplace/certified-operators-mhpxb"
Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.660352 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b576af95-8cd2-45ca-afa8-d0f125adc253-catalog-content\") pod \"certified-operators-mhpxb\" (UID: \"b576af95-8cd2-45ca-afa8-d0f125adc253\") " pod="openshift-marketplace/certified-operators-mhpxb"
Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.762834 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b576af95-8cd2-45ca-afa8-d0f125adc253-utilities\") pod \"certified-operators-mhpxb\" (UID: \"b576af95-8cd2-45ca-afa8-d0f125adc253\") " pod="openshift-marketplace/certified-operators-mhpxb"
Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.762995 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j75nw\" (UniqueName: \"kubernetes.io/projected/b576af95-8cd2-45ca-afa8-d0f125adc253-kube-api-access-j75nw\") pod \"certified-operators-mhpxb\" (UID: \"b576af95-8cd2-45ca-afa8-d0f125adc253\") " pod="openshift-marketplace/certified-operators-mhpxb"
Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.763061 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b576af95-8cd2-45ca-afa8-d0f125adc253-catalog-content\") pod \"certified-operators-mhpxb\" (UID: \"b576af95-8cd2-45ca-afa8-d0f125adc253\") " pod="openshift-marketplace/certified-operators-mhpxb"
Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.763641 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b576af95-8cd2-45ca-afa8-d0f125adc253-utilities\") pod \"certified-operators-mhpxb\" (UID: \"b576af95-8cd2-45ca-afa8-d0f125adc253\") " pod="openshift-marketplace/certified-operators-mhpxb"
Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.763655 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b576af95-8cd2-45ca-afa8-d0f125adc253-catalog-content\") pod \"certified-operators-mhpxb\" (UID: \"b576af95-8cd2-45ca-afa8-d0f125adc253\") " pod="openshift-marketplace/certified-operators-mhpxb"
Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.793224 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j75nw\" (UniqueName: \"kubernetes.io/projected/b576af95-8cd2-45ca-afa8-d0f125adc253-kube-api-access-j75nw\") pod \"certified-operators-mhpxb\" (UID: \"b576af95-8cd2-45ca-afa8-d0f125adc253\") " pod="openshift-marketplace/certified-operators-mhpxb"
Dec 05 23:43:31 crc kubenswrapper[4747]: I1205 23:43:31.883099 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhpxb"
Dec 05 23:43:32 crc kubenswrapper[4747]: W1205 23:43:32.375512 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb576af95_8cd2_45ca_afa8_d0f125adc253.slice/crio-4660303f466d7931892e7c5307250e9daebe90117f04caea75ae70d13d601858 WatchSource:0}: Error finding container 4660303f466d7931892e7c5307250e9daebe90117f04caea75ae70d13d601858: Status 404 returned error can't find the container with id 4660303f466d7931892e7c5307250e9daebe90117f04caea75ae70d13d601858
Dec 05 23:43:32 crc kubenswrapper[4747]: I1205 23:43:32.381042 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhpxb"]
Dec 05 23:43:33 crc kubenswrapper[4747]: I1205 23:43:33.143673 4747 generic.go:334] "Generic (PLEG): container finished" podID="b576af95-8cd2-45ca-afa8-d0f125adc253" containerID="5756c55f9ac53587b734af84123c4866f9caa0d429406f81b676a06178624279" exitCode=0
Dec 05 23:43:33 crc kubenswrapper[4747]: I1205 23:43:33.143715 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhpxb" event={"ID":"b576af95-8cd2-45ca-afa8-d0f125adc253","Type":"ContainerDied","Data":"5756c55f9ac53587b734af84123c4866f9caa0d429406f81b676a06178624279"}
Dec 05 23:43:33 crc kubenswrapper[4747]: I1205 23:43:33.143940 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhpxb" event={"ID":"b576af95-8cd2-45ca-afa8-d0f125adc253","Type":"ContainerStarted","Data":"4660303f466d7931892e7c5307250e9daebe90117f04caea75ae70d13d601858"}
Dec 05 23:43:34 crc kubenswrapper[4747]: I1205 23:43:34.156726 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhpxb" event={"ID":"b576af95-8cd2-45ca-afa8-d0f125adc253","Type":"ContainerStarted","Data":"06a5364baa5ddb2a17042b8dcc11f1102bee39510cadbd156a9a3079fb756bb9"}
Dec 05 23:43:35 crc kubenswrapper[4747]: I1205 23:43:35.173436 4747 generic.go:334] "Generic (PLEG): container finished" podID="b576af95-8cd2-45ca-afa8-d0f125adc253" containerID="06a5364baa5ddb2a17042b8dcc11f1102bee39510cadbd156a9a3079fb756bb9" exitCode=0
Dec 05 23:43:35 crc kubenswrapper[4747]: I1205 23:43:35.173553 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhpxb" event={"ID":"b576af95-8cd2-45ca-afa8-d0f125adc253","Type":"ContainerDied","Data":"06a5364baa5ddb2a17042b8dcc11f1102bee39510cadbd156a9a3079fb756bb9"}
pod="openshift-marketplace/certified-operators-mhpxb" event={"ID":"b576af95-8cd2-45ca-afa8-d0f125adc253","Type":"ContainerDied","Data":"06a5364baa5ddb2a17042b8dcc11f1102bee39510cadbd156a9a3079fb756bb9"} Dec 05 23:43:36 crc kubenswrapper[4747]: I1205 23:43:36.183778 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhpxb" event={"ID":"b576af95-8cd2-45ca-afa8-d0f125adc253","Type":"ContainerStarted","Data":"2576e950aa7ce987efd0e149ddacc39e955d71b69fc8b49c81ba739b56f550b5"} Dec 05 23:43:36 crc kubenswrapper[4747]: I1205 23:43:36.210285 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mhpxb" podStartSLOduration=2.807706239 podStartE2EDuration="5.210266046s" podCreationTimestamp="2025-12-05 23:43:31 +0000 UTC" firstStartedPulling="2025-12-05 23:43:33.159198969 +0000 UTC m=+10883.626506457" lastFinishedPulling="2025-12-05 23:43:35.561758766 +0000 UTC m=+10886.029066264" observedRunningTime="2025-12-05 23:43:36.205098888 +0000 UTC m=+10886.672406406" watchObservedRunningTime="2025-12-05 23:43:36.210266046 +0000 UTC m=+10886.677573554" Dec 05 23:43:41 crc kubenswrapper[4747]: I1205 23:43:41.884082 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mhpxb" Dec 05 23:43:41 crc kubenswrapper[4747]: I1205 23:43:41.884651 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mhpxb" Dec 05 23:43:41 crc kubenswrapper[4747]: I1205 23:43:41.948118 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mhpxb" Dec 05 23:43:42 crc kubenswrapper[4747]: I1205 23:43:42.314042 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mhpxb" Dec 05 23:43:42 crc kubenswrapper[4747]: I1205 23:43:42.363552 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhpxb"] Dec 05 23:43:44 crc kubenswrapper[4747]: I1205 23:43:44.273880 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mhpxb" podUID="b576af95-8cd2-45ca-afa8-d0f125adc253" containerName="registry-server" containerID="cri-o://2576e950aa7ce987efd0e149ddacc39e955d71b69fc8b49c81ba739b56f550b5" gracePeriod=2 Dec 05 23:43:45 crc kubenswrapper[4747]: I1205 23:43:45.286502 4747 generic.go:334] "Generic (PLEG): container finished" podID="b576af95-8cd2-45ca-afa8-d0f125adc253" containerID="2576e950aa7ce987efd0e149ddacc39e955d71b69fc8b49c81ba739b56f550b5" exitCode=0 Dec 05 23:43:45 crc kubenswrapper[4747]: I1205 23:43:45.286553 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhpxb" event={"ID":"b576af95-8cd2-45ca-afa8-d0f125adc253","Type":"ContainerDied","Data":"2576e950aa7ce987efd0e149ddacc39e955d71b69fc8b49c81ba739b56f550b5"} Dec 05 23:43:45 crc kubenswrapper[4747]: I1205 23:43:45.286896 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhpxb" event={"ID":"b576af95-8cd2-45ca-afa8-d0f125adc253","Type":"ContainerDied","Data":"4660303f466d7931892e7c5307250e9daebe90117f04caea75ae70d13d601858"} Dec 05 23:43:45 crc kubenswrapper[4747]: I1205 23:43:45.286913 4747 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="4660303f466d7931892e7c5307250e9daebe90117f04caea75ae70d13d601858" Dec 05 23:43:45 crc kubenswrapper[4747]: I1205 23:43:45.291431 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhpxb" Dec 05 23:43:45 crc kubenswrapper[4747]: I1205 23:43:45.390951 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b576af95-8cd2-45ca-afa8-d0f125adc253-catalog-content\") pod \"b576af95-8cd2-45ca-afa8-d0f125adc253\" (UID: \"b576af95-8cd2-45ca-afa8-d0f125adc253\") " Dec 05 23:43:45 crc kubenswrapper[4747]: I1205 23:43:45.391022 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b576af95-8cd2-45ca-afa8-d0f125adc253-utilities\") pod \"b576af95-8cd2-45ca-afa8-d0f125adc253\" (UID: \"b576af95-8cd2-45ca-afa8-d0f125adc253\") " Dec 05 23:43:45 crc kubenswrapper[4747]: I1205 23:43:45.391148 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j75nw\" (UniqueName: \"kubernetes.io/projected/b576af95-8cd2-45ca-afa8-d0f125adc253-kube-api-access-j75nw\") pod \"b576af95-8cd2-45ca-afa8-d0f125adc253\" (UID: \"b576af95-8cd2-45ca-afa8-d0f125adc253\") " Dec 05 23:43:45 crc kubenswrapper[4747]: I1205 23:43:45.392328 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b576af95-8cd2-45ca-afa8-d0f125adc253-utilities" (OuterVolumeSpecName: "utilities") pod "b576af95-8cd2-45ca-afa8-d0f125adc253" (UID: "b576af95-8cd2-45ca-afa8-d0f125adc253"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:43:45 crc kubenswrapper[4747]: I1205 23:43:45.398957 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b576af95-8cd2-45ca-afa8-d0f125adc253-kube-api-access-j75nw" (OuterVolumeSpecName: "kube-api-access-j75nw") pod "b576af95-8cd2-45ca-afa8-d0f125adc253" (UID: "b576af95-8cd2-45ca-afa8-d0f125adc253"). InnerVolumeSpecName "kube-api-access-j75nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:43:45 crc kubenswrapper[4747]: I1205 23:43:45.445132 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b576af95-8cd2-45ca-afa8-d0f125adc253-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b576af95-8cd2-45ca-afa8-d0f125adc253" (UID: "b576af95-8cd2-45ca-afa8-d0f125adc253"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:43:45 crc kubenswrapper[4747]: I1205 23:43:45.493133 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b576af95-8cd2-45ca-afa8-d0f125adc253-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:43:45 crc kubenswrapper[4747]: I1205 23:43:45.493161 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b576af95-8cd2-45ca-afa8-d0f125adc253-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:43:45 crc kubenswrapper[4747]: I1205 23:43:45.493171 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j75nw\" (UniqueName: \"kubernetes.io/projected/b576af95-8cd2-45ca-afa8-d0f125adc253-kube-api-access-j75nw\") on node \"crc\" DevicePath \"\"" Dec 05 23:43:46 crc kubenswrapper[4747]: I1205 23:43:46.297959 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhpxb" Dec 05 23:43:46 crc kubenswrapper[4747]: I1205 23:43:46.351956 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhpxb"] Dec 05 23:43:46 crc kubenswrapper[4747]: I1205 23:43:46.373725 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mhpxb"] Dec 05 23:43:47 crc kubenswrapper[4747]: I1205 23:43:47.853652 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b576af95-8cd2-45ca-afa8-d0f125adc253" path="/var/lib/kubelet/pods/b576af95-8cd2-45ca-afa8-d0f125adc253/volumes"